wget Command in Linux/Unix with Examples

The wget command is a command-line utility in Linux that is used to download files from the internet. It supports several protocols, including HTTP, HTTPS, and FTP, and can fetch files from a website, an FTP server, or any other location that is accessible over the internet. The wget command is widely used for downloading files, especially when the download needs to be automated or performed from a script.

1) To download a single file from a website, the following command can be used:

wget http://example.com/file.txt

This command will download the file "file.txt" from the website "http://example.com" and save it in the current working directory.

2) To download a file and save it to a specific location, the -P option can be used.

wget -P /path/to/directory http://example.com/file.txt

This command will download the file "file.txt" from the website "http://example.com" and save it in the directory "/path/to/directory".

3) To download a file and save it with a different name, the -O option can be used.

wget -O newfile.txt http://example.com/file.txt

This command will download the file "file.txt" from the website "http://example.com" and save it as "newfile.txt" in the current working directory.

4) To download multiple files at once, the URLs of the files can be listed in a text file, one per line, and the following command can be used:

wget -i list.txt

This command will read the URLs from the file "list.txt" and download the corresponding files.
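
For reference, the file "list.txt" simply contains one URL per line. A hypothetical example (the URLs are placeholders):

http://example.com/file1.txt
http://example.com/file2.txt
http://example.com/images/photo.jpg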

5) To download files from an FTP server, the following command can be used:

wget ftp://ftp.example.com/file.txt

This command will download the file "file.txt" from the FTP server "ftp.example.com". If a username and password are required to access the server, they can be specified with the --ftp-user and --ftp-password options.

wget --ftp-user=username --ftp-password=password ftp://ftp.example.com/file.txt

6) To download files from a website using a proxy server, the -e option can be used.

wget -e use_proxy=yes -e http_proxy=http://proxy.example.com:8080 http://example.com/file.txt

This command will download the file "file.txt" from the website "http://example.com" using the proxy server "proxy.example.com" and port 8080.
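
Alternatively, wget honors the standard proxy environment variables, so a proxy can be set once for the whole shell session (the proxy address here is a placeholder):

export http_proxy=http://proxy.example.com:8080
wget http://example.com/file.txt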

7) To limit the download speed, the --limit-rate option can be used.

wget --limit-rate=200k http://example.com/file.txt

This command will download the file "file.txt" from the website "http://example.com" with a maximum download speed of 200 kilobytes per second.
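
The rate may also be given in plain bytes or with the m suffix for megabytes. For example, this variant caps the speed at 2 megabytes per second:

wget --limit-rate=2m http://example.com/file.txt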

8) To download files protected by HTTP authentication, the --user and --password options can be used.

wget --user=username --password=password http://example.com/file.txt

This command will download the file "file.txt" from the website "http://example.com" using the specified username and password.
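
Since a password typed on the command line may end up in the shell history and be visible in the process list, wget can instead prompt for it interactively with the --ask-password option:

wget --user=username --ask-password http://example.com/file.txt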

9) To download files that are behind a login page using cookies, the --save-cookies and --load-cookies options can be used.

wget --save-cookies cookies.txt --post-data='username=myusername&password=mypassword' http://example.com/login
wget --load-cookies cookies.txt http://example.com/file.txt

The first command logs in to the website "http://example.com" by sending the specified POST data and saves the resulting cookies in a file called "cookies.txt". The second command then loads the cookies from "cookies.txt" and downloads the file "file.txt" from the website.
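
Many sites use session cookies, which wget discards by default when saving. Adding the --keep-session-cookies option to the first command preserves them (the login URL and form field names are placeholders that depend on the site):

wget --save-cookies cookies.txt --keep-session-cookies --post-data='username=myusername&password=mypassword' http://example.com/login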

10) To download files using credentials stored in a .netrc file, no extra option is needed, since wget consults ~/.netrc by default when a server requests authentication.

wget http://example.com/file.txt

This command will download the file "file.txt" from the website "http://example.com" using the credentials for the machine "example.com" found in the ~/.netrc file.
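
A minimal ~/.netrc entry for this example might look like the following (the credentials are placeholders); since the file stores passwords in plain text, it should be readable only by its owner (chmod 600 ~/.netrc):

machine example.com
login myusername
password mypassword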

11) To download a webpage and all its linked resources, the -r, -p, and -k options can be used together.

wget -r -p -k http://example.com/

This command will download the webpage "http://example.com" and all of its linked resources, such as images, CSS, and JavaScript files. The -p option downloads the files necessary to display the webpage offline, and the -k option converts the links in the downloaded files to point to the local copies of the resources.
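
By default, recursive retrieval follows links up to five levels deep. The -l option changes this limit; for example, the following command restricts the download to two levels:

wget -r -l 2 -p -k http://example.com/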

12) To stop a download after a fixed amount of time, wget has no built-in option, but it can be combined with the coreutils timeout command.

timeout 60 wget --limit-rate=200k http://example.com/file.txt

This command will download the file "file.txt" from the website "http://example.com" with a maximum download speed of 200 kilobytes per second, and the timeout command will terminate wget after 60 seconds. This can be useful if you have a limited data plan or want to bound how long a scheduled download runs.

13) To avoid overwriting files that already exist locally, the --no-clobber option can be used, here combined with --limit-rate.

wget --limit-rate=200k --no-clobber http://example.com/file.txt

This command will download the file "file.txt" from the website "http://example.com" with a maximum download speed of 200 kilobytes per second, and will skip the download entirely if a file named "file.txt" already exists rather than saving a duplicate copy. Note that --no-clobber does not resume partial downloads; the -c option does that, as shown below.
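
To resume an interrupted download instead of skipping it, the -c (--continue) option picks up from where the partial file ends:

wget -c http://example.com/file.txt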

14) To download files recursively while preserving their original structure, the -r and -np options can be used.

wget -r -np http://example.com/directory/

This command will download all the files and directories within "http://example.com/directory" and preserve the original file and directory structure. The -np (no-parent) option prevents wget from ascending into parent directories.
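
If the host name and leading directory components are not wanted locally, the -nH and --cut-dirs options trim them. For example, the following command omits the "example.com" host directory and strips the leading "directory" component from the saved paths:

wget -r -np -nH --cut-dirs=1 http://example.com/directory/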

In conclusion, the `wget` command is a powerful and versatile tool for downloading individual files, entire websites, and directory trees. It supports several protocols, works with proxy servers, and can handle authentication and cookies. It can also limit download speed, resume interrupted downloads, and retry failed ones. With its range of options and parameters, `wget` can be tailored to suit your specific needs, making it an essential tool for anyone who needs to download files from the internet.