
How to use the wget command like a pro in the Linux terminal

Wget is an amazing command-line utility that can be used for scraping web pages, downloading videos, retrieving single web pages, mp3 files, and content from password-protected websites. It also copes well with slow network conditions and can resume an interrupted download from where it left off. In this post, I'll describe how you can use the wget command like a pro with some useful examples.

1. Download a single file

$ wget http://myexample.com/s.tar.gz
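If the file is large, you can push the download to the background so it keeps running while you do other things; the URL here is just the placeholder from above.

# Run the download in the background; wget appends its progress to wget-log by default
$ wget -b http://myexample.com/s.tar.gz
# Peek at the progress whenever you like
$ tail -f wget-log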

2. Download an HTML page and save it under a different name

$ wget -O page.html http://example.com/somepageurl
# Save it in a different directory 
$ wget --directory-prefix='/home/user/' johndoe.com
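A related trick, assuming the same example URL: passing `-O -` writes the page to standard output, so you can pipe it straight into another command.

# Print the page to stdout quietly and page through it
$ wget -qO- http://example.com/somepageurl | less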

3. Download multiple files over different protocols

$ wget http://example.com/myfile.tar.gz ftp://42.11.23.4/file.jpg

4. Limit the download speed of a file

$ wget --limit-rate=20k http://example.com/myfile.zip
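The rate accepts k and m suffixes. When grabbing several files, you may also want to space the requests out; the URLs and values below are only examples.

# Cap the speed at roughly 1 MB/s; --wait/--random-wait pause between requests when fetching several files
$ wget --limit-rate=1m --wait=2 --random-wait http://example.com/myfile.zip http://example.com/otherfile.zip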

5. Download a webpage with all its assets

$ wget --page-requisites --convert-links --adjust-extension http://example.com/mywebpage
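If the page loads images or stylesheets from another host (a CDN, for instance), wget skips them unless you allow it to span hosts; the domain names below are made up for illustration.

# Also fetch assets served from other hosts, restricted to the listed domains
$ wget --page-requisites --convert-links --adjust-extension --span-hosts --domains=example.com,cdn.example.com http://example.com/mywebpage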

6. Resume a partially downloaded file from where it left off

$ wget -c http://example.com/myfile.rar
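On a flaky connection you can combine -c with more stubborn retrying; this is a sketch using the same example URL.

# Keep retrying indefinitely, even if the server refuses connections for a while
$ wget -c --tries=inf --retry-connrefused http://example.com/myfile.rar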

7. Download all URLs listed in a text file

$ wget --input-file=long-list-of-urls.md
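The input file is just a plain list of URLs, one per line; the entries below are placeholders.

$ cat long-list-of-urls.md
http://example.com/file1.zip
http://example.com/file2.zip
ftp://42.11.23.4/file.jpg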

8. Mirror an entire website (all its pages and assets)

$ wget --mirror --no-parent --continue http://example.com
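--mirror is shorthand for -r -N -l inf --no-remove-listing. To be gentler on the server, you can also throttle the crawl; the numbers here are just examples.

# The same mirror, but spaced out and rate-limited
$ wget --mirror --no-parent --continue --wait=1 --limit-rate=200k http://example.com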

9. Download specific types of files from a website

# Download all the mp3 files
$ wget --level=2 --recursive --accept mp3 http://example.com
# Download all the JPEG files
$ wget --level=1 --recursive --no-parent --accept jpg,JPG http://example.com/
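The flip side of --accept is --reject, which skips the listed extensions instead; the extensions below are just examples.

# Grab everything under the path except video files
$ wget --recursive --no-parent --reject mp4,avi,mkv http://example.com/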

10. Download files from password-protected websites

$ wget --http-user=johndoe --http-password=somepass http://example.com/secretpath/file.tar.gz
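Typing the password on the command line leaves it in your shell history, so a safer variant is to let wget prompt for it; the username is the same made-up one as above.

# Prompt for the password interactively instead of passing it as an option
$ wget --user=johndoe --ask-password http://example.com/secretpath/file.tar.gz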