Linux wget download manager

wget stands for “web get”. It is a command-line utility that downloads files over a network. wget is a free utility for non-interactive download of files from the web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.

wget also supports resuming interrupted downloads and limiting download speed.

01. Installing wget :

For Red Hat, CentOS, and similar RPM-based distributions :

yum -y install wget

For Debian, Ubuntu, and similar distributions :

apt install wget -y
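
Once installed, a quick way to confirm the binary is available and see which version you got :

wget --version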

02. Overview of wget usage :

The wget utility is packed with options, which makes it a full download manager in everything but the GUI: it is a command-line tool.

To list all options for wget :

wget --help

To read manual page :

man wget

It contains comprehensive information, so we will go through common, simple tasks by example.

03. Download files using wget :

To download a file to the current directory, keeping the name it has on the server :

wget http://example.com/file.zip

To download it to a different path and name :

wget http://example.com/file.zip -O /tmp/myfile.zip
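
If you only want to change the destination directory while keeping the filename from the server, the -P (directory prefix) option is a handy alternative to -O; /tmp/downloads below is just an example path :

wget -P /tmp/downloads http://example.com/file.zip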

To log output to a file rather than to the screen :

wget http://example.com/file.zip -O /tmp/myfile.zip -o file.log
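
If you rerun downloads and would rather keep the earlier log entries, -a appends to the log instead of overwriting it the way -o does (file.log is the same example name as above) :

wget http://example.com/file.zip -O /tmp/myfile.zip -a file.log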

You can also download multiple files at once :

wget http://example.com/file.zip http://example.com/file2.iso
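
When the list of URLs grows, it is usually easier to put them in a plain text file, one URL per line, and hand it to wget with -i; urls.txt here is just an example filename :

wget -i urls.txt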

To specify the retry count before wget stops trying (for example, 10 times) :

wget -t 10 http://example.com/file.zip
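
On a flaky connection you may also want to space the retries out; as a rough sketch, --waitretry sets the maximum wait (in seconds) between retries of a failed download :

wget -t 10 --waitretry=10 http://example.com/file.zip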

04. wget quota and rate limiting :

To cap the total amount of data wget will download (useful when downloading multiple files of unknown size), give the quota with a k or m suffix for kilobytes or megabytes. For example, to limit the quota to 100m :

wget -Q 100m http://example.com/file.zip http://example.com/file2.iso

To limit the bandwidth wget uses, for example to 30k (30 kilobytes per second) :

wget --limit-rate 30k http://example.com/file.zip
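
Rate limiting pairs well with running a long download in the background; with -b, wget detaches from the terminal and, unless told otherwise, writes its progress to a file called wget-log in the current directory. A quick sketch combining the two :

wget -b --limit-rate 30k http://example.com/file.zip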

05. wget resume and site mirroring :

To resume an interrupted download, go to the same path where you started it and run :

wget -c http://example.com/file.zip

To create a clone or mirror of a site :

wget --mirror --convert-links http://example.com/

More useful options (a combined example follows the list) :

  • --mirror – Makes (among other things) the download recursive.
  • --convert-links – Convert all the links (including links to resources such as CSS stylesheets) to relative links, so the copy is suitable for offline viewing.
  • --adjust-extension – Adds suitable extensions to filenames (html or css) depending on their content-type.
  • --page-requisites – Download things like CSS style-sheets and images required to properly display the page offline.
  • --no-parent – When recursing, do not ascend to the parent directory. This is useful for restricting the download to only a portion of the site.
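
Putting those options together, a typical offline-copy invocation looks something like this (example.com is of course just a placeholder) :

wget --mirror --convert-links --adjust-extension --page-requisites --no-parent http://example.com/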

06. wget with user credentials :

If a site requires a login with a username and password (HTTP or FTP sites), you can pass those credentials easily :

wget --user USER --password PASS http://example.com/file.pdf

or, to keep the password out of your shell history, let wget prompt for it :

wget --user USER --ask-password http://example.com/file.pdf
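
For FTP sites specifically, wget also accepts dedicated credential options, which makes it explicit which protocol the username and password apply to; the host and file below are placeholders :

wget --ftp-user USER --ftp-password PASS ftp://example.com/file.pdf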

As we said at the beginning, wget is full of options. Take your time to explore the manual page or the help output for more tasks; the ones above cover the most common.

That is it. I hope it was simple, thanks for joining me.
Enjoy!
