How do I download multiple URLs with wget in a single command?

First, create a text file listing the URLs you need to download, e.g. download.txt.

download.txt will look like this:

http://www.google.com
http://www.yahoo.com

Then use the command wget -i download.txt to download the files. You can add as many URLs to the text file as you like.
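As a quick sketch, the whole workflow can be done in one go (a here-document writes the list in a single command; download.txt is the file name from the example above):

```shell
# Write the URL list, one URL per line, using a here-document
cat > download.txt <<'EOF'
http://www.google.com
http://www.yahoo.com
EOF

# Then fetch every URL in the list:
#   wget -i download.txt
```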


From man wget:

2 Invoking
By default, Wget is very simple to invoke. The basic syntax is:
wget [option]... [URL]...

So, just use multiple URLs:

wget URL1 URL2
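Since wget accepts any number of URLs, shell brace expansion can generate a whole batch for you. A sketch with a hypothetical URL pattern (echo previews what wget would actually receive):

```shell
# Brace expansion happens in the shell before wget runs;
# echo shows the expanded argument list:
echo wget http://example.com/file{1..3}.txt

# Then run the same expansion for real:
#   wget http://example.com/file{1..3}.txt
```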

Or, using the links from the comments:

$ cat list.txt
http://www.vodafone.de/privat/tarife/red-smartphone-tarife.html
http://www.verizonwireless.com/smartphones-2.shtml
http://www.att.com/shop/wireless/devices/smartphones.html

and your command line:

wget -E -H -k -K -p -e robots=off -P /Downloads/ -i ./list.txt

works as expected.
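That is a lot of single-letter flags; here is a rough gloss using their long-option equivalents (check `man wget` for the authoritative descriptions). Building the command as a bash array makes each option easy to annotate:

```shell
# The same invocation spelled with long options, one per line; the paths
# (/Downloads/, ./list.txt) are the ones from the example above.
cmd=(
  wget
  --adjust-extension               # -E: save HTML/CSS with a matching suffix
  --span-hosts                     # -H: allow fetching from other hosts
  --convert-links                  # -k: rewrite links for offline viewing
  --backup-converted               # -K: keep a .orig copy of rewritten files
  --page-requisites                # -p: also grab images, CSS, and scripts
  -e robots=off                    # ignore robots.txt (use responsibly)
  --directory-prefix=/Downloads/   # -P: directory to save into
  --input-file=./list.txt          # -i: read URLs from this file
)
printf '%s ' "${cmd[@]}"; echo     # preview the full command line

# Execute it with:
#   "${cmd[@]}"
```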


If you have a list of URLs on separate lines like this:

http://example.com/a
http://example.com/b
http://example.com/c

but you don't want to create a file and point wget to it, you can do this:

wget -i - <<< 'http://example.com/a
http://example.com/b
http://example.com/c'
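Note that here-strings (`<<<`) are a bash/zsh feature. A portable sketch of the same idea keeps the list in a variable and feeds it to wget's stdin via printf (the URLs are the placeholder ones from above):

```shell
# Keep the list in a shell variable, one URL per line
urls='http://example.com/a
http://example.com/b
http://example.com/c'

# printf preserves the embedded newlines; wc confirms one line per URL
printf '%s\n' "$urls" | wc -l

# Feed the same stream to wget, which reads the list from stdin via `-i -`:
#   printf '%s\n' "$urls" | wget -i -
```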