How to download a file from a website via terminal?
Open a terminal and type
wget "http://domain.com/directory/4?action=AttachFile&do=view&target=file.tgz"
to download the file to the current directory.
wget -P /home/omio/Desktop/ "http://thecanadiantestbox.x10.mx/CC.zip"
will download the file to /home/omio/Desktop.
wget -O /home/omio/Desktop/NewFileName "http://thecanadiantestbox.x10.mx/CC.zip"
will download the file to /home/omio/Desktop and save it as NewFileName.
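To tie the three forms together, here is a brief annotated recap of the same commands (nothing new beyond the examples above; note that the first URL must stay quoted because it contains & and ?, which the shell would otherwise interpret):

# download into the current directory, keeping the name from the URL
wget "http://domain.com/directory/4?action=AttachFile&do=view&target=file.tgz"

# -P chooses the destination directory, keeping the original file name
wget -P /home/omio/Desktop/ "http://thecanadiantestbox.x10.mx/CC.zip"

# -O chooses the full destination path, including a new file name
wget -O /home/omio/Desktop/NewFileName "http://thecanadiantestbox.x10.mx/CC.zip"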
You can also do it using curl.
curl -O "http://domain.com/directory/4?action=AttachFile&do=view&target=file.tgz"
The -O option saves the file with the same name as in the URL rather than dumping the output to stdout. The quotes around the URL are needed because it contains & and ? characters that the shell would otherwise interpret.
For more information, see man curl.
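If you instead want curl to save the download under a name of your choosing (analogous to wget -O above), curl also has a lowercase -o option; a minimal sketch, reusing the example URL:

# -o writes the download to the given file name instead of the name taken from the URL
curl -o file.tgz "http://domain.com/directory/4?action=AttachFile&do=view&target=file.tgz"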
I use axel and wget for downloading from the terminal; axel is a download accelerator.

Syntax:

axel:
axel www.example.com/example.zip

wget:
wget -c www.example.com/example.zip

For more details, type man axel or man wget in the terminal.
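As a rough sketch of what each command is doing (the -n value here is just an illustrative choice, not from the answer above):

# axel speeds things up by splitting the download across several connections; -n sets how many
axel -n 4 www.example.com/example.zip

# wget -c continues (resumes) a partially downloaded file instead of starting over
wget -c www.example.com/example.zip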