Write output of wget or curl to a custom filename based on the url
`curl` has the `-o`, `--output` option, which takes a single argument indicating the filename output should be written to instead of stdout. If you are using `{}` or `[]` to surround elements in the URL (usually used to fetch multiple documents), you can use `#` followed by a number in the filename specifier; each such variable will be replaced with the corresponding string for the URL being fetched. To fetch multiple files, add a comma-separated list of tokens inside the `{}`. If parts of the URLs to be fetched are sequential numbers, you can specify a range with `[]`.
Examples:
```
curl http://www.abc.com/123/{def}/{ghi}/{jkl}.mno -o '#1_#2_#3.mno'
```

Note the quotes around the option argument (not needed unless the filename starts with one of the expanded variables).

This should result in the output file `def_ghi_jkl.mno`.
```
curl http://www.abc.com/123/{def}/{ghi}/{jkl,pqr,stu}.mno -o '#1_#2_#3.mno'
```

This should result in the output files `def_ghi_jkl.mno`, `def_ghi_pqr.mno` and `def_ghi_stu.mno`.
```
curl http://www.abc.com/123/{def}/{ghi}/[1-3].mno -o '#1_#2_#3.mno'
```

This should result in the output files `def_ghi_1.mno`, `def_ghi_2.mno` and `def_ghi_3.mno`.
`wget` has a switch `-O` (long form `--output-document`) which allows you to specify the name of the file to save to. (Presumably curl has something similar.) So you could do:

```
wget -O def_ghi_jkl.mno http://www.abc.com/123/def/ghi/jkl.mno
```

and it will do what you want.
You could probably create a wrapper around wget if you want to automate this naming scheme, but it'd be pretty hard to get bullet-proof and is definitely out of the scope of this answer. (The simple case of a single file downloaded from an explicit URL shouldn't be very hard to get right, but that's not wget's only mode of operation. To name just one case that makes this slightly non-trivial, you can specify multiple URLs on the command line.)
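For the simple single-URL case, such a wrapper might be sketched like this (my own sketch, not part of the original answer; the function names `flat_name` and `wget_flat` are made up, and it deliberately handles none of wget's other modes of operation):

```shell
#!/bin/sh
# flat_name URL -> filename: the URL path with "/" turned into "_".
# It strips only the scheme and host, so every path component
# after the host (including "123" here) survives in the name.
flat_name() {
    printf '%s\n' "${1#*://*/}" | tr '/' '_'
}

# wget_flat URL: download a single explicit URL under the flattened name.
wget_flat() {
    wget -O "$(flat_name "$1")" "$1"
}

flat_name "http://www.abc.com/123/def/ghi/jkl.mno"
# prints 123_def_ghi_jkl.mno
```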
Note that `-O` is not the same at all as `-o`, which writes wget's own log messages to the named file.
Here is a Bash substitution trick:
```
link="http://www.abc.com/123/def/ghi/jkl.mno"
OutputFile=$( echo "${link:23:23}" | tr "/" "_" )
echo "$OutputFile"
def_ghi_jkl.mno
```
`${link:23:23}` keeps 23 characters starting at offset 23, i.e. it removes the 23-character prefix `http://www.abc.com/123/`; this is the `${parameter:offset:length}` expansion. Then `tr` replaces each `/` with `_`.
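If you'd rather not count characters, the same prefix can be stripped with pattern matching instead of offsets (my own variation on the trick above, not part of the original answer):

```shell
link="http://www.abc.com/123/def/ghi/jkl.mno"
# ${link#*://*/*/} strips the shortest leading match of scheme, host and
# first path component ("http://www.abc.com/123/") -- no offsets needed.
OutputFile=$( echo "${link#*://*/*/}" | tr "/" "_" )
echo "$OutputFile"
# prints def_ghi_jkl.mno
```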
So now you can easily use it with wget or curl:

```
wget "$link" -O "$OutputFile"
```
We can also use awk; this extracts the last three fields from the input string:

```
OutputFile=$( echo "$link" | awk -F/ 'BEGIN{OFS="_"}{print $(NF-2), $(NF-1), $NF}' )
```
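Run against the same sample URL, this yields the same name as the substitution trick (my own check, not part of the original answer):

```shell
link="http://www.abc.com/123/def/ghi/jkl.mno"
# -F/ splits on "/"; $(NF-2), $(NF-1) and $NF are the last three path
# components, joined with "_" because OFS is set to "_".
OutputFile=$( echo "$link" | awk -F/ 'BEGIN{OFS="_"}{print $(NF-2), $(NF-1), $NF}' )
echo "$OutputFile"
# prints def_ghi_jkl.mno
```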