Download several files with wget in parallel
Confiq's answer is a good one for small i and j. However, given the size of i and j in your question, you will likely want to limit the overall number of processes spawned. You can do this with the parallel command or some versions of xargs. For example, using an xargs that supports the -P flag, you could parallelize your inner loop as follows:
for i in {0800..9999}; do
    echo {001..032} | xargs -n 1 -P 8 -I{} wget http://example.com/"$i-{}".jpg
done
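As a quick sanity check on what xargs receives here: brace expansion keeps the leading zeros, so each word that echo produces is already a zero-padded URL suffix, and xargs -n 1 turns each word into a separate wget invocation (a short demonstration; the ranges are the ones from the loop above):

```shell
# Brace expansion preserves leading zeros in each word.
echo {001..005}

# xargs -n 1 splits the single echoed line into one argument per word;
# counting them shows the inner loop issues 32 downloads per value of i.
echo {001..032} | xargs -n 1 | wc -l
```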
GNU parallel has a large number of features for when you need more sophisticated behavior and makes it easy to parallelize over both parameters:
parallel -a <(seq -w 0800 9999) -a <(seq -w 001 032) -P 8 wget http://example.com/{1}-{2}.jpg
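One caveat: plain seq drops leading zeros (seq 001 032 starts at 1), so the generated numbers would not match zero-padded filenames; GNU seq's -w flag pads every number to a uniform width. A quick comparison:

```shell
# Without -w, seq strips the leading zeros:
seq 001 003

# With -w, each number is padded to the width of the widest endpoint,
# so the output matches zero-padded URL components:
seq -w 001 003
```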
Here's a very simplistic approach that limits the number of concurrent downloads to 10:
for i in {0800..9999}; do
    for j in {001..032}; do
        wget http://example.com/"$i-$j".jpg &
        # Block until fewer than 10 background jobs remain.
        while test "$(jobs -p | wc -w)" -ge 10; do sleep 0.1; done
    done
done
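The throttling idea above is generic: background the work item, then poll the job table until the count drops below the cap. Here is the same pattern with a short sleep standing in for wget (the sleep, the cap of 10, and the count of 25 work items are placeholders for illustration):

```shell
max_jobs=10
tmp=$(mktemp)
for n in $(seq 1 25); do
    { sleep 0.1; echo "$n" >> "$tmp"; } &    # placeholder for the real wget
    # Block until the number of background jobs drops below the cap.
    while [ "$(jobs -p | wc -w)" -ge "$max_jobs" ]; do
        sleep 0.05
    done
done
wait                            # let the last batch of jobs finish
completed=$(wc -l < "$tmp")     # every work item ran to completion
rm -f "$tmp"
```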
for i in {1..3}; do
    for j in {10..20}; do
        (wget http://example.com/"$i-$j".jpg &)
    done
done
I even tested it...
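One thing to note about that last snippet: wrapping the backgrounded command in (…) runs it in a subshell, so the parent shell's job table never sees it, and you cannot cap or wait on those downloads the way the jobs-based loop above does. A minimal demonstration, with sleep standing in for wget:

```shell
sleep 0.5 &                   # backgrounded directly: tracked by the shell
( sleep 0.5 & )               # backgrounded inside a subshell: not tracked
tracked=$(jobs -p | wc -l)    # counts only the first sleep
wait                          # waits only for jobs the shell knows about
```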