Parallel shell loops
A makefile is a good solution to your problem. You could program this parallel execution in a shell, but it's hard, as you've noticed. A parallel implementation of make will not only take care of starting jobs and detecting their termination, but also handle load balancing, which is tricky.
The requirement for globbing is not an obstacle: there are make implementations that support it. GNU make has wildcard expansion such as $(wildcard *.c) and shell access such as $(shell mycommand) (look up "functions" in the GNU make manual for more information). It is the default make on Linux and is available on most other systems. Here's a Makefile skeleton that you may be able to adapt to your needs:
```
sources = $(wildcard *.src)

all: $(sources:.src=.tgt)

%.tgt: %.src
	do_something $< $$(derived_params $<) >$@
```
Run something like make -j4 to execute four jobs in parallel, or make -j -l3 to keep the load average around 3.
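For comparison, the substitution reference $(sources:.src=.tgt) in the skeleton above just swaps one suffix for another. A minimal plain-shell sketch of the same mapping (the file names here are invented for illustration):

```shell
# Mimic make's $(sources:.src=.tgt) suffix substitution in POSIX shell.
sources="a.src b.src c.src"   # stand-in list; make gets this from $(wildcard *.src)

targets=""
for s in $sources; do
    # ${s%.src} strips the trailing .src; append .tgt instead
    targets="${targets:+$targets }${s%.src}.tgt"
done

echo "$targets"   # -> a.tgt b.tgt c.tgt
```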
I am not sure what your derived arguments are like, but with GNU Parallel (http://www.gnu.org/software/parallel/) you can do this to run one job per CPU core:
```
find . | parallel -j+0 'a={}; name=${a##*/};
  upper=$(echo "$name" | tr "[:lower:]" "[:upper:]");
  echo "$name - $upper"'
```
If what you want to derive is simply a changed .extension, the {.} replacement string may be handy:

```
parallel -j+0 lame {} -o {.}.mp3 ::: *.wav
```
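If you'd rather see what those replacement strings correspond to in plain shell, they are the familiar suffix- and prefix-stripping parameter expansions. A small sketch (the file name is invented for illustration):

```shell
# Roughly what GNU Parallel's replacement strings expand to, in shell terms:
f="music/song.wav"

name=${f##*/}    # strip leading directories -> song.wav   (like {/})
noext=${f%.*}    # strip the last extension  -> music/song (like {.})

echo "$noext.mp3"   # -> music/song.mp3
```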
Watch the intro video to GNU Parallel at http://www.youtube.com/watch?v=OpaiGYxkSuQ
Wouldn't using the shell's wait command work for you?
```
for i in *
do
    do_something $i &
done
wait
```
Your loop executes a job, then waits for it, then starts the next job. If the above doesn't work for you, then yours might work better if you move pwait after done.
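One caveat with the loop above: it starts every job at once, which can overload the machine when there are many files. A portable compromise is to wait after each batch; a minimal sketch, with echo standing in for the real command:

```shell
# Run jobs in batches of 4, waiting for each batch to finish before
# starting the next. Portable POSIX shell; no bash-only 'wait -n' needed.
n=0
for i in a b c d e f; do          # stand-in for: for i in *
    (echo "processing $i") &      # stand-in for: do_something "$i" &
    n=$((n + 1))
    if [ "$n" -ge 4 ]; then
        wait                      # block until the whole batch has exited
        n=0
    fi
done
wait                              # reap the final, possibly partial batch
```

Batching is coarser than make -j or parallel, which start a new job as soon as a slot frees up, but it needs nothing beyond a POSIX shell.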