Parallel shell loops

Posted by brubelsabs on Super User
Published on 2010-06-29T15:09:28Z

Hi,

I want to process many files, and since I have a bunch of cores here, I want to do it in parallel:

for i in *.myfiles; do do_something "$i" `derived_params "$i"` other_params; done
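One common way to fan such a loop out over several workers is `xargs -P` (this is a sketch, not the asker's setup: `do_something` and `derived_params` from the loop above are stood in by an inline command so the example is self-contained and runnable):

```shell
# Create a scratch directory with a couple of sample *.myfiles.
dir=$(mktemp -d)
printf 'a\nb\n' > "$dir/one.myfiles"
printf 'c\n'    > "$dir/two.myfiles"

# NUL-delimit the glob so arbitrary filenames survive; -P 4 caps the
# number of concurrent workers at 4, -n 1 gives each worker one file.
# The inline command (count lines into "$1.out") stands in for
# do_something "$i" `derived_params "$i"` other_params.
printf '%s\0' "$dir"/*.myfiles |
  xargs -0 -n 1 -P 4 sh -c 'grep -c "" < "$1" > "$1.out"' _
```

Because each worker runs its own `sh -c`, per-file values like the output of `derived_params` can still be computed inside the quoted command from `"$1"`.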

I know of a Makefile-based solution, but my commands need the arguments from the shell's globbing list. What I found is:

> function pwait() {
>     while [ $(jobs -p | wc -l) -ge $1 ]; do
>         sleep 1
>     done
> }
>

To use it, put an & after each job plus a pwait call; the parameter gives the maximum number of parallel processes:

> for i in *; do
>     do_something $i &
>     pwait 10
> done

But this doesn't work very well. For example, I tried it with a for loop converting many files, but it gave me errors and left jobs undone.

I can't believe that this hasn't been solved yet, since the discussion on the zsh mailing list is so old by now. Do you know anything better?
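One possible tighter throttle than the sleep-polling pwait above (assuming bash 4.3+, which added `wait -n`): block until any background job actually exits, instead of waking up once a second to count jobs. A minimal sketch with a dummy workload writing to a temp file:

```shell
# Requires bash 4.3+ for wait -n.
out=$(mktemp)
max=4
for i in 1 2 3 4 5 6 7 8; do
  # jobs -rp lists only *running* background jobs; once we are at the
  # cap, wait -n returns as soon as any one of them finishes.
  while [ "$(jobs -rp | wc -l)" -ge "$max" ]; do
    wait -n
  done
  # Dummy job standing in for the real per-file command.
  { sleep 0.1; printf 'done %s\n' "$i" >> "$out"; } &
done
wait   # reap the last stragglers before moving on
```

Unlike the pwait loop, this never busy-waits a full second, and counting only running jobs (`-r`) avoids miscounting finished-but-unreaped ones.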
