in reply to
Fastest way to download many web pages in one go?
As an aside, at one “shop” where I was working, they had a variation of the Unix xargs command which supported pooling. It was just a -n number_of_children parameter (or something like that ...), but it sure was useful. The command worked in the usual way ... read lines from STDIN and execute a command with each line in its argument string ... but it ran n children doing the commands simultaneously. Each child ran, did its thing, and then died. Maybe this is a standard feature ... I don't know ... but it cropped up everywhere in the stuff they were doing, as a useful generalization.

Here, you’d feed it a file containing a list of URLs and use it to drive a command that takes one URL as a command-line parameter, retrieves it, and processes it. Since each process would spend most of its time waiting for some host to respond, you could run a very large number of ’em.
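For what it's worth, this does appear to be a standard feature these days: GNU and BSD xargs accept a -P (max-procs) flag that provides exactly this kind of pooling. A rough sketch of what the URL case might look like ... note that fetch_one here is a hypothetical script standing in for whatever fetch-and-process command you'd actually use:

```shell
# Pooling with xargs: -P sets the number of parallel children,
# -n 1 hands each child exactly one argument (here, one URL).
# Real use would look something like this (fetch_one is hypothetical):
#   xargs -n 1 -P 20 fetch_one < urls.txt

# The same mechanism, demonstrated with a safe stand-in command.
# Output order across children is nondeterministic, hence the sort.
printf 'a\nb\nc\n' | xargs -n 1 -P 4 echo | sort
```

Since the children mostly sit in network wait, -P can be set far higher than the number of CPUs without trouble.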