If you have something that takes a computer “20 minutes or so” to do, then what you’ve really got is a batch job. And so, you need to select a batch-processing system to run it. There are many to choose from, and they aren’t specific to Perl. Many are free and open-source, and some are very full-featured, handling clusters, checkpoint/restart, and so on.
Or, you could “roll your own,” using cron jobs and so forth. A database table can serve as the queue: the website user adds a new entry to this queue and, through the website, can watch the job’s progress and collect the results when the job ends. They can go to lunch in the meantime. The system is also arranged so that, no matter how many jobs are submitted, they are always processed at a predictable and sustainable rate.

The user could submit 30 jobs all at once, notice that the system is running them (say) three at a time, and know when to check back to see which ones have completed and which are still pending. No matter how big the backlog gets, the system continues to crank through it at what has been predetermined to be the fastest sustainable rate, so a predictable level of service is maintained.
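As a rough illustration of the “roll your own” approach, here is a minimal sketch in Python using an SQLite table as the queue. All names (`job_queue`, `submit`, `run_batch`, the limit of three concurrent jobs) are made up for this example; the original post doesn’t prescribe a schema, and a real system would need locking, error handling, and a proper worker process driven by cron:

```python
import sqlite3

MAX_PER_PASS = 3  # hypothetical "fastest sustainable rate": jobs per cron pass

def init_db(conn):
    # The queue is just a table; the web layer INSERTs into it,
    # the cron-driven worker UPDATEs it as jobs progress.
    conn.execute("""
        CREATE TABLE IF NOT EXISTS job_queue (
            id      INTEGER PRIMARY KEY,
            payload TEXT NOT NULL,
            status  TEXT NOT NULL DEFAULT 'pending',  -- pending | done
            result  TEXT
        )""")

def submit(conn, payload):
    # Called from the website: enqueue a job and return its id
    # so the user can check on it later.
    cur = conn.execute("INSERT INTO job_queue (payload) VALUES (?)", (payload,))
    conn.commit()
    return cur.lastrowid

def run_batch(conn, worker):
    # Called from cron: take at most MAX_PER_PASS pending jobs, oldest
    # first, and run them.  However large the backlog, each pass does
    # the same bounded amount of work.
    rows = conn.execute(
        "SELECT id, payload FROM job_queue WHERE status = 'pending' "
        "ORDER BY id LIMIT ?", (MAX_PER_PASS,)).fetchall()
    for job_id, payload in rows:
        result = worker(payload)          # the "20 minutes or so" of work
        conn.execute(
            "UPDATE job_queue SET status = 'done', result = ? WHERE id = ?",
            (result, job_id))
        conn.commit()

def status(conn, job_id):
    # Called from the website: let the user see which jobs are done
    # and collect their results.
    return conn.execute(
        "SELECT status, result FROM job_queue WHERE id = ?",
        (job_id,)).fetchone()
```

The user-facing behavior described above falls out directly: submit 30 jobs, and each cron pass completes exactly three of them, so the backlog drains at a known rate and the user knows when to check back.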