Re: Executing long perl scripts (>2h) in a browser

by jpl (Monk)
on Jun 28, 2011 at 17:53 UTC ( #911815 )


in reply to Executing long perl scripts (>2h) in a browser

This may be more of a web server problem than a perl problem. We encountered something similar when the perl scripts we were running (successfully) on one web server started failing after we moved to a different server. In our case, it turned out that the new server set some resource limits that were perfectly reasonable for ordinary interactive requests, but too restrictive for "batch" jobs we spawned in response to requests submitted to the server. The original requests completed in a timely fashion, but the batch jobs inherited the limits and either timed out or exceeded a file size limit. Our admins were willing to lift the limits, but the existing limits were useful for preventing accidental resource hogging. So what we did instead was to:

  1. Modify the server source to make the limits soft instead of hard, so they could be raised on a process-by-process basis, and
  2. Invoke the batch jobs via a "wrapper" that removed the limits before executing the batch jobs.
This may or may not be what is behind your timeouts, and you may or may not be able to raise the limits if that is the problem. See the manual pages, if any, for getrlimit and setrlimit. In my environment, with very cooperative system administrators and access to the server source (I think they even "bought back" making the limits soft via a configuration parameter), this worked perfectly.
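
As an illustration of point 2 above, a wrapper along the following lines could sit between the server and the batch job. This is only a sketch, not what we actually ran: it assumes BSD::Resource is installed, that CPU time and file size are the limits that bite, and that the wrapper receives the batch command as its arguments.

    #!/usr/bin/perl
    # Hypothetical wrapper.pl: raise each soft limit to its hard maximum,
    # then exec the real batch job with whatever arguments we were given.
    use strict;
    use warnings;
    use BSD::Resource;

    for my $res (RLIMIT_CPU, RLIMIT_FSIZE) {
        my (undef, $hard) = getrlimit($res);
        # An unprivileged process may raise its own soft limit, but only
        # as far as the hard limit left in place by the server (or CGIWrap).
        setrlimit($res, $hard, $hard)
            or warn "could not raise resource $res: $!";
    }

    exec @ARGV or die "exec @ARGV failed: $!";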

Update: It's starting to come back to me now. What we changed was CGIWrap. See http://cgiwrap.unixtools.org/changes.html, under "New in version 4.0": the --with-soft-rlimits-only option.


Re^2: Executing long perl scripts (>2h) in a browser
by jpl (Monk) on Jun 29, 2011 at 10:45 UTC

    We can return this discussion to the perl domain by asking what a perl script can do if it finds itself with restricted, but modifiable, resource limits. The short answer is BSD::Resource from Jarkko Hietaniemi. It's a nice interface to the getrlimit and setrlimit calls (and a few others that cannot be invoked as perl builtins).

    Resource limitations have to be addressed as subroutine calls, not by something invocable with system(). A child process can (try to) modify its own limits, but it cannot modify the limits of its parent.
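
    A short sketch of the distinction, assuming BSD::Resource is available: the system() call below spawns a shell whose limit change dies with it, while the direct setrlimit call changes the limits of the running perl process itself (and only up to its hard limit).

        use strict;
        use warnings;
        use BSD::Resource;

        # No lasting effect: the shell is a child process, so its new
        # limit disappears when the shell exits.
        system('ulimit -t unlimited');

        # This does work: the running perl process raises its own soft
        # CPU limit, up to (but not beyond) the hard limit.
        my ($soft, $hard) = getrlimit(RLIMIT_CPU);
        printf "CPU limit before: soft=%s hard=%s\n", $soft, $hard;
        setrlimit(RLIMIT_CPU, $hard, $hard)
            or warn "setrlimit(RLIMIT_CPU) failed: $!";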

      Thanks a lot for all your suggestions!

      I checked the indicated resources & the "design" suggestions and came up with the following solution: the web page will just be used to call my script via Win32::Process with the DETACHED_PROCESS option.

      Thanks again, i.
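
      A minimal sketch of that approach, with purely hypothetical paths (perl.exe and long_job.pl below are placeholders) and assuming Win32::Process is installed: the CGI request returns right away while the detached process runs to completion on its own.

          use strict;
          use warnings;
          use Win32;
          use Win32::Process;

          # Placeholder paths: point these at the real perl and script.
          my $perl   = 'C:\\Perl\\bin\\perl.exe';
          my $script = 'C:\\jobs\\long_job.pl';

          my $job;
          Win32::Process::Create(
              $job,
              $perl,
              qq{perl "$script"},                       # command line seen by the child
              0,                                        # do not inherit handles
              DETACHED_PROCESS | NORMAL_PRIORITY_CLASS, # detach from the server process
              '.'                                       # working directory
          ) or die 'Create failed: ' . Win32::FormatMessage(Win32::GetLastError());

          print "Content-type: text/plain\n\n";
          print "Started detached job, PID ", $job->GetProcessID(), "\n";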
