Re^2: cleanup a cancelled CGI script

by zebedee (Pilgrim)
on Jun 08, 2004 at 15:58 UTC ( [id://362520] )



in reply to Re: cleanup a cancelled CGI script
in thread cleanup a cancelled CGI script

I think (on Windows, anyway) you'd need a sweeper (reaper?) process that comes along every hour or so, looks for ZIP files older than, say, one hour, assumes they are incomplete, and deletes them. Alternatively, you could add them to a database, or send them to another process, whose job it is to delete any ZIPs in its list (or database) that are older than x hours.
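A minimal sketch of such a sweeper (standalone, run hourly from cron or the Windows Task Scheduler; the directory path and the one-hour cutoff are placeholders, untested):

    #!/usr/bin/perl
    use strict;
    use warnings;

    my $tmp_dir = '/path/to/tmp';    # hypothetical ZIP staging directory

    opendir my $dh, $tmp_dir or die "Can't open $tmp_dir: $!";
    for my $file (grep { /\.zip$/ } readdir $dh) {
        my $path = "$tmp_dir/$file";
        # -M reports file age in days; over an hour old => assume abandoned
        if (-M $path > 1/24) {
            unlink $path or warn "Can't unlink $path: $!";
        }
    }
    closedir $dh;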

Re^3: cleanup a cancelled CGI script
by zakzebrowski (Curate) on Jun 08, 2004 at 16:06 UTC
    ++ That's what I would've recommended. (Or, similarly, use one of the Win32::Process modules: on startup, put the pid into a database, then, when the pid dies, have a second process clean up the .zip file related to that pid... see the sketch below.)


    ----
    Zak - the office
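    One way to wire up the pid-in-a-database idea (a sketch only; the SQLite table layout and using "kill 0" as a liveness check on Win32 are assumptions, untested):

        use strict;
        use warnings;
        use DBI;

        # Hypothetical table: jobs(pid INTEGER, zip TEXT).
        my $dbh = DBI->connect('dbi:SQLite:dbname=jobs.db', '', '',
                               { RaiseError => 1 });

        # In the CGI script, right after creating the ZIP:
        # $dbh->do('INSERT INTO jobs (pid, zip) VALUES (?, ?)',
        #          undef, $$, $zip_path);

        # In the second (reaper) process, run periodically:
        for my $job (@{ $dbh->selectall_arrayref('SELECT pid, zip FROM jobs') }) {
            my ($pid, $zip) = @$job;
            next if kill 0, $pid;   # owner still running, leave its ZIP alone
            unlink $zip or warn "Can't unlink $zip: $!";
            $dbh->do('DELETE FROM jobs WHERE pid = ?', undef, $pid);
        }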
Re^3: cleanup a cancelled CGI script
by cLive ;-) (Prior) on Jun 08, 2004 at 17:22 UTC
    Why not just do it in the script when it runs?
    my $tmp_dir = "/path/to/tmpdir";
    opendir(TMPDIR,$tmp_dir) || die $!;
    -M "$tmp_dir/$_" > 1/24 and unlink "$tmp_dir/$_"
        for (grep /zip/, readdir(TMPDIR));
    closedir(TMPDIR);
    or similar (untested).

    cLive ;-)

      That's going to result in some concurrency problems.
        In what way? All I'm doing is removing tmp files over an hour old?!?

        cLive ;-)
