file open problem with scheduled jobs
by nmete (Novice) on Jul 21, 2004 at 22:46 UTC
nmete has asked for the
wisdom of the Perl Monks concerning the following question:
I use the Perl Schedule::Cron module to schedule some subroutines. These subroutines perform file operations, and the problem is specifically with the file-open functions: sysopen and open.
When I set the detach parameter ( $cron->run(detach => 1) ), so that the main scheduler loop is detached from the current process (daemon mode), I cannot create or write any files. When the scheduler process is not detached from the current process, there is no problem.
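For what it's worth, the file handling inside the scheduled subroutine looks roughly like this (the sub name and path are made up for illustration). Checking $! on failure should at least reveal *why* open fails in detached mode, and an absolute path rules out the common daemon pitfall of the detached process having chdir()ed away from the original working directory:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical scheduled job: append a timestamp to a log file.
# An absolute path matters here: a detached (daemonized) process
# typically chdir()s to "/", so relative paths stop resolving.
sub write_log {
    my ($file) = @_;
    open my $fh, '>>', $file
        or die "Cannot open $file: $!\n";   # $! carries the OS reason, e.g. "Permission denied"
    print $fh scalar(localtime), "\n";
    close $fh or die "Cannot close $file: $!\n";
    return 1;
}

write_log('/tmp/cron_test.log');   # made-up path for illustration
```

In the real script this sub is registered with Schedule::Cron->new(\&write_log) and started via $cron->run(detach => 1).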
In both cases I attempt to write to the same directory.
I know the PID of the forked scheduler process, and I observe that the process runs as user "nobody" (as usual for the HTTP server user).
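To confirm that observation from inside the detached process itself, a small diagnostic like the following (a sketch using only core Perl; the sub name is made up) can be called from the scheduled job to log the effective user, the current working directory, and whether the target directory is actually writable by that user:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Cwd qw(getcwd);

# Hypothetical diagnostic to run inside the scheduled sub: reports
# who the detached process really is and whether it can write to
# the directory the job targets.
sub diagnose_write_access {
    my ($dir) = @_;
    my $user = getpwuid($>) || $>;   # effective user, e.g. "nobody"
    my @report = (
        "effective user: $user",
        "cwd: " . getcwd(),
        "dir $dir writable: " . (-w $dir ? "yes" : "no"),
    );
    return join("\n", @report);
}

print diagnose_write_access('/tmp'), "\n";   # made-up target directory
```

If the report shows "writable: no" for the job's directory, the failure is a plain Unix permission problem for the "nobody" user rather than anything specific to Schedule::Cron.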
Is this a matter of security/permissions, or what else might cause the file open to fail?
The CGI program runs under Apache on a Solaris system, and I use Perl 5.6.1.