Re^2: cron script best practices
by jimbus (Friar) on Aug 10, 2005 at 21:35 UTC
Here is my crontab:
reports@clarkkent/home/reports(7): crontab -l
00 06 * * * /home/reports/ftp/SMSC0/loadData.pl
00 06 * * * /home/reports/ftp/MMSC1/loadData.pl
20,35,50,05 * * * * /home/reports/ftp/YTSMSC50/loadData.pl
20,35,50,05 * * * * /home/reports/ftp/FDSMSC/loadData.pl
00 06 * * * /home/reports/ftp/proptima/ftp.pl
loadData.pl is the script I'm checking on ("ps -ef|grep loadData|wc -l"). The first two jobs should run once a day at 6am, and the second two every fifteen minutes, which is 96 times each per 24-hour period. I'm assuming the issue is with the second two, which are the same script pointed at different boxes.
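If the every-fifteen-minutes runs are overlapping (one run not finished before the next starts), a common cron-script pattern is a non-blocking lock so a new run exits immediately when the previous one is still going. A minimal sketch, assuming a hypothetical lock path like /tmp/loadData.lock (one per script/box):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(LOCK_EX LOCK_NB);

# Hypothetical lock file; use a distinct path per loadData.pl instance.
my $lockfile = "/tmp/loadData.lock";

open my $lock_fh, '>', $lockfile
    or die "Cannot open $lockfile: $!";

# LOCK_NB makes flock return immediately instead of waiting.
unless (flock $lock_fh, LOCK_EX | LOCK_NB) {
    # Previous run still holds the lock; bail out rather than pile up.
    exit 0;
}

# ... the real loadData.pl work goes here; the lock is released
# automatically when the process exits and $lock_fh is closed ...
```

The lock is advisory, so every instance of the script has to use the same scheme, but it keeps a slow run from accumulating dozens of competitors.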
These scripts digest a log file that is a series of reports from about 12 nodes; each report has between 50 and 225 key/value pairs, one per line. I loop through the nodes, building a hash of the key/value pairs, then build a huge insert from them... with up to 225 columns, the insert is built dynamically.
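For the dynamic insert, one safe way to go from a hash to SQL is to derive the column list and a matching set of placeholders from the hash keys, then hand the values to DBI separately. A sketch with made-up table and column names (the real schema has up to 225 columns):

```perl
use strict;
use warnings;

# Hypothetical sample of one node's key/value pairs.
my %row = ( node => 'SMSC0', msgs_in => 42, msgs_out => 40 );

my @cols = sort keys %row;
my $sql  = sprintf 'INSERT INTO stats (%s) VALUES (%s)',
    join(', ', @cols),
    join(', ', ('?') x @cols);

# With DBI (not loaded here), the execute would look like:
# my $dbh = DBI->connect($dsn, $user, $pass, { RaiseError => 1 });
# $dbh->do($sql, undef, @row{@cols});
print "$sql\n";
```

Using placeholders keeps the statement text identical across rows (so the server can reuse it) and avoids quoting bugs when values contain odd characters.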
I have filled /usr a couple of times, once recently. I thought things would recover afterwards, but instead I end up with all these leftover processes and mysqld running at 60-70% of CPU.
I guess the real issue is that I'm resource-strapped, inexperienced with Perl, and getting a bit overwhelmed by the amount of data being chucked at me. I was hoping to find someone who had documented what it takes to write mature cron/logging scripts :) With Perl, and with JDBC for JSP, I find all kinds of simplest-case examples on the web, but not a lot on what I would think are typical usage patterns.
Never moon a werewolf!