Come for the quick hacks, stay for the epiphanies.
PerlMonks
On my last project, we were using a product that generated code consisting of between five and fifty files, FTP'd them to a mainframe, and then (theoretically) compiled and ran them. The FTP connection was as flaky as the east coast, and kept disconnecting at random points. With larger jobs, you could literally sit there clicking Send for an hour without ever getting all the files to transfer.
I wrote a script that would decipher the control file that determines which files get sent where, open the FTP connection, and send the files. If any error other than a password rejection occurred, it would disconnect, reconnect, and carry on from the file that had failed. No more "fail on the fifth file, retry, fail on the second file", and so on.

The script, the first version of which was less than 50 lines long and took under two hours to write, saved literally thousands of hours of developer time; losing that time would have killed the project. It also significantly improved Perl's reputation in the team. (More honestly, if I hadn't written this script, something else would probably have been done about the FTP connection, but we would have suffered for at least a couple more months.)

Later in the project, I upgraded the script to send from different environments, send multiple sets of source code, generate composite compilation jobs, scan for lines longer than 72 characters, pull all the embedded SQL statements out into files for DBA optimisation, and a few other time-savers.

Historical note: at the time this comment was posted, the east coast of the US was in the middle of a rather severe blizzard, hence the "flaky" quip.

In reply to Re: Your Favorite Heroic Perl Story
by PhilHibbs
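The retry-and-resume loop described above (carry on from the file that failed, rather than restarting the whole batch) can be sketched in a few lines of Perl. This is a hypothetical reconstruction, not the original script: `send_all`, the coderefs, and the mainframe-style file names are all made up for illustration, and a real version would wrap a `Net::FTP` `put` inside the transfer coderef.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical sketch of the resume-on-failure loop. send_all() takes a
# list of files, a coderef that transfers one file and returns true on
# success (e.g. a Net::FTP put), and a coderef that drops and re-opens
# the connection. On failure it reconnects and retries the SAME file,
# instead of starting over from the first one.
sub send_all {
    my ($files, $send_one, $reconnect) = @_;
    my $i        = 0;
    my $failures = 0;
    while ($i < @$files) {
        if ($send_one->($files->[$i])) {
            $i++;                        # success: move on to the next file
        }
        else {
            die "giving up\n" if ++$failures > 100;
            $reconnect->();              # drop and re-open the connection,
        }                                # then retry the file that failed
    }
    return $i;                           # number of files sent
}

# Simulated flaky transfer: each file fails on its first attempt,
# then succeeds, as a stand-in for a dropped FTP connection.
my %seen;
my $sent = send_all(
    [qw(JCL001 COB002 COB003)],
    sub { return $seen{ $_[0] }++ },     # false first time, true after
    sub { },                             # reconnect stub
);
print "sent $sent files\n";              # prints "sent 3 files"
```

The point of the design is that progress is never thrown away: the index `$i` only ever moves forward, so a connection that drops on the fifth file costs one reconnect, not four re-transfers.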