http://www.perlmonks.org?node_id=214238

Just thought I'd share my experience of how Perl saved my butt, as well as many long, painful hours of logging calls.

From Sunday evening at 10pm until Tuesday afternoon at 1pm, no calls were logged into the HelpDesk system from the automated call-logging management system: around 300 calls in total. On Tuesday afternoon all hell broke loose when management wanted to know why no calls were being logged. After about 30 minutes we had sorted out the problem. Management then came up with this beaut: "We want all the calls that were missed logged." What did we need to do to achieve this? Extract all the records that did not have a corresponding call number associated with them, and then log all of those calls manually. Each call takes on average 5 minutes to log, not including the time to extract the details of the missed calls.

297 calls x 5 minutes each = 24.75 hrs... and they wanted it WHEN..?

Wannabe Perl hacker AcidHawk has a flash of brilliance. (I take some battering at the office because I can only write in Perl.) And did Perl ever fit this job. 15 minutes to build a script to extract the relevant records from the management system database (MSSQL). 30 seconds to run it and have a CSV file as output. 5 minutes to build a script (all of 9 lines) to reformat the records in the CSV file into messages that could be replayed into the management system. Maybe 20 seconds to replay all the messages. Total time to have all the calls automatically logged in the HelpDesk system: 1.5 hours.
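
For the curious, a minimal sketch of the extraction step is below. The DSN, table, and column names are made up for this post (I can't reproduce the real schema here), but the shape is just a SELECT for records with no call number, dumped to CSV:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Hypothetical DSN, table, and column names -- the real schema differs.
    my $dbh = DBI->connect('dbi:ODBC:MgmtSystem', 'user', 'password',
                           { RaiseError => 1 });

    # Records received during the outage that never got a HelpDesk call number.
    my $sth = $dbh->prepare(q{
        SELECT record_id, caller, summary, received_at
        FROM   incoming_events
        WHERE  call_number IS NULL
          AND  received_at BETWEEN ? AND ?
    });
    $sth->execute('2002-11-17 22:00', '2002-11-19 13:00');

    open my $csv, '>', 'missed_calls.csv' or die "missed_calls.csv: $!";
    while (my @row = $sth->fetchrow_array) {
        # Crude CSV quoting; Text::CSV would be safer for messy data.
        my @fields = map {
            my $f = defined $_ ? $_ : '';
            $f =~ s/"/""/g;
            qq{"$f"};
        } @row;
        print {$csv} join(',', @fields), "\n";
    }
    close $csv;
    $dbh->disconnect;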

All this was completed before the requested change control was authorised for us to start capturing the calls manually.

This has got to be one of Perl's great strengths: the amazingly short time it takes to build something that works. This fact alone has made me extremely fond of Perl. In fact, some of the folk who give me a difficult time because I can only code in Perl have now asked that I quickly "whack something together" to update all these calls with the reason they were logged late.

-----
Of all the things I've lost in my life, it's my mind I miss the most.

Re: Once AGAIN perl saved my bacon
by valdez (Monsignor) on Nov 20, 2002 at 00:47 UTC

    It was exactly one year ago that my boss fired me because I had words with him. So I started to search for a new job... time was passing by, and after two months I still hadn't found anything... One morning a friend of mine called, saying: "One of our developers ran away three days before the submission date, deleting all his code and leaving only an outdated printed version of his Java app. We must show a working demo to a ministry in three days. Would you help us?"

    I didn't know a word of Java, so I suggested that I could make a fake demo using Perl...

    I'm still working with them, using Perl; some Italian ministries are using that webapp (the working one, of course), my life has changed, and now I have enough spare time to stay and play in the Monastery ;-).

    Ciao, Valerio

Re: (nrd) Once AGAIN perl saved my bacon
by newrisedesigns (Curate) on Nov 19, 2002 at 21:52 UTC

    ++AcidHawk, good work.

    Perl is one of the few tools that let you quickly create and use a solution to one of many problems. Perl's raw execution speed, which many outsiders use as a sticking point in arguments against it, is admittedly somewhat slow. But that slowness is overshadowed by the ability of a user (even a wannabe perl hacker) to quickly fix the problem at hand.

    AcidHawk's tale is a good argument for Perl in the workplace. I'm sure many other monks would be willing to share their "war stories" on how Perl saved them from mindless work.

    John J Reiser
    newrisedesigns.com

Re: Once AGAIN perl saved my bacon
by Anonymous Monk on Nov 20, 2002 at 05:37 UTC

    Here's the story of my first perl program...

    I was talking to a co-worker of mine and she showed me this "project" (busy work) her boss had given her: a client had sent us about 2000 Word documents, one page each, full of tables containing various bits of data. Her boss wanted the information from about 10 different fields on every document transferred to an Excel spreadsheet.

    As if that weren't tedious enough, the Word documents were password-protected: the text couldn't even be selected to copy and paste into a spreadsheet, and it wasn't possible to run a macro on them. She had been manually typing everything into the spreadsheet!

    Now, I had just started to learn Perl - my first programming language, really (besides some BASIC in the '80s) - but I told her I'd give it a shot.

    Since the Word docs were all created from the same template, each field was preceded by the same proprietary Microsoft garbage. I opened some files in a hex editor and figured out some pretty nasty-looking regular expressions to find the fields and write them out to a CSV file. I don't think it took much more than a minute to run through all the Word docs. Being my first program, it took me a few days to get it working well, but the project was still done 2 weeks ahead of schedule. I think my coworker ended up taking all the credit for it, but I got enough satisfaction out of creating something useful that actually worked.
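
    The script boiled down to something like the sketch below. The field names and marker patterns here are made up (the real ones came from staring at the template in the hex editor), but the shape is the same: slurp each document as raw bytes, pull every field out with its regex, and write one CSV row per file.

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Field names and marker patterns below are invented for illustration;
        # the real patterns were worked out from the template's binary layout.
        my @fields = (
            [ name   => qr/NAME\x00+(.*?)\x00/s   ],
            [ date   => qr/DATE\x00+(.*?)\x00/s   ],
            [ amount => qr/AMOUNT\x00+(.*?)\x00/s ],
        );

        open my $out, '>', 'extracted.csv' or die "extracted.csv: $!";
        print {$out} join(',', map { $_->[0] } @fields), "\n";

        for my $doc (glob '*.doc') {
            open my $fh, '<:raw', $doc or die "$doc: $!";
            my $blob = do { local $/; <$fh> };    # slurp the whole file as bytes

            my @row;
            for my $field (@fields) {
                my ($value) = $blob =~ $field->[1];
                $value = '' unless defined $value;
                $value =~ s/"/""/g;               # escape quotes for CSV
                push @row, qq{"$value"};
            }
            print {$out} join(',', @row), "\n";
        }
        close $out;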

    I've been hooked on perl ever since!
Re: Once AGAIN perl saved my bacon
by Ay_Bee (Monk) on Nov 20, 2002 at 11:26 UTC
    I have a similar experience. I am not a programmer, just an interested amateur. My work involves the use of a heavy-duty mainframe database/scheduling system written in COBOL. All data extraction is in the form of printed reports, which can take up to a year to get designed and written by the IT department, if and when funding is approved. My boss needed to combine data from an external source with data contained within the database. Using Perl, and a fair amount of hit-and-miss, learn-as-I-went effort, I was able to extract data from three separate existing mainframe reports and merge it with the data from the external source. The resulting report, printed using simple format commands, has found its way higher up the corporate tree. Now the IT department has a budget and a six-month project plan to replicate in COBOL a report that took me three weeks to write in my spare time. Perl is amazingly versatile, simple when needed, and free... why do they bother??
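    In case it helps anyone, the merge part was essentially the sketch below. The file names, key field, and columns are placeholders (and the real report used Perl's format/write for the layout, where this sketch uses printf to stay short), but the approach is the same: key every report line by a common identifier in a hash, fold in the external data, then print the merged result.

        #!/usr/bin/perl
        use strict;
        use warnings;

        # File names, the key field, and the columns are placeholders; the real
        # report also used Perl's format/write for layout, but printf keeps
        # this sketch short.
        my %jobs;

        # Pull a job id and a status out of each existing mainframe report.
        for my $report (qw(report1.txt report2.txt report3.txt)) {
            open my $fh, '<', $report or die "$report: $!";
            while (<$fh>) {
                my ($id, $status) = /^(\S+)\s+(\S+)/ or next;
                $jobs{$id}{mainframe} = $status;
            }
        }

        # Fold in the external source, keyed on the same identifier.
        open my $ext, '<', 'external.csv' or die "external.csv: $!";
        while (<$ext>) {
            chomp;
            my ($id, $owner) = split /,/;
            $jobs{$id}{owner} = $owner if defined $id;
        }

        # Print the merged report.
        printf "%-14s %-12s %-20s\n", 'Job', 'Mainframe', 'Owner';
        for my $id (sort keys %jobs) {
            printf "%-14s %-12s %-20s\n", $id,
                   $jobs{$id}{mainframe} || '-',
                   $jobs{$id}{owner}     || '-';
        }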
    Ay_Bee -_-_-_-_-_-_-_-_-_-_-_- My memory concerns me - but I forget why !!!
Re: Once AGAIN perl saved my bacon
by Theseus (Pilgrim) on Nov 20, 2002 at 14:04 UTC
    I went through a similar situation two weeks ago at work. Our web server was being brought to its knees, and the execs were starting to become needlessly worried that we were "under attack" by malicious hackers. They thought someone was accessing our server over and over again, rapidly, to bog it down.

    I said, "give me the IIS log file and 5 minutes and I'll give you a sorted list of all the IPs visiting our site and how many time they've accessed it."

    5 minutes and 1 sorted list later, the executives' worries were dismissed, and I had once again proved to my CTO (who frowns on Perl only because he doesn't know it, and neither do the other developers on my team, so if they were to fire me they wouldn't be able to maintain my code) that Perl has a valuable place in our company. If only I could get them to see the light... damn Microsoft sheep.
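
    The whole thing is little more than a hash keyed on client IP. A minimal sketch (this assumes the W3C extended log format with the client IP in the third column; adjust the index for whatever fields are actually being logged):

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Usage: perl iplist.pl ex021118.log
        my %hits;
        while (<>) {
            next if /^#/;                  # skip the W3C header directives
            my @cols = split ' ';
            my $ip   = $cols[2] or next;   # c-ip column in this assumed layout
            $hits{$ip}++;
        }

        # Most active clients first.
        for my $ip (sort { $hits{$b} <=> $hits{$a} } keys %hits) {
            printf "%-15s %d\n", $ip, $hits{$ip};
        }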

    -Theseus
Re: Once AGAIN perl saved my bacon
by sierpinski (Chaplain) on Sep 08, 2009 at 14:25 UTC
    We have multi-million dollar applications running on Solaris servers, and one day one of the systems went haywire. Normally it has something like 16 CPUs with 128 cores and 96 GB of RAM. Half of the system boards died (or so we thought) and the system just screeched to a halt. The swapping and involuntary context switches (icsw) got so bad that processing almost stopped completely. We got the parts replaced and the system back up, but nowhere near soon enough to avoid hefty fines (in the millions) from the government for not having this data available.

    It turned out several of the CPUs had gone offline before, and we had only lost one system board, which is what caused our issue. If we had replaced the failures as they occurred, the system would never have been brought down as badly as it was. We didn't have any monitoring in place to detect failed components, but now we do. I wrote a massive monitoring script (in Perl, of course) that uses the Expect module to connect to each server, run a battery of checks, and email a report to our group twice a day. Now we can find and fix these minor problems before they escalate into major ones, and several of the upper-level executives have been briefed on my work. It's still a work in progress; they are always finding new things for me to check!
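
    A stripped-down sketch of the idea is below. The host names, password handling, and the single check shown are placeholders (the real script runs a whole battery of checks and formats a proper report), but it shows the Expect pattern: spawn ssh, wait for prompts, send commands, capture the output, and mail a summary.

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Expect;

        # Placeholder hosts and credentials; the real script reads these from
        # a config file and runs many more checks per host.
        my @hosts  = qw(app01 app02 db01);
        my $user   = 'monitor';
        my $pass   = 'secret';
        my $report = '';

        for my $host (@hosts) {
            my $exp = Expect->spawn("ssh $user\@$host")
                or do { $report .= "$host: could not spawn ssh\n"; next };
            $exp->log_stdout(0);

            # Log in: answer the password prompt, then wait for a shell prompt.
            $exp->expect(30,
                [ qr/[Pp]assword:/ => sub { shift->send("$pass\n"); exp_continue } ],
                [ qr/[\$%#]\s*$/   => sub { } ],
            );

            # One example check: count off-lined CPUs (the failure that bit us).
            $exp->send("psrinfo | grep -c off-line\n");
            $exp->expect(30, [ qr/[\$%#]\s*$/ => sub { } ]);
            my ($offline) = $exp->before =~ /^\s*(\d+)\s*$/m;

            $report .= sprintf "%-8s off-line CPUs: %s\n",
                               $host, defined $offline ? $offline : 'unknown';

            $exp->send("exit\n");
            $exp->soft_close();
        }

        # Mail the summary to the group; a pipe to sendmail avoids extra modules.
        open my $mail, '|-', '/usr/sbin/sendmail -t' or die "sendmail: $!";
        print {$mail} "To: unix-team\@example.com\n",
                      "Subject: Solaris hardware check\n\n",
                      $report;
        close $mail;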
       /\
      /\/\ Sierpinski