http://www.perlmonks.org?node_id=542770

DigitalKitty has asked for the wisdom of the Perl Monks concerning the following question:

Hi all.

I was wondering if you (yes, you) might regale us with an example of how perl was used in combination with a pipe to creatively / cleverly solve a problem and / or construct an interesting project. Perhaps you work in a scientific laboratory and your program 'imports' research data in real-time or something similar. Perl has been described as a 'glue language'. What have you managed to synergistically connect?

Thanks,
-Katie

Replies are listed 'Best First'.
Re: Perl and Pipes. Share your story.
by tbone1 (Monsignor) on Apr 12, 2006 at 12:19 UTC
    One of the first "official" uses I had for Perl was inspired by US government bureaucracy. A few years ago, someone at the hindquarters of a government agency to which I was contracted decided that they would measure productivity by how many lines of code we wrote. Brilliant! (Include clinking of Guinness glasses here.) After some lobbying, my boss got them to agree that reducing lines of code was often more efficient, so we could get credit for that, too. So I wrote a Perl script that, as part of a check-in/check-out set of shell scripts, expanded C/C++ code to multiple lines, collapsed it to one line (barring HERE documents), or added GNDN code (sleeps, useless for loops, "if (1 == 0)" blocks, etc).
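
    Purely as illustration, a minimal sketch of the "expand" half of such a filter might look like the following (this is not the actual script, and real C/C++ would need a proper parser; this one blithely ignores strings, comments and preprocessor lines):

    #!/usr/bin/perl
    # inflate.pl -- hypothetical sketch: break C/C++ statements one per line,
    # so that "a = 1; b = 2; c = 3;" becomes three lines of billable code.
    use strict;
    use warnings;

    while ( my $line = <STDIN> ) {
        $line =~ s/;[ \t]*(?=\S)/;\n/g;    # start a new line after each ';'
        print $line;
    }

    Hooked into the check-in script as  inflate.pl < foo.c > foo_padded.c , with a matching line-joining filter at check-out, the "productivity" numbers take care of themselves.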

    Naturally, I wasn't the only one to have thought of this; my boss had as well and we got together to combine all our tricks of the trade. We were suddenly producing more code than the other 200 people in the lab combined because we were expanding and compressing code like a g(un)?zip command gone rogue. When the folks at hindquarters discovered this, and why, they abandoned the idea.

    In some ways, it was my proudest moment: I actually got US government bureaucrats to admit, albeit implicitly, that they had made a mistake because they didn't understand something. It was tantamount to getting a marketing executive to admit he doesn't know how to do anything but BS and play golf.

    --
    tbone1, YAPS (Yet Another Perl Schlub)
    And remember, if he succeeds, so what.
    - Chick McGee

Re: Perl and Pipes. Share your story.
by BrowserUk (Patriarch) on Apr 12, 2006 at 10:08 UTC

    Not Perl. When the idea of using Perl for the project was mooted, I asked to see some examples of Perl code and they gave me three shortish (Perl 4) scripts to look at. I spent no more than 5 minutes reading them and rejected the idea out of hand as write-only code. I'm not sure whether they were particularly bad examples of the Perl 4 art-form or not. I just knew that I could make some sense of the source code for most languages within a few minutes, and this stuff looked like nothing I had ever seen before. The longer I stared at it, the less sense it made. Now that I understand a little about how Perl works, I know that I would much, much, much rather do the project in Perl (Perl 5, that is; I still know nothing about Perl 4) than the mish-mash of shell script (csh), awk and REXX that we ended up doing it in, but I had to make a decision, quickly, and I said no.

    The project was big, 600 servers servicing 40,000 PCs, and we were in the pilot phase of the project under EU tender rules. It was critical to the project that we met the target dates and all of the agreed (and exhaustively documented) pilot-phase criteria. Any failures, no matter how insignificant to the overall project goals, or how readily understood and forgiven by the client, risked the whole project being scrapped and having to re-enter the complex tender procedure. It wasn't enough to have the client accept and sign off on any failures or omissions to prevent that. It would also have been necessary to have all our competitors, whom we had beaten out at the tendering stage, sign off on those failures.

    The theory being that if we won the tender on price, but then came up short of the specifications, then the tender process could be deemed unfair. If our competitors had known that they would not have to meet certain aspects of the specification, they might have been able to reduce their price and might, therefore, have won over us. It was critical that we did not give our competitors that possibility.

    The application, running on 3 of the 600 servers for the pilot, was pushing out updates to approximately 1000 desktop PCs located across 8 buildings with 4 to 10 floors each. The criteria called for "unattended operation" and "95% successful completion". Basically, a diskette was delivered by internal mail to each PC owner, and they were instructed to power down their (LAN-connected) PCs and leave the diskette in drive A:. At the allotted time (the early hours of Sunday morning, usually), the server was given a list of Ethernet addresses and would send a Wake-on-LAN sequence to make the machines boot from the floppy, which would initiate the upgrade. The servers would serve the updates on demand and perform extensive logging of all the details of the transactions to a central logging server.

    The problem was, machines would fail to boot because the floppy was missing or corrupt; or the user had initiated shutdown, switched off the screen and gone home, leaving an application asking for a close-or-save confirmation; or a myriad of other trivial but show-stopping "human errors". The contract allowed support personnel to intervene manually to rectify such human errors, but with so many machines spread across so many floors of so many buildings, and with the only allowed monitoring point being what was effectively a tail -f on the central logging server, it became almost impossible to keep track of how machines were progressing, which ones were stalled, and where those machines were located.

    The suppliers of the software we were using proposed a 3-month, six-figure solution to the problem, but we only had 10 days.

    My (our) solution was to insert a tee between the logging daemon and the file, and then build an application that used pipes, sort, awk and ANSI escape sequences to display a real-time, blow-by-blow "event log" of the transactions as they progressed. Sorting the display by the timestamps allowed us to place the machines taking the longest at the top of the display, so when one failed to respond and complete a given step in a timely manner, it would filter to the top.

    Armed with the knowledge of the failing machine's Ethernet address, we still had the problem of locating the physical machine--the inventory tag records were years out of date, with machines having been moved, NIC cards swapped, and a myriad of other unrecorded changes to the officially recorded positions.

    The solution to that proved to be remarkably simple. We discovered that by updating the router tables to block inbound TCP packets from the failing machines, we caused a network quality monitor DD, installed years before, to repeatedly display an error message on the machine. Part of that message contained some "\a"s. It is surprising how far those simple beeps travelled in an empty office building at 2 or 3 in the morning!

    We positioned a support guy every couple of floors in each building, and they had only to listen for the beeps, track them to their source, and then do whatever it took to get that machine up and running again.

    It didn't help with every failure--hung machines, failures to boot, etc.--but it allowed us to find and fix enough of the failures to comply with the 95% rule without incurring a huge and untimely re-development cost, or breaking the non-interactivity clauses of the unattended-operations missive.

    The whole thing was a shell pipeline of logapp | tee logfile | sort ... | awk. An amazingly simple and powerful concept.
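
    For PerlMonks flavour only: BrowserUk's actual display stage was sort and awk, but a hypothetical Perl stand-in for it might look something like the sketch below. The log format isn't shown above, so the assumed field layout ("epoch-seconds ethernet-address step ...") is invented:

    #!/usr/bin/perl
    # monitor.pl -- hypothetical stand-in for the sort/awk display stage
    use strict;
    use warnings;

    $| = 1;
    my %last;    # ethernet address => [ last timestamp seen, last step seen ]

    while ( my $line = <STDIN> ) {
        my ( $ts, $mac, $step ) = split ' ', $line;
        next unless defined $step;
        $last{$mac} = [ $ts, $step ];

        print "\e[2J\e[H";    # ANSI: clear screen, cursor home
        # oldest "last seen" first, so stalled machines float to the top
        for my $m ( sort { $last{$a}[0] <=> $last{$b}[0] } keys %last ) {
            printf "%-17s  %10d  %s\n", $m, @{ $last{$m} };
        }
    }

    Dropped in as  logapp | tee logfile | perl monitor.pl , the permanent log is untouched while the live display keeps the quietest machines at the top of the screen.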


    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    Lingua non convalesco, consenesco et abolesco. -- Rule 1 has a caveat! -- Who broke the cabal?
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.
Re: Perl and Pipes. Share your story.
by MCS (Monk) on Apr 12, 2006 at 12:59 UTC

    One of my first major achievements in Perl was to take a number of recipes (as in cooking) that my uncle had typed up into a Word file and pipe the output of antiword to my custom Perl script, which parsed the recipes and put them into my database. (Antiword just takes a Word file and outputs it in a text-only format.) Sure, it's probably not the best example, but it was quick, it was easy, and it got me hooked on Perl.
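
    A minimal sketch of that kind of pipeline, assuming (purely for illustration) that each recipe starts with a title line in ALL CAPS, could be as small as this; the real recipe layout and the database-loading code aren't shown here:

    #!/usr/bin/perl
    # parse_recipes.pl -- hypothetical sketch.  Run as:
    #     antiword recipes.doc | parse_recipes.pl
    use strict;
    use warnings;

    my ( $title, @body );
    while ( my $line = <STDIN> ) {
        chomp $line;
        if ( $line =~ /^[A-Z][A-Z '\-]+$/ ) {        # a new recipe title
            save_recipe( $title, \@body ) if defined $title;
            ( $title, @body ) = ($line);
        }
        elsif ( $line =~ /\S/ ) {
            push @body, $line;
        }
    }
    save_recipe( $title, \@body ) if defined $title;

    sub save_recipe {
        my ( $name, $lines ) = @_;
        # the original script inserted into a database; printing stands in here
        print "$name: ", scalar @$lines, " lines\n";
    }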

Re: Perl and Pipes. Share your story.
by liverpole (Monsignor) on Apr 12, 2006 at 18:17 UTC
    My previous company built audioconferencing equipment, and, as toolsmith for the SQA group, I had written programs to automatically schedule large numbers of audioconferences, both to test the scheduling APIs and to set up overnight stress tests of the audioconferencing equipment.

    When the company decided to move away from its flat-file database system to using Informix and SQL, I knew these tools would have to be rewritten to schedule conferences using the new database.  Furthermore, a colleague and friend of mine was facing the onerous task of having to write a testplan for the quality assurance of the new database, and, as he and I had both never done any SQL programming, he wasn't sure where to begin.

    So I created a set of database interface scripts in Perl, as well as a couple of common modules, as an easy way of organizing reusable sets of data, so that we could create hundreds of thousands of conferences on-the-fly for testing both the performance of the new database and the audioconferencing bridge itself.  A number of the scripts would pipe data between themselves; for example, the script that wrote out conferencing parameters in "bulk-load" format had to pipe its data to the script which called the stored procedures for actually loading that data into the database.

    Even the consultant whom we had hired to design the schema for the new database was impressed with how quickly Perl allowed new sets of functionality to be folded into the test environment.  The project, which had started as a simple way of helping me do my job (and helping my colleague get a handle on how to design his testplans), ended up becoming a major project, an informal product for use by a number of customers, and an integral part of the SQA group's suite of tools.


    s''(q.S:$/9=(T1';s;(..)(..);$..=substr+crypt($1,$2),2,3;eg;print$..$/
Re: Perl and Pipes. Share your story.
by DrHyde (Prior) on Apr 12, 2006 at 09:31 UTC
Re: Perl and Pipes. Share your story.
by jesuashok (Curate) on Apr 12, 2006 at 10:10 UTC
    Hi

    I still remember my first project, which I developed using C and Perl. The project was for the telebanking domain, and in it we used Unix pipes very effectively.

    We had a compiled C executable and a separate Perl script. The C executable talked directly to the hardware we used for the telebanking project, but the Perl code was not written to talk to the hardware. That is where the Unix pipes came in: Perl writes all the commands into the pipe, and the C program reads the commands from the pipe and passes them on to the hardware. In this way the pipe acts as an interface between the C program and the Perl program. That software really worked well.
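
    The Perl side of such a setup can be sketched in a few lines; the program name ./hwctl and the command strings below are made up for illustration, since the original commands aren't described:

    #!/usr/bin/perl
    # send_cmds.pl -- hypothetical sketch of the Perl half of the arrangement.
    # The C program (here './hwctl') reads commands from its stdin and talks
    # to the hardware; Perl only has to write down the pipe.
    use strict;
    use warnings;

    open my $hw, '|-', './hwctl'
        or die "cannot start hardware controller: $!";

    print {$hw} "DIAL 5551234\n";          # invented commands, for illustration
    print {$hw} "PLAY balance_prompt\n";

    close $hw or warn "hwctl exited with status $?";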

    "Keep pouring your ideas"
Re: Perl and Pipes. Share your story.
by mattr (Curate) on Apr 13, 2006 at 03:15 UTC
    I don't know if the Gimp's perl-fu server counts as a pipe or not. But a few years ago I had a job dropped in my lap: make a thousand-page website with some thousands of photos in it. The website would show photo galleries of the interiors of the various hotels of a hotel group, but there were only two weeks to do it, and it had to be a static site - no databases. And there would be last-minute changes.

    I took a week or so to figure out perl-fu and crunch some page templates, making a Perl program that would pull photos off a CD, do lots of photo processing in the Gimp, and add code into templates for things like mouseover highlighting of thumbnails. The processing was worked out in a Perl shell utility and then copied, step by step (adding a frame with alpha-channel masking, for instance), into the program. I also decided to make the thumbnails rotate in a circular fashion as you surfed through the site; that wasn't too hard either. The entire thing culminated in a run of 5 minutes for the photo processing and a few more minutes to write out a thousand static web pages.
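
    The template half of such a run is plain Perl; a stripped-down, hypothetical sketch (the template name, placeholder syntax and directory layout are invented, and the Gimp perl-fu image processing is a separate step) might be:

    #!/usr/bin/perl
    # make_pages.pl -- hypothetical sketch of writing out static gallery pages
    use strict;
    use warnings;

    my $template = do {
        open my $fh, '<', 'gallery.tmpl' or die "gallery.tmpl: $!";
        local $/;
        <$fh>;
    };

    opendir my $dh, 'thumbs' or die "thumbs: $!";
    my @photos = sort grep { /\.jpe?g$/i } readdir $dh;
    closedir $dh;

    my $n = 0;
    for my $photo (@photos) {
        ( my $page = $template ) =~ s/__PHOTO__/$photo/g;
        open my $out, '>', sprintf( 'page_%03d.html', ++$n )
            or die "cannot write page: $!";
        print {$out} $page;
    }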

    The interesting part was watching the run. The Gimp would open photos in windows, resize them and close them with blinding speed, as if a phantom graphic designer were sitting at the keyboard. I picked my jaw up off the floor and called the system Magic Hands. I haven't had much use for it since then (though I proposed it to someone recently), but it proved very useful at the time when, as I had predicted, last-minute changes came in that would have been impossible to handle otherwise. Another 5-minute run and those changes were rolled into the site easily. Call it self-fulfilling, but it proved to me again that Perl is not just glue but also an enabling tool. When I bet on Perl, it seldom lets me down.

Re: Perl and Pipes. Share your story.
by izut (Chaplain) on Apr 12, 2006 at 22:47 UTC

    As someone said, if gluing is in the game, I can tell you my story :) We are using a Perl application to read Postfix's log file 'on-the-fly' to extract data like the user id of the sender, the size of the message, etc., and then execute some tasks, like throwing the processed data into MySQL for billing and blocking - if the number of messages sent by a user is suspicious and looks like spamming.
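
    A stripped-down sketch of that kind of log reading, purely for illustration (the billing and blocking logic is left out, and only the from= and size= fields of Postfix qmgr lines are picked up):

    #!/usr/bin/perl
    # maillog_watch.pl -- hypothetical sketch of reading the Postfix log on-the-fly
    use strict;
    use warnings;

    open my $log, '-|', 'tail', '-F', '/var/log/mail.log'
        or die "cannot tail mail log: $!";

    while ( my $line = <$log> ) {
        next unless $line =~ /postfix\/qmgr.*?from=<([^>]*)>.*?size=(\d+)/;
        my ( $sender, $size ) = ( $1, $2 );
        # the real application updates MySQL counters here and decides
        # whether the sender's volume looks like spamming
        print "$sender\t$size\n";
    }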

    Another nice use is reading Tarantella's login.log file and checking when a user logged in, in order to control the number of allowed clients (and kill the session if the limit is exceeded), since the product doesn't do that itself.

    Igor 'izut' Sutton
    your code, your rules.

Re: Perl and Pipes. Share your story.
by punkish (Priest) on Apr 12, 2006 at 19:10 UTC
    pipe no, glue yes (does sniffing glue in a pipe count?). This is what I did --

    «emailed job request»
           |
           |
           V
    «MS-Exchange Server»
           |
           +-------------> our favorite language
           |
           V
    «MS-SQL Server»
           |
           |
           V
    «Permit tracking application»
    
    --

    when small people start casting long shadows, it is time to go to bed
Re: Perl and Pipes. Share your story.
by EvanCarroll (Chaplain) on Apr 12, 2006 at 21:38 UTC
    Loaded up pgsql with a pipe just today. Perl csv-preprocessor!
    ./col_proc.pl ./Models.txt 1,5..8 | psql -d vin -c '\copy chrome.model (id,fkey_subdivision,name,date_effective,comment) FROM STDIN CSV QUOTE AS '~''

    ./col_proc.pl delivers the functionality of filtering certain columns out of a CSV; because Postgres lacks that functionality natively, any columns you want to use must be located at the end of the CSV.
    I.e., if your CSV has (foo,bar,baz) and your SQL table has (bar,baz), you can't selectively take just those columns from the CSV. So I made ./col_proc.pl to emit only the columns asked for explicitly.
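
    The idea can be sketched with Text::CSV in a dozen lines or so; this is only an illustration, not the real ./col_proc.pl (it ignores the '~' quoting used above, for one thing):

    #!/usr/bin/perl
    # col_proc.pl -- hypothetical sketch: emit only the requested CSV columns
    # Usage:  ./col_proc.pl file.csv 1,5..8
    use strict;
    use warnings;
    use Text::CSV;

    my ( $file, $spec ) = @ARGV;
    my @want = map { $_ - 1 }                              # 1-based -> 0-based
               map { /(\d+)\.\.(\d+)/ ? ( $1 .. $2 ) : $_ }
               split /,/, $spec;

    my $csv = Text::CSV->new( { binary => 1, eol => "\n" } );
    open my $in, '<', $file or die "$file: $!";
    while ( my $row = $csv->getline($in) ) {
        $csv->print( \*STDOUT, [ @{$row}[@want] ] );
    }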


    Evan Carroll
    www.EvanCarroll.com
Re: Perl and Pipes. Share your story.
by rvosa (Curate) on Apr 14, 2006 at 05:06 UTC
    Here's something about pipes in genome analysis.
      That might be the (perl) understatement of the week! This is not meant as a criticism of rvosa by any means because it is a great example of an application of perl and pipes.

      However, the link is to a reprint of a very nice article from The Perl Journal (RIP) entitled "How Perl Saved the Human Genome Project".

      "Something about pipes" indeed!

Re: Perl and Pipes. Share your story.
by fizbin (Chaplain) on Apr 16, 2006 at 00:12 UTC

    I'm constantly using perl one-liners in pipes when investigating things; as such, the usage doesn't usually get written down and I can only come up with a few examples in my bash_history file:

    svn log -vv -r 21224 | perl -nle 'm|Eclipse/(.*)| and print $1' > /tmp/filesChanged.txt
    A simple checklist of stuff to look at generated by a certain subversion commit.

    Then there's this longer example that requires some setup. See, we have in our source code a bunch of DTD files that are replicated into several different directories. Yeah, we should keep them in one central location, and we're moving there, but right now we don't. Anyway, occasionally a DTD file changes, and the changes are copied to its various versions, but every once in a while we miss something. That's where this use of perl in the midst of pipes comes in.

    find * -name '*.dtd' | xargs md5sum > /tmp/dtdmd5s.txt
    perl -ple 's{([^*]*) [*](.*/)([^/]*)$}{$2$3 $3 $1}' /tmp/dtdmd5s.txt | \
        sort -k2 | uniq -c -f 1 | \
        perl -nle '@a=split;if($a eq $a[2]){print $b;print}$b=$_;$a=$a[2]'
    The output that produces is nothing when all the copies of a given dtd file are in sync. When they aren't, I get something like this: (names tweaked to protect... um ... someone)
     1 clients/megacorp/cmeta/smeta.dtd smeta.dtd 81e2470da0cc593e1146556d0fe0ebed
    41 base/classes/com/mycomp/dtd/smeta.dtd smeta.dtd e287f8786ecf7906f3019d3cc9accff7
    This tells me that there are 42 copies of this DTD in our source, and 41 of them agree with the one stored under "base", but the one for megacorp is different. I then look manually at the two versions. I keep the md5sums in a temp file because sometimes it isn't just one spot that has the rogue DTD, but two or three, and I can just grep through the temp file to find the locations of the rogue DTD.
    --
    @/=map{[/./g]}qw/.h_nJ Xapou cets krht ele_ r_ra/; map{y/X_/\n /;print}map{pop@$_}@/for@/