PerlMonks
Re: Your Favorite Heroic Perl Story

by jimbojones (Friar)
on Jan 22, 2005 at 23:56 UTC ( #424300=note )


in reply to Your Favorite Heroic Perl Story

This may not be 'heroics'; it's more of a Perl success story.

I did my post-graduate work at a large particle accelerator laboratory outside of Chicago. We were looking for a very small asymmetry between the rates of a certain decay mode of a certain type of particle and of its anti-particle. These decay events are very rare, and we needed a few million of them to see the asymmetry. The background was on the order of trillions of decay events.

To sort through the decays, we built a three-level "trigger" to see the events in the detector. The first two levels, which cut out 99% of the decays, were hardware-based. The "Level 3 trigger" did software event reconstruction and pattern matching on the decays, and tagged the event as a possible decay of interest. Those decay events were written to tape.

Now, not all the events written to tape were candidates for the asymmetry measurement. Some were for other physics modes, some were for calibration of the detector, etc. However, they were all written to tape in the order that they occurred, and so were interspersed on the raw data tapes. To do the data analysis, as different groups wanted to study different samples, we had to do an "offline split" of the tapes. The split was based on the information that the Level 3 trigger wrote into the event data header. The process:

  • Read 10 or so raw data tapes. Add data to disk files based on the event tag.
  • Once one tape's worth of data was on disk (for the various samples), write data to a new tape, specific to that sample.
  • Repeat for all 3000 20 GB input tapes.
The job took about 4 months of baby-sitting.
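The core of that split step can be sketched in a few lines of Perl. This is a hypothetical reconstruction, not the original code: the trigger tags, stream names, and data structures are all invented for illustration. The idea is simply to route each event to a per-sample spool based on the tag the Level 3 trigger wrote into its header.

```perl
#!/usr/bin/perl
# Hypothetical sketch of the offline split: route each event record to a
# per-sample spool based on the tag in its header. Tags and stream names
# are invented for illustration.
use strict;
use warnings;

# Map Level 3 trigger tags to output sample streams (illustrative tags).
my %stream_for = (
    SIGNAL  => 'asymmetry',
    CALIB   => 'calibration',
    MINBIAS => 'other_physics',
);

sub split_events {
    my ($events) = @_;    # array ref of { tag => ..., payload => ... }
    my %spool;            # stream name => array ref of payloads
    for my $ev (@$events) {
        my $stream = $stream_for{ $ev->{tag} } or next;   # drop unknown tags
        push @{ $spool{$stream} }, $ev->{payload};
    }
    return \%spool;       # in real life: append to disk files, then to tape
}
```

In the real job, each spool would be appended to disk files and, once a tape's worth had accumulated, written out to a sample-specific tape.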

This process finished by 1997. Then the lab received more funding, and it was decided that we would run the experiment again in 1999 for more data. I was deep into my thesis by then, but my advisor asked if I could look at streamlining the "split" process to do it online in real time. With two post-doctoral fellows, we wrote a Perl-based caching scheme to do the split on the fly. Now the Level 3 software wrote the data events to a disk array. We had a Perl daemon that monitored the disk as the data files were being written. Once it knew we had a full tape's worth of data, it spawned a child process to ask the scientists on shift to mount a tape and click a few buttons, and that data was sent to a tape based on its event type. One post-doc wrote the daemon, I wrote the tape-writing job, and the third guy handled some of the UI components. It took us about 3 weeks, mostly because we didn't know Perl at the time.
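The daemon's central decision can be sketched like so. Again, this is an illustrative reconstruction under assumed conventions (a `<sample>.<run>.dat` file-naming scheme and a 20 GB threshold), not the original code:

```perl
#!/usr/bin/perl
# Hypothetical sketch of the daemon's core question: given the event files
# Level 3 has written so far, which samples have accumulated a full tape's
# worth of data? Naming convention and threshold are assumptions.
use strict;
use warnings;

# Files assumed to be named <sample>.<run>.dat; sizes in bytes.
sub ready_samples {
    my ($files, $tape_bytes) = @_;    # hash ref of filename => size
    my %bytes;
    for my $name (keys %$files) {
        my ($sample) = $name =~ /^([^.]+)\./ or next;
        $bytes{$sample} += $files->{$name};
    }
    return grep { $bytes{$_} >= $tape_bytes } sort keys %bytes;
}

# The real daemon would loop forever: stat the spool directory, call
# ready_samples(), and fork a child per ready sample to prompt the shift
# crew for a tape mount and stream that sample's files out to tape.
```

The fork-a-child design keeps the monitoring loop responsive while a slow, operator-dependent tape write is in progress.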

Perl fully saved the day, with its easy filesystem access and text handling (used to parse which data files were on disk). All told, we saved about 3000 20 GB data tapes. As I recall, they went for $20 a pop. With what they pay grad students and post-docs, it was a 10-fold return on investment. It also saved the next set of grad students 4 months waiting for an "offline" split.


Re^2: Your Favorite Heroic Perl Story
by trammell (Priest) on Jan 23, 2005 at 00:52 UTC
    Hey Jimbo! Were you on NuTeV? I was on DONUT, out PWest way. Had to do the same thing with the FNAL tape libraries. Was one of my first big Perl projects, written in Perl 4.
      Hi,

      Nope, KTeV. Wrote it in Perl 5.

      Nice to see other physicists, or ex-physicists, out here.

      Seems like a long time ago ...

      - jim
