in reply to Your Favorite Heroic Perl Story
This may not be 'heroics'; it's more of a Perl success story.
I did my post-graduate work at a large particle accelerator laboratory outside of Chicago. We were looking for a very small asymmetry in the number of a certain decay mode of a certain type of particle versus its anti-particle. These decay events are very rare, and we needed a few million of them to see the asymmetry. The background number of decays was on the order of trillions of events.
To sort through the decays, we built a three-level "trigger" to select events in the detector. The first two levels, which cut out 99% of the decays, were hardware-based. The "Level 3 trigger" did software event reconstruction and pattern matching on the decays, and tagged each event as a possible decay of interest. Those decay events were written to tape.
Now, not all the events written to tape were candidates for the asymmetry measurement. Some were for other physics modes, some were for calibration of the detector, etc. However, they were all written to tape in the order that they occurred, and were interspersed on the raw data tapes. To do the data analysis, as different groups wanted to study different samples, we had to do an "offline split" of the tapes. The split was based on the information that the Level 3 trigger wrote into the event data header. The process:
- Read 10 or so raw data tapes. Add data to disk files based on the event tag.
- Once one tape's worth of data was on disk (for the various samples), write data to a new tape, specific to that sample.
- Repeat for all 3,000 20 GB input tapes.
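The split step above is essentially a demultiplexing pass: read each event, look at the tag the Level 3 trigger wrote into its header, and append it to the output stream for that sample. A minimal sketch in Perl, with an illustrative in-memory event format (the real event structure, tag names, and tape I/O are not from the original):

```perl
use strict;
use warnings;

# Hypothetical sketch of the offline split: bucket events by the tag
# that the Level 3 trigger wrote into each event's data header.
# In production, events came off raw data tapes and each bucket would
# be flushed to its own per-sample tape once a tape's worth was on disk.
sub split_events {
    my ($events) = @_;    # arrayref of { tag => ..., data => ... }
    my %buckets;          # sample tag => arrayref of event payloads
    for my $event (@$events) {
        push @{ $buckets{ $event->{tag} } }, $event->{data};
    }
    return \%buckets;
}

# Illustrative run over a few fake events:
my $buckets = split_events([
    { tag => 'asymmetry',   data => 'evt1' },
    { tag => 'calibration', data => 'evt2' },
    { tag => 'asymmetry',   data => 'evt3' },
]);
printf "%d asymmetry, %d calibration\n",
    scalar @{ $buckets->{asymmetry} },
    scalar @{ $buckets->{calibration} };
```

Because events of all samples are interleaved on the input tapes, a single sequential pass with per-tag output buffers like this avoids re-reading any tape more than once.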
Replies are listed 'Best First'.
Re^2: Your Favorite Heroic Perl Story
by trammell (Priest) on Jan 23, 2005 at 00:52 UTC
by jimbojones (Friar) on Jan 23, 2005 at 14:52 UTC
In Section: Meditations