
Re: Dirtiest Data

by ptum (Priest)
on Jun 12, 2006 at 21:22 UTC (#554905=note)

in reply to Dirtiest Data

I think that nearly every time I have agreed to parse a variably-formatted data file, I have regretted it. My current practice for handling real-world problems like this is as follows:

Customer: I have this data file from system X that I need to provision into system Y.

Me: Does it have a standardized format that every record follows, either fixed-length columns, XML, or some kind of delimited columns?

Customer: (lying through teeth) Yes, of course it does!

Some time passes while I discover his web of lies ...

Me: You weaselly liar! Your data is just a mish-mash of arbitrary values in a variety of formats! You expect me to make sense of this?

Customer: Well, it actually can come in these N variations ... except when it doesn't.

Me: OK. Tell you what. I'll write a script that will provision your data into system Y assuming it passes my validation routine. I'll write a validation routine separately, and any records that fail the validation will be displayed on a handy web report here (I specify some internal website).

Customer: OK, sounds great.

Some time passes and the customer (who never checked the web report) eventually discovers that only 5% of their data passes the validation and that they have no control over system X's output.

Customer: Hey, none of my data is being provisioned! The developer for system X says it will be 6-8 years before he can change his output. Can't you loosen up your validation routine so my data gets provisioned?

Me: No, because that would cause system Y to fail in a variety of ways, and would simply pass the buck.

Customer: Waaaah!

Having pity, I quietly find out where system X stores its data and build a script to acquire it in a standardized format. I provision the data with minimal validation problems and everyone is happy, except the programmer for system X, who is ultimately laid off when it is discovered that his system is non-essential and he is non-responsive. The end.
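The validation gate described in the story above might be sketched roughly like this. The record layout (id, name, date), the comma delimiter, and the specific rules are all hypothetical stand-ins; the point is that validation is a separate routine, clean records go on to provisioning, and rejects are collected for the report rather than silently massaged through:

```perl
use strict;
use warnings;

# Hypothetical validator: returns undef for a clean record,
# or a human-readable reason string for the web report.
sub validate_record {
    my ($line) = @_;
    my @fields = split /,/, $line, -1;
    return "expected 3 fields, got " . scalar @fields unless @fields == 3;
    my ($id, $name, $date) = @fields;
    return "bad id '$id'"     unless $id   =~ /^\d+$/;
    return "empty name"       unless length $name;
    return "bad date '$date'" unless $date =~ /^\d{4}-\d{2}-\d{2}$/;
    return;    # undef: record is clean
}

# Sample input standing in for system X's output.
my @input = (
    '1001,Alice,2006-06-12',
    'oops,Bob,2006-06-12',
    '1003,,2006-06-12',
);

my (@clean, @rejects);
for my $line (@input) {
    if (my $err = validate_record($line)) {
        push @rejects, "$line ($err)";   # these feed the web report
    } else {
        push @clean, $line;              # these are safe to provision
    }
}

printf "%d clean, %d rejected\n", scalar @clean, scalar @rejects;
print "REJECT: $_\n" for @rejects;
```

Keeping the validator a plain function with a reason string makes it easy to tighten or document rules without touching the provisioning code, and the reject list is exactly what ends up on the report the customer never reads.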

Seriously, it often simply exacerbates the problem when we use Perl's power to cover up and pander to sloppy upstream programming or improper data input. Getting your data clean is often a 'pay me now or pay me later' situation ... and the cost when you pay later can be astronomical.

Just an opinion, take it or leave it. :)

Replies are listed 'Best First'.
Re^2: Dirtiest Data
by freddo411 (Chaplain) on Jun 13, 2006 at 17:51 UTC
    That is a classic, classic post.

    I am particularly fond of the approach of "find out the DB the data is coming from" and go get it from there.

    Computers are so much easier to talk to than business people.

    Nothing is too wonderful to be true
    -- Michael Faraday
