PerlMonks
Custom DBD error message

by packetstormer (Monk)
on Nov 12, 2012 at 15:59 UTC
packetstormer has asked for the wisdom of the Perl Monks concerning the following question:

Hello Monks

I have a small Perl script that parses a CSV file, builds a hash and then, depending on certain conditions, inserts some or all of the data into a MySQL table.
I have a unique, multi-column index on the MySQL table because I know the end user will run this script over the same data multiple times, and I need to make sure we don't insert the same data twice.

My problem is that, although this all works fine, the message the script sends back to the end user is something like:

DBD::mysql::st execute failed: Duplicate entry '167423-192.168.160.153-b8:12:xx:yy:zc-2012-11-10 11:21:43' for key 'idx_unique_session'

While this is fine for me, the end user won't have a clue what is going on!
Does anyone have any tricks for trapping this error, comparing it, and, if it matches, replacing it with a user-friendly message, e.g.
"Duplicate data inserted, have you processed this file already?"
Thanks
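The trap-and-translate idea described above can be sketched like this. The `insert_row` sub is hypothetical; in real code it would be a `$sth->execute(...)` call with `RaiseError` enabled, and here a plain `die` with the error text from the question stands in for the failing insert.

```perl
use strict;
use warnings;

# Hypothetical stand-in for $sth->execute(@fields) failing under RaiseError.
sub insert_row {
    die "DBD::mysql::st execute failed: Duplicate entry '...' "
      . "for key 'idx_unique_session'\n";
}

eval { insert_row() };
if ( my $err = $@ ) {
    if ( $err =~ /Duplicate entry/ ) {
        print "Duplicate data inserted, have you processed this file already?\n";
    }
    else {
        die $err;    # anything unexpected: re-throw unchanged
    }
}
```

The `eval` catches the exception so the script can inspect `$@` and decide whether to substitute a friendlier message or re-throw.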

Re: Custom DBD error message
by mje (Deacon) on Nov 12, 2012 at 16:04 UTC

    Use HandleError. See the DBI docs. You can set it to a sub that gets called when an error occurs; you can choose to ignore the error or even change the text.
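A minimal sketch of the HandleError approach. The connection details are hypothetical (shown only in the comment); the handler itself is plain Perl, demonstrated here against the exact error text from the question rather than a live MySQL connection.

```perl
use strict;
use warnings;

# This sub is what you would pass as the HandleError attribute to
# DBI->connect, e.g.:
#   my $dbh = DBI->connect( $dsn, $user, $pass,
#       { RaiseError => 1, HandleError => $friendly_handler } );
my $friendly_handler = sub {
    my ( $message, $handle, $return_value ) = @_;
    if ( $message =~ /Duplicate entry .* for key/ ) {
        die "Duplicate data inserted, have you processed this file already?\n";
    }
    return 0;    # any other error: fall through to normal RaiseError behaviour
};

# Demonstration with the message from the question:
my $raw = "DBD::mysql::st execute failed: Duplicate entry "
  . "'167423-192.168.160.153-b8:12:xx:yy:zc-2012-11-10 11:21:43' "
  . "for key 'idx_unique_session'";
eval { $friendly_handler->($raw) };
print $@;    # the friendly message
```

Returning false from the handler lets non-duplicate errors propagate normally, so `RaiseError` still does its job for anything you did not anticipate.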

Re: Custom DBD error message
by sweetblood (Parson) on Nov 12, 2012 at 19:10 UTC
    The problem is not the error, and not necessarily your code, although it seems odd that you say you are loading the contents of the CSV file into a hash and are still getting dupes. This is a data issue: if you expect to pass data to a MySQL table that has defined keys, those keys must be unique.

    You can de-dup your data by loading it into a hash whose keys match the ones you use in your table, which means you will lose records that may or may not matter; it's hard to tell, since you have been vague about what you are doing. If you need all the records, then you must determine what makes a record unique and build your keys accordingly.

    Sweetblood
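A short sketch of the de-dup-via-hash idea above. The field layout is an assumption based on the unique index in the question (id, IP, MAC, timestamp); a `%seen` hash keyed on the joined fields keeps only the first occurrence of each composite key.

```perl
use strict;
use warnings;

# Sample rows; the columns mirror the composite unique index from the
# question and are assumptions, not the OP's real data.
my @rows = (
    [ 167423, '192.168.160.153', 'b8:12:xx:yy:zc', '2012-11-10 11:21:43' ],
    [ 167423, '192.168.160.153', 'b8:12:xx:yy:zc', '2012-11-10 11:21:43' ],  # dupe
    [ 167424, '192.168.160.154', 'aa:bb:cc:dd:ee', '2012-11-10 11:22:00' ],
);

# Keep a row only the first time its composite key is seen.
my %seen;
my @unique = grep { !$seen{ join '-', @$_ }++ } @rows;

printf "%d unique of %d rows\n", scalar @unique, scalar @rows;
```

Inserting `@unique` instead of `@rows` means re-running the script over the same file never even attempts the duplicate inserts.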
