http://www.perlmonks.org?node_id=11112122


in reply to Re^4: DBD::Oracle::st fetchrow_hashref failed: ORA-25401
in thread RESOLVED - DBD::Oracle::st fetchrow_hashref failed: ORA-25401

I had a similar issue with a "notes" field, and a workaround that seemed to work well was to convert the field and cap its length:
SELECT convert(VARCHAR(8000), a.Notes) as Notes ...
In my case, Notes was a VARCHAR(MAX) column, and my error message was "out of memory". (My database was SQL Server, accessed through the Sybase driver.) You might try this, or see what happens if you use:
convert(VARCHAR(1), a.Notes)
That would at least tell you whether the error is coming from some crap data in that field, assuming the crap data isn't in the first character. The above is "worth a shot", but I wouldn't be surprised if it doesn't reveal the problem.
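To make the diagnostic idea above concrete, here's a sketch of stepping the cap up until the error reappears, which narrows down roughly where the problem data starts. The table name my_table is a placeholder; the alias a and the Notes column are taken from the snippets above.

SELECT CONVERT(VARCHAR(1),    a.Notes) AS Notes FROM my_table a;  -- first character only
SELECT CONVERT(VARCHAR(100),  a.Notes) AS Notes FROM my_table a;  -- widen the cap
SELECT CONVERT(VARCHAR(8000), a.Notes) AS Notes FROM my_table a;  -- full workaround cap

If the one-character version succeeds and a wider cap fails, the offending data lies somewhere between the two lengths.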

Re^6: DBD::Oracle::st fetchrow_hashref failed: ORA-25401
by perldigious (Priest) on Feb 04, 2020 at 19:57 UTC

    Thanks, TieUpYourCamel. This was a great suggestion. I had to use CAST instead, since my employer's Oracle DB doesn't recognize convert, but it seems to fix the issue.

    CAST(note as CHAR(30)) as note

    Looking at all the unique values, the country is always contained in the first 30 characters anyway, so I'm only throwing away junk this way, and it doesn't seem to hang anymore. :-)

    Just another Perl hooker - My clients appreciate that I keep my code clean but my comments dirty.