citycrew has asked for the wisdom of the Perl Monks concerning the following question:

I have to check many lines of a file against records in a database. As an example, I have a list of names in a plain text file, one per line, and names in a field of a PostgreSQL database. I need to filter the names in the file against those in the database. There may be anywhere from 10,000 to 10,000,000 names in the file, and anywhere from 1,000 to 2,000,000 names in the database.

I am currently making the following SQL call inside the file loop:

    my $statement = "select name_field from table where name_field = '$NAME' limit 1";
    my $sth = $dbh->prepare($statement);
    $sth->execute or die $sth->errstr;
    my @row_ary = $sth->fetchrow_array;
    if (@row_ary) {
        $sth->finish;
        return 1;
    }
    else {
        $sth->finish;
        return 0;
    }

It is very slow and uses a lot of CPU. I have read over the DBD::Pg documentation looking for a better way to use prepare/bind/execute statements, but couldn't fully understand it. I know I can clean up the current code (and will), but would like some direction before doing so.
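For the prepare/bind/execute route, the usual DBI pattern is to prepare one statement with a `?` placeholder outside the loop and execute it once per name; the statement handle is reused and the driver takes care of quoting. A minimal sketch, assuming `$dbh` is your connected handle and reusing the table/column names from the snippet above:

```perl
# Prepare once, before the file loop. The '?' placeholder is bound
# at execute() time, so the same statement handle serves every name.
my $sth = $dbh->prepare(
    "SELECT 1 FROM table WHERE name_field = ? LIMIT 1"
);

# Called once per line of the file.
sub name_in_db {
    my ($name) = @_;
    $sth->execute($name) or die $sth->errstr;
    my $found = $sth->fetchrow_array ? 1 : 0;
    $sth->finish;
    return $found;
}
```

This removes the per-name prepare cost, though each lookup is still a round trip to the server.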

Any help is much appreciated.
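Given the sizes involved, if the database side fits in memory (2,000,000 short strings usually does), an alternative that avoids SQL inside the loop entirely is to fetch the name list once and build a hash. A runnable sketch, with stand-in data in place of the real query and file:

```perl
use strict;
use warnings;

# Stand-in for the database names; in the real script this would be
# something like:
#   my $db_names = $dbh->selectcol_arrayref("SELECT name_field FROM table");
my @db_names = ('alice', 'bob', 'carol');

# Build the lookup hash once: membership tests are then O(1) per name,
# with no per-line round trip to the server.
my %in_db = map { $_ => 1 } @db_names;

# Stand-in for the lines read from the file, already chomped.
my @file_names = ('alice', 'dave', 'carol');

# Keep only the file names that also appear in the database.
my @matches = grep { $in_db{$_} } @file_names;
print join(",", @matches), "\n";   # prints: alice,carol
```

Whether to hash the database side or the file side depends on which is smaller; the same `map`/`grep` pattern works either way.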