Matching 50,000 of anything against a database is going to be a pain. I would do a single SELECT, draw the whole table into a hash keyed on id, then compare your array against that hash instead of the database. This trades memory for disk and CPU time; for this much data it is probably worth it, but it's not a good idea when the table contains millions of records.
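A sketch of what I mean, using a made-up `users` table with an `id` column (here demoed against an in-memory SQLite database via DBD::SQLite, but any DBI driver works the same way):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Demo setup: an in-memory SQLite table. The table and column names
# (users, id) are invented for illustration -- substitute your own.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1 });
$dbh->do('CREATE TABLE users (id INTEGER PRIMARY KEY)');
$dbh->do("INSERT INTO users (id) VALUES ($_)") for 1 .. 10;

# One SELECT: slurp every id into a hash for O(1) membership tests.
my %in_table = map { $_ => 1 }
               @{ $dbh->selectcol_arrayref('SELECT id FROM users') };

# Now compare your array against the hash, not the database.
my @ids     = (3, 7, 42);
my @found   = grep {  $in_table{$_} } @ids;   # ids present in the table
my @missing = grep { !$in_table{$_} } @ids;   # ids absent from the table
```

After the one query, each lookup is a hash probe, so the 50,000 comparisons never touch the database again.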
in reply to Comparing an array to a DBI table
Any faster scheme would depend on properties of the table; for instance, if the user ids are integers with long gap-free runs of consecutive values, you could store just the start and end of each run, and so on.
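For instance, the run-compression idea might look like this: collapse a sorted id list into [start, end] pairs, then test membership with a binary search over the runs (helper names here are my own invention):

```perl
use strict;
use warnings;

# Compress a sorted list of integer ids into [start, end] runs.
sub to_runs {
    my @runs;
    for my $id (@_) {
        if (@runs && $id == $runs[-1][1] + 1) {
            $runs[-1][1] = $id;        # extend the current run
        } else {
            push @runs, [ $id, $id ];  # start a new run
        }
    }
    return @runs;
}

# Binary-search the runs for membership.
sub in_runs {
    my ($id, @runs) = @_;
    my ($lo, $hi) = (0, $#runs);
    while ($lo <= $hi) {
        my $mid = int(($lo + $hi) / 2);
        if    ($id < $runs[$mid][0]) { $hi = $mid - 1 }
        elsif ($id > $runs[$mid][1]) { $lo = $mid + 1 }
        else                         { return 1 }
    }
    return 0;
}

my @runs = to_runs(1 .. 100, 200 .. 250, 400);
print in_runs(42,  @runs), "\n";   # 1
print in_runs(150, @runs), "\n";   # 0
```

With long gap-free sequences, the memory cost drops from one hash entry per id to two integers per run, at the price of an O(log n) lookup instead of O(1).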