If fetching 1000 rows with DBI takes 1 second, and comparing 1000 rows using Data::Compare takes a half-second, and then you sleep for two seconds to avoid grinding your network and servers to a halt, it will take you about 40 days to compare a billion rows (if my guesstimations are anywhere near correct). If fetching 1000 rows only takes half that long, and comparing only takes half that long, and you sleep for only one second between iterations, you're down to 20 days. :)
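To make the batching concrete, here's a minimal sketch of that fetch/compare/sleep loop. The DSNs, credentials, table name, and the numeric primary key column `id` are all hypothetical placeholders -- adjust for your own schema. It assumes both tables can be paged through in the same key order:

```perl
#!/usr/bin/perl
use strict;
use warnings;

use DBI;
use Data::Compare;   # exports Compare()

my $batch_size = 1000;
my $sleep_secs = 2;  # throttle between batches

# Hypothetical connection details -- substitute your own.
my $dbh_a = DBI->connect('dbi:mysql:db_a', 'user', 'pass', { RaiseError => 1 });
my $dbh_b = DBI->connect('dbi:mysql:db_b', 'user', 'pass', { RaiseError => 1 });

# Keyset pagination: cheaper than OFFSET on big tables.
my $sql   = 'SELECT * FROM big_table WHERE id > ? ORDER BY id LIMIT ?';
my $sth_a = $dbh_a->prepare($sql);
my $sth_b = $dbh_b->prepare($sql);

my $last_id = 0;
while (1) {
    $sth_a->execute($last_id, $batch_size);
    $sth_b->execute($last_id, $batch_size);

    my $rows_a = $sth_a->fetchall_arrayref({});   # arrayref of hashrefs
    my $rows_b = $sth_b->fetchall_arrayref({});

    last unless @$rows_a;                         # side A exhausted

    # Deep-compare the whole batch; a mismatch here means you'd
    # drop down to row-by-row comparison to find the culprit.
    warn "mismatch in batch after id $last_id\n"
        unless Compare($rows_a, $rows_b);

    $last_id = $rows_a->[-1]{id};
    sleep $sleep_secs;                            # be kind to network and servers
}
```

Comparing a whole batch at once and only drilling into individual rows on a mismatch keeps the Data::Compare overhead near the half-second-per-1000-rows figure above; shrinking the sleep is the easiest knob if 40 days is too long.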
DBIx::Compare exists for some databases. It may not be the fastest solution, but it's ready to use. A quick look through its documentation didn't turn up any throttling options, so you'll probably want to find some way of preventing excessive load yourself.