
Re: Recursive search for duplicate files

by sh1tn (Priest) on Nov 27, 2007 at 13:35 UTC

in reply to Recursive search for duplicate files

A much better way is to use MD5 digests for file comparison.
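
A minimal sketch of that full-file MD5 approach, using the core File::Find and Digest::MD5 modules; the starting directory taken from the command line is an assumption, not something from the original post:

#!/usr/bin/perl
use strict;
use warnings;
use File::Find;
use Digest::MD5;

# Index every regular file under the given directory by its MD5 digest.
my $dir = shift @ARGV || '.';
my %by_md5;

find(sub {
    return unless -f $_;
    open my $fh, '<', $_ or return;
    binmode $fh;
    my $digest = Digest::MD5->new->addfile($fh)->hexdigest;
    push @{ $by_md5{$digest} }, $File::Find::name;
}, $dir);

# Any digest shared by two or more files marks a duplicate group.
for my $digest (sort keys %by_md5) {
    my @files = @{ $by_md5{$digest} };
    next if @files < 2;
    print "$digest:\n", map { "  $_\n" } @files;
}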

Replies are listed 'Best First'.
Re^2: Recursive search for duplicate files
by moritz (Cardinal) on Nov 27, 2007 at 13:41 UTC
    If used naively, that doesn't work out well for large files, because they have to be read from disc entirely.

    If you care about performance, you might want to hash only the first 5% (or the first 1k, or whatever) and check for collisions; only when those partial digests collide do you need to look at the entire file.
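
    A rough sketch of that partial-hash pass (the 4 KB chunk size is an arbitrary assumption, not a value from the thread):

    use strict;
    use warnings;
    use Digest::MD5 qw(md5_hex);

    # Cheap first pass: hash only the first $chunk bytes of the file.
    sub partial_md5 {
        my ($path, $chunk) = @_;
        $chunk ||= 4096;    # assumed chunk size
        open my $fh, '<', $path or return;
        binmode $fh;
        read $fh, my $buf, $chunk;
        return md5_hex($buf);
    }

    # Full-file digest, only needed when two partial digests collide.
    sub full_md5 {
        my ($path) = @_;
        open my $fh, '<', $path or return;
        binmode $fh;
        return Digest::MD5->new->addfile($fh)->hexdigest;
    }

    Files whose partial digests differ cannot be identical, so full_md5 only runs on the survivors of the first pass.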

      Agreed. Another performance-minded measure is to compare file sizes first, before hashing anything.
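
      A short sketch of putting the size check first, assuming @paths already holds the candidate file list (that variable is hypothetical):

      # Bucket paths by size; only files of equal size can be duplicates,
      # so hashing is deferred until a bucket holds more than one file.
      my %by_size;
      push @{ $by_size{ -s $_ } }, $_ for grep { -f $_ } @paths;

      for my $size (keys %by_size) {
          my @group = @{ $by_size{$size} };
          next if @group < 2;
          # hash (partially, then fully) only within this same-size group
      }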
