
Re: Delete duplicate data in file

by ptum (Priest)
on Nov 21, 2005 at 04:51 UTC

in reply to Delete duplicate data in file

I don't really like having to depend on the files being sorted. One alternative way to remove duplicate data is to use a hash to temporarily hold your data. You can read in the data from the files, place it in a hash, and then (eventually) write it back out again. Since hash keys are unique, any later duplicate row simply overwrites the earlier entry, so you end up with exactly one copy of each unique element.
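A minimal sketch of that idea (the subroutine names and the ordered variant are my own illustration, not from the original post): each line becomes a hash key, so duplicates collapse automatically. Note that a plain hash does not preserve input order; if order matters, track first occurrences with `grep` instead.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Collapse duplicates by using each line as a hash key.
# Order of the returned lines is unspecified.
sub unique_lines {
    my @lines = @_;
    my %seen;
    $seen{$_} = 1 for @lines;    # a duplicate just overwrites its entry
    return keys %seen;
}

# Same idea, but keeps the first occurrence of each line in input order.
sub unique_lines_ordered {
    my @lines = @_;
    my %seen;
    return grep { !$seen{$_}++ } @lines;
}
```

For whole files you would read the lines in, pass them through one of these, and write the result back out, e.g. `print $out "$_\n" for unique_lines_ordered(@lines);`.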

Re^2: Delete duplicate data in file
by pg (Canon) on Nov 21, 2005 at 05:05 UTC

    The question is not whether you want to "depend on the files being sorted", but whether the file is in fact sorted.

    If it is (for example, it could be some sort of log), then holding the entire file in memory is a waste.
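A minimal sketch of the point above, assuming the input really is sorted (the subroutine name is my own): duplicates are then adjacent, so only the previous line needs to be kept in memory rather than a hash of the whole file.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Stream a sorted file, printing each line only when it differs from
# the line before it. Memory use is constant regardless of file size.
sub print_unique_sorted {
    my ($in, $out) = @_;    # input and output filehandles
    my $prev;
    while ( my $line = <$in> ) {
        print {$out} $line unless defined $prev && $line eq $prev;
        $prev = $line;
    }
}
```

This is essentially what the Unix `uniq` utility does, and like `uniq` it only removes duplicates that are adjacent, which is why the sortedness assumption matters.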
