
Re: Re: sorting large data

by simeon2000 (Monk)
on Jul 23, 2002 at 16:24 UTC ( #184472 )

in reply to Re: sorting large data
in thread sorting large data

But how would you do this kind of sort if you don't have access to GNU sort? I have pondered this for a while and have not come up with any efficient solution. Certainly loading a 45MB text file into RAM is not the answer.

"Falling in love with map, one block at a time." - simeon2000

Replies are listed 'Best First'.
Re: sorting large data
by Abigail-II (Bishop) on Jul 23, 2002 at 16:32 UTC
    Well, if you don't have access to GNU sort, you can always try one of the many other implementations of Unix sort.... ;-).

    Anyway, you would do what Unix sort does. Split the data into chunks of a size you can swallow (how much that is varies from system to system). Sort each chunk and store it in a temporary file. Now you have a bunch of sorted files - and you have to merge them. You might even have to do this recursively if there are too many files to merge at once.
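    The split/sort/merge approach above can be sketched in Perl. This is a minimal illustration, not production code: the chunk size is an arbitrary assumption, and the merge uses a linear scan over the run heads, which is fine for a handful of runs (a real implementation might use a heap for many runs).

    ```perl
    use strict;
    use warnings;
    use File::Temp qw(tempfile);

    my $CHUNK_LINES = 100_000;   # illustrative; tune to available memory

    sub external_sort {
        my ($in_fh, $out_fh) = @_;

        # Phase 1: read chunks, sort each in memory, spill to temp files.
        my (@runs, @chunk);
        while (my $line = <$in_fh>) {
            push @chunk, $line;
            if (@chunk >= $CHUNK_LINES) {
                push @runs, spill(\@chunk);
                @chunk = ();
            }
        }
        push @runs, spill(\@chunk) if @chunk;

        # Phase 2: k-way merge - repeatedly emit the smallest head line.
        my @heads = map { scalar <$_> } @runs;
        while (1) {
            my $min;
            for my $i (0 .. $#heads) {
                next unless defined $heads[$i];
                $min = $i if !defined $min || $heads[$i] lt $heads[$min];
            }
            last unless defined $min;
            print {$out_fh} $heads[$min];
            $heads[$min] = scalar readline($runs[$min]);
        }
    }

    # Sort one chunk and write it to a rewound temp file.
    sub spill {
        my ($chunk) = @_;
        my $fh = tempfile();
        print {$fh} sort @$chunk;
        seek $fh, 0, 0;
        return $fh;
    }
    ```

    With `$CHUNK_LINES` small enough that a 45MB file produces several runs, memory use stays bounded by the chunk size plus one buffered line per run.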

    Read Knuth if you want to know everything about merge sort.


      Short of going out and buying books (with a limited cash supply), is there anywhere on the web with this sort of information?
Re: Re: Re: sorting large data
by Fletch (Bishop) on Jul 23, 2002 at 16:29 UTC

    Any good algorithms book should cover sorting. See Knuth volume 3 (Sorting and Searching), or Orwant et al., Mastering Algorithms with Perl.
