Re: "Out of memory" problem

by rpnoble419 (Pilgrim)
on Nov 30, 2012 at 22:04 UTC ( [id://1006527] )


in reply to "Out of memory" problem

This is why databases were created. Load the data into an SQLite database and then you can sort it any way you want.
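
A minimal sketch of that approach using DBI with DBD::SQLite (the file names and the one-column schema are assumptions for illustration, not from the original post):

    use strict;
    use warnings;
    use DBI;

    # Open (or create) an on-disk SQLite database; 'numbers.db' is a
    # hypothetical file name for this sketch.
    my $dbh = DBI->connect( "dbi:SQLite:dbname=numbers.db", "", "",
        { RaiseError => 1, AutoCommit => 1 } );

    $dbh->do("CREATE TABLE IF NOT EXISTS nums (n INTEGER)");

    # Bulk-load the integers, one per line, from the big input file.
    $dbh->begin_work;    # one big transaction keeps the load fast
    my $ins = $dbh->prepare("INSERT INTO nums (n) VALUES (?)");
    open my $fh, '<', 'big_input.txt' or die "open: $!";
    while ( my $line = <$fh> ) {
        chomp $line;
        $ins->execute($line);
    }
    close $fh;
    $dbh->commit;

    # Let SQLite do the sorting instead of Perl sorting in RAM.
    my $sth = $dbh->prepare("SELECT n FROM nums ORDER BY n");
    $sth->execute;
    while ( my ($n) = $sth->fetchrow_array ) {
        print "$n\n";
    }
    $dbh->disconnect;

The point is that the ORDER BY happens inside SQLite, which can spill to temporary disk storage when the data will not fit in memory, so Perl never has to hold all 500 million values at once.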

Re^2: "Out of memory" problem
by BrowserUk (Patriarch) on Dec 01, 2012 at 00:57 UTC

    That's my 'Dumb Answer of the Week' (make that 'Month') winner!


    With the rise and rise of 'Social' network sites: 'Computers are making people easier to use every day'
    Examine what is said, not who speaks -- Silence betokens consent -- Love the truth but pardon error.
    "Science is about questioning the status quo. Questioning authority".
    In the absence of evidence, opinion is indistinguishable from prejudice.

    RIP Neil Armstrong

Re^2: "Out of memory" problem
by sundialsvc4 (Abbot) on Dec 03, 2012 at 04:50 UTC

    Agree with BrowserUk ... 500 million integers is a lot to index, and if you aren’t searching for anything, it’s pure overhead to get “sorted answers” that way. But a good external sorting package would have no particular difficulty.
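
    For instance, GNU sort(1) is itself an external merge sort: it spills sorted runs to temporary files and merges them, so it is not limited by RAM. A sketch, assuming a Unix-like system and a hypothetical one-integer-per-line input file (the -S buffer-size flag is a GNU extension):

        use strict;
        use warnings;

        # -n sorts numerically; -S caps sort's in-memory buffer, beyond
        # which it writes sorted runs to temp files and merges them.
        system( 'sort', '-n', '-S', '512M',
                '-o', 'sorted_output.txt', 'big_input.txt' ) == 0
            or die "external sort failed: $?";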

    Ideally, you would arrange the whole data-processing flow that includes this file so that everything is put into a known sort sequence early, and each subsequent step keeps it that way. So you might have a 500-million-record master file which is simply “known to be” sorted, and you manipulate that file in ways that both require and preserve that ordering. This avoids searching, it avoids repetitive sorting, and it avoids indexes and their overhead. At the same time, though, you do not want to schlep a bunch of data through disk reads and disk writes if you are not actually doing anything with most of it.
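
    As a sketch of that “keep it sorted” discipline: two already-sorted inputs can be combined into a sorted output in one streaming pass that never holds more than one line from each file in memory (the file names here are hypothetical):

        use strict;
        use warnings;

        # Merge two numerically sorted files into one sorted output.
        open my $in_a, '<', 'sorted_a.txt' or die "open: $!";
        open my $in_b, '<', 'sorted_b.txt' or die "open: $!";
        open my $out,  '>', 'merged.txt'   or die "open: $!";

        my $x = <$in_a>;
        my $y = <$in_b>;
        while ( defined $x and defined $y ) {
            if ( $x <= $y ) { print {$out} $x; $x = <$in_a>; }
            else            { print {$out} $y; $y = <$in_b>; }
        }
        # Drain whichever input still has lines left.
        while ( defined $x ) { print {$out} $x; $x = <$in_a>; }
        while ( defined $y ) { print {$out} $y; $y = <$in_b>; }
        close $_ for $in_a, $in_b, $out;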

    Obviously, RAM is the fastest resource and it avoids I/O entirely ... provided that virtual-memory swapping is not going on, which can be a killer. Your strategy depends entirely on your situation, and sometimes you can get a lot of mileage simply by chopping a large file into smaller chunks so that each one does fit in the RAM that you have without swapping.
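
    A sketch of that chunking strategy (the chunk size is an assumption; tune it so each batch sorts comfortably within physical RAM):

        use strict;
        use warnings;

        # Read the big file a batch at a time, sort each batch in RAM
        # with Perl's built-in sort, and write it out as its own
        # pre-sorted chunk file.
        my $chunk_lines = 10_000_000;
        my $chunk_no    = 0;

        open my $in, '<', 'big_input.txt' or die "open: $!";
        until ( eof($in) ) {
            my @batch;
            while ( @batch < $chunk_lines
                    and defined( my $line = <$in> ) ) {
                push @batch, $line;
            }
            @batch = sort { $a <=> $b } @batch;
            open my $out, '>', sprintf( 'chunk_%04d.txt', $chunk_no++ )
                or die "open: $!";
            print {$out} @batch;
            close $out;
        }
        close $in;

    The pre-sorted chunks can then be combined with a streaming merge like the one above, or with sort's merge mode (sort -n -m chunk_*.txt), which reads them in parallel without re-sorting.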

    The key here is ... “without swapping.” If you are doing high-volume processing “in memory” to avoid I/O, but push the limit so that you start to swap, then not only is “I/O going on,” it can be of a particularly murderous kind.
