in reply to Perl Program - Out of memory?
A few things come to mind at first glance:
- You have several identical instances of sort keys %someHash inside nested loops. If the list of keys is big, especially very big, each of those sorts builds a complete sorted copy of the key list, and it gets rebuilt on every pass through the enclosing loop, which can consume a lot of memory (and time). Yet, at first glance, I do not see anything in the accompanying logic that requires the sort in order to produce correct results. If you do need the ordering, sort each key list once, before the loops, and reuse it; see the sketch just after this list.
- It occurs to me that much of this logic appears to be similar to what might be obtained, at least in part, from an SQL INNER JOIN query. You’ve got a lot of nested loops here, none of which seem to be doing much more than setting up for the next inner loop. And if it’s combinations that you want, even ordered ones (ORDER BY), then SQL is already fantastic at doing exactly that sort of thing. How much of this nested-loop logic might be replaceable by a single SQL query, involving several JOINs, which would present all of those combinations to you naturally as part of its result set?
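To make the first point concrete, here is a minimal before-and-after sketch. The hash names (%outer, %inner) are invented stand-ins for your real structures; the idea is simply to sort each key list once, outside the loops, rather than re-sorting on every iteration:

```perl
use strict;
use warnings;

# Invented stand-in hashes; substitute your real data structures.
my %outer = ( a => 1, b => 2 );
my %inner = ( x => 1, y => 2 );

# Before: re-sorts %inner's keys on every pass through the outer loop.
for my $o ( sort keys %outer ) {
    for my $i ( sort keys %inner ) {
        # ... per-pair logic ...
    }
}

# After: sort each key list once, up front, and reuse it.
my @outer_keys = sort keys %outer;
my @inner_keys = sort keys %inner;   # drop the sort entirely if order
                                     # doesn't affect your results
for my $o (@outer_keys) {
    for my $i (@inner_keys) {
        # ... same per-pair logic ...
    }
}
```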
Pursuing that last thought: I frankly do suspect that a lot of this nested-loop logic could indeed be expressed as a single query which might well produce many thousands of rows as a cross product of several smaller constituent tables. (“And so what? That’s what SQL engines do for a living.”) This could rather drastically reduce both the complexity and the memory footprint of your code, which would then only have to consume a record set as it is handed to it, one row at a time.
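As a rough sketch of what that could look like, assuming SQLite via the standard DBI and DBD::SQLite modules, and with invented table and column names standing in for your actual data:

```perl
use strict;
use warnings;
use DBI;

# Hypothetical: table and column names are invented; the tables are
# assumed to already exist in the combos.db file.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=combos.db', '', '',
                        { RaiseError => 1 } );

# One query replaces the nested loops: the engine produces every
# (a, b, c) combination as the JOIN's result, already ordered.
my $sth = $dbh->prepare(q{
    SELECT a.name, b.name, c.name
      FROM table_a AS a
      JOIN table_b AS b ON b.a_id = a.id
      JOIN table_c AS c ON c.b_id = b.id
     ORDER BY a.name, b.name, c.name
});
$sth->execute;

# Stream one row at a time; the full result set never has to fit
# into Perl's memory at once.
while ( my $row = $sth->fetchrow_arrayref ) {
    my ( $a_name, $b_name, $c_name ) = @$row;
    # ... the per-combination logic that used to live in the
    #     innermost loop goes here ...
}
$dbh->disconnect;
```

The key point is that fetchrow_arrayref hands you one row at a time, so the many-thousands-of-rows cross product lives inside the database engine and never has to sit, all at once, in your Perl process.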
Note also that “SQL” doesn’t have to imply “a server.” The SQLite database system, for instance, stores each database in a single ordinary file, and it runs quite nicely on everything from mainframes to cell phones. My essential notion here is that maybe you can shove “all that data” out of (virtual ...) memory and into a file, letting the database engine pull it back on demand.
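A minimal sketch of that last idea, again with an invented hash and schema, showing one of the big in-memory structures being parked in an SQLite file instead of RAM:

```perl
use strict;
use warnings;
use DBI;

# Invented example data and schema; substitute your real structure.
my %big_hash = ( apple => 'red', pear => 'green' );

# A single-file SQLite database; no server required.
my $dbh = DBI->connect( 'dbi:SQLite:dbname=data.db', '', '',
                        { RaiseError => 1, AutoCommit => 0 } );
$dbh->do('CREATE TABLE IF NOT EXISTS items ( k TEXT PRIMARY KEY, v TEXT )');

# Park the hash on disk; each row now lives in the file, not in RAM.
my $ins = $dbh->prepare('INSERT OR REPLACE INTO items ( k, v ) VALUES ( ?, ? )');
while ( my ( $k, $v ) = each %big_hash ) {
    $ins->execute( $k, $v );
}
$dbh->commit;      # one transaction keeps thousands of inserts fast
$dbh->disconnect;
```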