Re^3: Noob could use advice on simplification & optimization

by temporal (Pilgrim)
on May 04, 2012 at 14:25 UTC ( [id://968927] )


in reply to Re^2: Noob could use advice on simplification & optimization
in thread Noob could use advice on simplification & optimization

When a recursive directory search slurps every file it encounters, hitting some large binary file tucked away in a subdirectory will pull quite a bit into memory. It really gets bad when a file is larger than your system's memory (or, more accurately, the memory available to the Perl process).

The easiest way to avoid this is to read the files line by line. But you could also write a smart read routine that buffers your reads in a limited-length array, giving you something of the best of both worlds.
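
Something along these lines is what I have in mind (untested; the file name, pattern, and buffer size are just placeholders):

use strict;
use warnings;

my $file    = 'path/to/file.txt';   # placeholder path
my $pattern = qr/needle/;           # placeholder search term

# Line by line: only one line is held in memory at a time.
open my $fh, '<', $file or die "Can't open $file: $!";
while ( my $line = <$fh> ) {
    print "$file: $line" if $line =~ $pattern;
}
close $fh;

# ...or buffer a limited number of lines and work on them in batches:
open $fh, '<', $file or die "Can't open $file: $!";
my @buffer;
while ( my $line = <$fh> ) {
    push @buffer, $line;
    if ( @buffer >= 1000 ) {        # arbitrary buffer size
        # process the batch here, then empty the buffer
        @buffer = ();
    }
}
close $fh;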

The other advantage of slurping the file is that you avoid splitting it on newlines into an array.
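
For comparison, a slurp looks roughly like this (again untested, placeholder names):

use strict;
use warnings;

my $file = 'path/to/file.txt';              # placeholder path

open my $fh, '<', $file or die "Can't open $file: $!";
my $contents = do { local $/; <$fh> };      # undef $/ so the read is not split on newlines
close $fh;

print "found it\n" if $contents =~ /needle/;   # placeholder search term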

You might want to add some filename filtering so the user can exclude/include certain file types.
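
A rough sketch of that with File::Find (the include/exclude patterns and start directory are only examples):

use strict;
use warnings;
use File::Find;

my $include = qr/\.(?:txt|pl|pm)$/i;    # example: only search these extensions
my $exclude = qr/\.(?:jpg|png|zip)$/i;  # example: skip obvious binaries

find(
    sub {
        return unless -f $_;            # plain files only
        return if     $_ =~ $exclude;
        return unless $_ =~ $include;
        print "searching $File::Find::name\n";
    },
    'path/to/start/dir',                # placeholder starting directory
);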

Strange things are afoot at the Circle-K.

