Re: Re: Super Duper Inefficient

by Earindil (Beadle)
on Jun 20, 2003 at 13:19 UTC [id://267560]


in reply to Re: Super Duper Inefficient
in thread Super Duper Inefficient

Would opening and closing files thousands of times be more of a hit than opening them just 80 or so times?
If generating 5-day graphs, @DATA would be approximately 288*5*80 (= 115,200) lines.
Would it perhaps be better to open all the files before the loop and then close them all afterwards?

Re^3: Super Duper Inefficient
by particle (Vicar) on Jun 20, 2003 at 13:26 UTC

    Sure, you could open all the files first and store the filehandles in a hash, keyed by server name. Depending on the size of the records, you could also process the data in one loop, store it in memory, and write it all out in a second loop. There are many ways to do it. You might try benchmarking a few to see what works best for you.
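
    A minimal sketch of the filehandle-in-a-hash approach described above, using hypothetical server names, file paths, and record layout (the field names and output locations in the real script will differ):

        #!/usr/bin/perl
        use strict;
        use warnings;

        # Hypothetical sample of @DATA: "servername timestamp value" per line.
        my @DATA = (
            "web01 1056115200 42",
            "db01  1056115200 17",
            "web02 1056115500 99",
        );

        # Open each server's output file once, before the loop,
        # and keep the handles in a hash keyed by server name.
        my %fh;
        for my $server (qw(web01 web02 db01)) {
            open $fh{$server}, '>>', "$server.dat"
                or die "Cannot open $server.dat: $!";
        }

        # Write every record through the already-open handle;
        # no file is opened or closed inside this loop.
        for my $line (@DATA) {
            my ($server) = split ' ', $line;
            next unless exists $fh{$server};
            print { $fh{$server} } "$line\n";
        }

        # Close everything once, after the loop.
        for my $server (keys %fh) {
            close $fh{$server} or warn "Error closing $server.dat: $!";
        }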

    ~Particle *accelerates*
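
    Since benchmarking is suggested above, here is a rough sketch using the core Benchmark module to compare re-opening the file for every record against writing through a handle that stays open. The record count and iteration count are made up; adjust them to match the real workload:

        #!/usr/bin/perl
        use strict;
        use warnings;
        use Benchmark qw(cmpthese);
        use File::Temp qw(tempdir);

        my $dir  = tempdir( CLEANUP => 1 );
        my @data = map { "web01 $_ 42\n" } 1 .. 1_000;

        cmpthese( 20, {
            # Re-open and close the file for every record.
            reopen_per_record => sub {
                for my $line (@data) {
                    open my $fh, '>>', "$dir/reopen.dat" or die $!;
                    print {$fh} $line;
                    close $fh;
                }
            },
            # Open once, write everything, close once.
            open_once => sub {
                open my $fh, '>>', "$dir/once.dat" or die $!;
                print {$fh} $_ for @data;
                close $fh;
            },
        } );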
