 
PerlMonks  

Re: A very odd happening (at least. . . to me)

by maverick (Curate)
on Jun 24, 2002 at 16:09 UTC ( #176843=note )


in reply to A very odd happening (at least. . . to me)
in thread Processing large files many times over

From a quick glance at the code, one of the first questions that comes to mind is "how many files are in these directories?" I suspect that part of the source of your slowness is that you read both the entire list of files and the entire contents of each file into memory. If you alter your reading structure like so:
    opendir(DIR, "$base_dir\\$dir") or die "$dir failed to open: $!";
    while (my $file = readdir(DIR)) {
        next unless $file =~ /\.txt$/;
        # etc, etc.
        open(IN, "$full_name") || die "can't open $!";
        while (my $line = <IN>) {
            # processing
        }
        close(IN);
    }
    closedir(DIR);
you won't have the overhead of all the memory allocation. (Note it's opendir/readdir/closedir for directory handles, not open.) In your second example there's a system call to a secondary perl script. That's going to be time-consuming too. Consider making the second perl program a subroutine...that will avoid a fork, exec, and compile for every file you have.
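As a rough sketch of that last point (the sub and script names here are made up, since the second script isn't shown in the thread), the per-file system call becomes a plain sub call:

    # Instead of:
    #     system("perl second_script.pl $full_name");  # fork + exec + compile each time
    # pull the second script's body into a sub and call it directly:

    sub process_one {
        my ($file) = @_;
        my $count = 0;
        open(IN, $file) || die "can't open $file: $!";
        while (my $line = <IN>) {
            $count++;    # stand-in for whatever the second script does per line
        }
        close(IN);
        return $count;
    }

    # plain sub call -- no new process, no recompile, once per file
    my $lines = process_one($0);    # demo: run it on this very script

The sub compiles once when your main script does, so the only remaining per-file cost is the I/O itself.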

HTH

/\/\averick
OmG! They killed tilly! You *bleep*!!
