 
PerlMonks  

Re: Finding Files and Processing them iteratively

by rupesh (Hermit)
on Feb 25, 2005 at 07:40 UTC ( [id://434437] )



in reply to Finding Files and Processing them iteratively


You can also use File::Find (a core module) or File::Recurse (from the File-Tools distribution on CPAN).
#!c:\perl\bin\perl.exe
use strict;
use warnings;
use File::Recurse;    # Recurse() comes from the File-Tools distribution

my (@filearr, %files, %all);

# Gather the files under $path whose names match $matchpattern into
# the cumulative hash %all (directory => reference to array of names).
sub recurse {
    my ($path, $matchpattern) = @_;
    %files = Recurse([$path], { match => $matchpattern, nomatch => '' });
    if (scalar keys %all) {
        @all{ keys %files } = values %files;
    }
    else {
        %all = %files;
    }
}

recurse('c:\data', '\.');    # single-quoted '\.' so the pattern matches a literal dot

# Turn the directory => files hash into a flat list of full path names.
foreach my $dir (sort keys %all) {
    foreach my $file (@{ $all{$dir} }) {
        push @filearr, "$dir\\$file";
    }
}

# Total up the (numeric) lines of each file.
foreach my $machine (@filearr) {
    open my $fh, '<', $machine or do { warn "Cannot open $machine: $!"; next };
    my $ctr = 0;
    while (my $line = <$fh>) {
        chomp $line;
        $ctr += $line;
    }
    close $fh;
    $all{$machine} = $ctr;
}
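For comparison, the directory walk above could also be done with the core module File::Find. This is only a sketch: the start directory is carried over from the code above, and collect_files is a hypothetical helper name, not part of File::Find.

```perl
use strict;
use warnings;
use File::Find;

# Collect the full paths of all plain files under $start.
# (collect_files is a hypothetical helper, not a File::Find API.)
sub collect_files {
    my ($start) = @_;
    my @found;
    # find() calls the anonymous "wanted" sub for every entry under
    # $start; $File::Find::name holds the current entry's full path.
    find(sub { push @found, $File::Find::name if -f $_ }, $start);
    return @found;
}

my $start = 'c:\data';    # assumption: same start directory as above
my @filearr = -d $start ? collect_files($start) : ();
print "$_\n" for @filearr;
```

Unlike Recurse(), which hands back a directory-to-files hash you then have to flatten, find() gives you each full path as it walks, so the flattening loop disappears.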


Cheers,
Rupesh.

Replies are listed 'Best First'.
Re^2: Finding Files and Processing them iteratively
by satchm0h (Beadle) on Feb 25, 2005 at 08:24 UTC
    If you decide to go with File::Find and are Unix find savvy, check out the handy find2perl script. Write a find command line that does what you want and then replace 'find' with 'find2perl' and it will generate the code for you.
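A worked example of that recipe (the /var/log path and *.log pattern here are assumptions for illustration; find2perl shipped alongside Perl at the time):

```shell
# 1. Write a find command line that selects what you want:
find /var/log -name '*.log' -type f

# 2. Replace 'find' with 'find2perl' and redirect the generated
#    File::Find-based Perl code to a file:
find2perl /var/log -name '*.log' -type f > findlogs.pl

# 3. Run (or adapt) the generated script:
perl findlogs.pl
```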

    Although for your specific task, File::Find may be overkill. The various glob solutions others have posted should serve you well.
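For the non-recursive case, a glob call along those lines can be this short (the directory and the .txt extension are assumptions for illustration):

```perl
use strict;
use warnings;

# List every file ending in .txt directly under c:/data (no recursion).
# glob() accepts forward slashes on Windows as well.
my @files = glob('c:/data/*.txt');
print "$_\n" for @files;
```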
