Getting a List of Files Via Glob

by Zoogie (Curate)
on May 28, 2000 at 05:25 UTC ( #15179=perlquestion )
Zoogie has asked for the wisdom of the Perl Monks concerning the following question:

I need to get a list of files that fit a set of filename globs input by the user. The problem is, the files may be in a different directory. For example, if the user enters:

  $area = "public/test/";
  @files = ("*.txt", "_*");
The function should return something like:
  readme.txt, files.txt, _index, _sortopts, ... etc.
I'm trying to use the glob function, but if you specify another directory, that directory name is prepended to every filename. Here's my current function:

@file_list =
    map { /^$area(.*)/; $1 }
    glob(join(' ',map { $area.$_ } @files));
It works as expected, but it seems like a lot more work than necessary. Is there an easier / clearer / more efficient way to do this?
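
One lighter-weight alternative (a rough sketch, assuming the same $area and @files as above) is to let the standard File::Basename module strip the directory part from each match instead of capturing it with a regexp:

  use File::Basename;

  my $area  = "public/test/";
  my @files = ("*.txt", "_*");

  # Expand each pattern inside $area, then keep only the bare filenames.
  my @file_list = map { basename($_) }
                  map { glob($area . $_) } @files;

Calling glob() once per pattern also sidesteps the join(' ', ...) step, which would break if $area ever contained a space, since glob() splits its argument on whitespace.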

Re: Getting a List of Files Via Glob
by merlyn (Sage) on May 28, 2000 at 10:20 UTC
      I considered this, but I'd have to return to the original working directory at the end of the sub, since other subs assume that the working dir hasn't changed.

      I suppose I could save the original working dir with
      $prevdir = `pwd`;
      But executing a shell command doesn't seem like it would be much more efficient than mapping the directory name onto each glob and then chopping it off...

        Use Cwd. It's a standard module.
        use Cwd;
        my $dir = cwd();
        chdir $otherdir;
        # do something
        chdir $dir;
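
        Applied to the original question, a rough sketch (assuming the same $area and @files, and that the patterns only match plain files) could be:

        use Cwd;

        my $olddir = cwd();                        # remember where we started
        chdir $area or die "Can't chdir to '$area': $!";
        my @file_list = map { glob($_) } @files;   # bare names, nothing to strip
        chdir $olddir or die "Can't chdir back to '$olddir': $!";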
I don't use glob, I use readdir
by Corion (Pope) on May 29, 2000 at 13:52 UTC
    As most of my scripts have to run under Win32, I don't use glob :) (Note: glob() is also available under Win32, but another process must be started to read the directory in, which is not really elegant IMO). I use the following combo:
    opendir( DIR, $area ) or die "Can't read '$area' : $!";
    @files = readdir( DIR ) or die "Error reading from '$area' : $!";
    closedir( DIR );  # Don't care about errors here
    If you simply want to process every file in a directory, instead of @files = readdir(DIR) you could use
    while ( $file = readdir(DIR) ) {
        # ... do your stuff to $file ...
    }
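
    In practice you usually want to skip the "." and ".." entries and prepend the directory before testing anything; a rough, untested sketch (assuming the same $area as above):

    opendir( DIR, $area ) or die "Can't read '$area' : $!";
    while ( defined( my $file = readdir(DIR) ) ) {
        next if $file eq '.' or $file eq '..';   # skip the directory entries
        my $path = "$area/$file";                # bare name -> full path
        # ... do your stuff to $path ...
    }
    closedir( DIR );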

    The drawback of readdir() compared to glob() is, of course, that you have to descend through the directories yourself if you want recursive processing. On the other hand, I prefer programs that give the user some feedback (like printing a dot for each directory traversed, as appropriate) ...
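
    For recursive processing without writing the descent yourself, the standard File::Find module is another option; a rough, untested sketch that prints a dot for each directory visited and collects the *.txt files under a starting directory of your choice:

    use File::Find;

    my @txt_files;
    find(
        sub {
            print '.' if -d $_;          # one dot of feedback per directory
            push @txt_files, $File::Find::name
                if -f $_ && /\.txt$/;
        },
        '/where/to/start'
    );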
      I often use readdir in a recursive mode, like :
      &makeTree( 0, '/where/to/start' );

      sub makeTree {
          my ($level, $Dir) = @_;
          # Get all directories
          opendir( DIR, $Dir ) || die "$!";
          my @Dirs = grep { /^[^.].*/ && -d "$Dir/$_" } readdir(DIR);
          closedir( DIR );
          # Read files and do stuff or whatever...
          # Call self
          foreach my $currDir (@Dirs) {
              &makeTree( $level + 1, join( '/', $Dir, $currDir ) );
          }
      }


      /brother t0mas
        In your code, you don't need to re-read the directory to get at the files in it - although it would be difficult to "compute" the difference of the arrays elegantly (well, difficult for me at least). I mostly use the following way, intermixing files and directories:
        # Untested code - use at your own risk
        sub handledirectory {
            my ($directory) = @_;
            my ($entry, @direntries);
            opendir( DIR, $directory ) or die "Can't read $directory : $!\n";
            @direntries = readdir( DIR ) or die "Error reading $directory : $!\n";
            closedir DIR;
            foreach $entry ( @direntries ) {
                # File::Spec gives us cross-platform path utilities
                # and comes with every Perl standard distribution
                require File::Spec;
                my $fullpath;
                # skip current and parent directory entries
                next if $entry =~ /^\.\.?$/;
                $fullpath = File::Spec->catfile( $directory, $entry );
                if ( -d $fullpath ) {
                    &handledirectory( $fullpath );
                } elsif ( -f $fullpath ) {
                    # This second call to stat() (implicit in the "-f")
                    # could be done away with by using some other short
                    # variable that does caching, but that would maybe
                    # confuse the readers ...
                    # ... do stuff ...
                } else {
                    # something strange ...
                }
            }
        }
      Hurm... is there a way to make readdir() work with specifications like "*.txt" or "dir??.log"?

        readdir() will always read in a complete directory; there is no way around that. And as far as I know, there is no function in Perl to hint to the operating system that you are only interested in a certain subset of the files either, as Perl comes from a UNIX background and UNIX does not have the notion of wildcards in the file system. Wildcard expansion is always done by the shell under UNIX.

        To accomplish what you want, you could do (untested!) the following:

        opendir DIR, $directory or die "Couldn't open $directory : $!\n";
        @files = readdir( DIR ) or die "Couldn't read from $directory : $!\n";
        closedir( DIR );
        @files = grep { /\.txt$/ && -f "$directory/$_" } @files;
        (I hope that this test in grep() works the way I want it to. The RE part checks whether the name ends with ".txt", and the -f part checks whether the name corresponds to a file (and not a directory).) Another solution for you could be to let the user specify all files (in UNIX style) on the command line; there is even a module called GlobArgv, which does wildcard expansion for you automagically (under Win32).
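
        To handle arbitrary user-supplied patterns like "dir??.log", one approach (again untested, and not tied to any module) is to translate the wildcard into a regular expression yourself and then filter the readdir() output with it:

        # Translate a simple shell wildcard into a regex:
        #   *  matches any run of characters, ? matches a single character.
        sub wildcard_to_regex {
            my ($pattern) = @_;
            $pattern = quotemeta($pattern);   # escape everything first
            $pattern =~ s/\\\*/.*/g;          # \*  ->  .*
            $pattern =~ s/\\\?/./g;           # \?  ->  .
            return qr/^$pattern$/;
        }

        opendir DIR, $directory or die "Couldn't open $directory : $!\n";
        my $re    = wildcard_to_regex('dir??.log');
        my @found = grep { /$re/ && -f "$directory/$_" } readdir(DIR);
        closedir DIR;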

Re: Getting a List of Files Via Glob
by BBQ (Deacon) on May 29, 2000 at 19:32 UTC
    Does this mean that glob will now return . and .. too? (I haven't upgraded yet)

    #!/home/bbq/bin/perl
    # Trust no1!
