PerlMonks  

Re: How to open files found via File::Find

by gam3 (Curate)
on Jan 08, 2010 at 00:33 UTC ( [id://816195] )


in reply to How to open files found via File::Find

You are using single quotes in your open statements. Try
open INFILE2, '<', $localfile ...
and you should have better luck. You also need two backslashes in a double-quoted string, or use $directory . '\\' . $name.
use strict;
use warnings;
use Cwd;
use File::Find;

my $startdirectory = cwd;
print "Starting directory: $startdirectory\n";

find( \&inFiles, ('.') );
exit;

sub inFiles {
    chomp;
    my $name      = $_;
    my $directory = cwd;
    if ( $name =~ /\.dat$/ ) {
        print "dir = $directory\tname = $name\n";

        # Use 'or', not '||': '||' binds more tightly than the comma,
        # so 'open ... || warn' never reaches the warn on failure.
        open INFILE, '<', "$directory/$name"
            or warn("Cannot open input file $name\n");
        my @infilecontent = <INFILE>;
        print scalar @infilecontent;    # number of lines read

        my $localfile = "$directory\\$name";    # win32
        open INFILE2, '<', $localfile
            or warn("Cannot open input file $localfile\n");
        @infilecontent = <INFILE2>;
        print @infilecontent;
    }
}    # inFiles
-- gam3
A picture is worth a thousand words, but takes 200K.

Replies are listed 'Best First'.
Re^2: How to open files found via File::Find
by roboticus (Chancellor) on Jan 08, 2010 at 14:07 UTC

    For better portability, I'd suggest using "/" rather than "\\" as a directory separator.

    ...roboticus

      Or even better use File::Spec->catfile.
      -- gam3
      A picture is worth a thousand words, but takes 200K.
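A minimal sketch of the File::Spec approach suggested above (the directory and file names are illustrative): catfile joins path components with the separator appropriate for the current OS, so no hand-built "\\" or "/" strings are needed.

```perl
use strict;
use warnings;
use File::Spec;

# Joins components portably: '\' on Win32, '/' on Unix-like systems.
my $path = File::Spec->catfile('some_dir', 'data.dat');
print "$path\n";
```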
