Re: Counting open files?
by BrowserUk (Patriarch) on Jun 22, 2005 at 10:36 UTC
use POSIX ();
printf "Attempting to close fd %d : %s\n",
    $_, POSIX::close( $_ ) ? 'success' : 'failure'
    for 3 .. 255;
Pity I can't do close($_) for 1 .. 255; that would be portable.
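One near-portable workaround (a sketch, not from the thread): Perl's close wants a handle rather than a number, but open's '<&=' mode adopts an existing descriptor into a handle, which you can then close normally. Whether adopting a write-only descriptor with '<&=' succeeds is system-dependent, so treat this as best-effort.

```perl
# Sketch: adopt raw descriptors into Perl handles, then close them normally.
# '<&=' fdopen()s the descriptor rather than dup()ing it, so closing the
# handle closes the underlying descriptor as well.
for my $fd ( 3 .. 255 ) {
    open my $fh, '<&=', $fd or next;   # skip fds that aren't open (or aren't readable)
    close $fh;
}
```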
Re: Counting open files?
by Zaxo (Archbishop) on Jun 22, 2005 at 10:59 UTC
my $cpid = open my $lsof, '-|',
'/usr/sbin/lsof', '-g', $$, @other_opts
or die $!;
Without other options to narrow the list you'll get lots of irrelevant listings, including shared C libraries, the perl executable, and the terminal you're on.
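For example (a sketch; assumes an lsof binary on the PATH): -p selects this process, -a ANDs the conditions, and -d with a descriptor range keeps only the numeric fds, dropping the cwd/txt/mem entries:

```perl
# Sketch: count only this process's numeric file descriptors via lsof.
# Assumes lsof is installed; -a ANDs -p (this pid) with -d (fd range).
open my $lsof, '-|', 'lsof', '-p', $$, '-a', '-d', '0-255'
    or die "can't run lsof: $!";
my @lines = <$lsof>;
close $lsof;
shift @lines if @lines;    # drop the COMMAND/PID/FD header line
printf "%d descriptors open\n", scalar @lines;
```

Note that the pipe to lsof itself counts as one of the open descriptors while it runs.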
Re: Counting open files?
by gellyfish (Monsignor) on Jun 22, 2005 at 11:38 UTC
#!/usr/bin/perl -w

opendir FD, '/proc/self/fd' or die "Can't open FDs - $!\n";

# note: this listing includes the descriptor that opendir itself is using
my @FDS = grep !/^\.{1,2}$/, readdir FD;

print "@FDS\n";
Of course, as someone else said, you might want to make a judgement as to whether you want to close your STDIO handles :-)
/J\
Re: Counting open files?
by rev_1318 (Chaplain) on Jun 22, 2005 at 10:36 UTC
If you open a file, you can (and should!) close it. Where's the mystery in this? You know which files are open; you opened them, didn't you?
Hmmm ... what version of perl and what OS are you using?
The normal behaviour for exec is to close all file descriptors from $^F + 1 up to however many you have open, because perl sets the close-on-exec flag on them. I can envision only a few ways for this error to happen:
- your OS does not support close-on-exec
- the code has gone out of its way to clear the close-on-exec flag
- $^F was munged ($SYSTEM_FD_MAX - normally set to 2, because you normally want stdin, stdout and stderr passed across exec)
- the code is re-opening stdin, stdout and stderr without first closing them
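A quick way to see this from perl itself (a sketch using only core modules): any handle whose descriptor number exceeds $^F gets the close-on-exec flag at open time, and fcntl can inspect it:

```perl
use Fcntl qw(F_GETFD FD_CLOEXEC);

print "\$^F (SYSTEM_FD_MAX) = $^F\n";     # normally 2
open my $fh, '<', $0 or die "open: $!";   # with stdio open, fileno will be > 2
my $flags = fcntl( $fh, F_GETFD, 0 );
print fileno( $fh ), ( $flags & FD_CLOEXEC )
    ? " has close-on-exec set\n"
    : " will survive exec\n";
```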
Re: Counting open files?
by anonymized user 468275 (Curate) on Jun 22, 2005 at 13:36 UTC
Exceeding the limit on open files is an error, but most likely the real issue is managing them with the right kind of structure.
The way I usually do that is to use a stack, just like in the old PDP-11 days. This ensures that files are opened and closed in a disciplined way, so that you can catch any abuse at the right point and may even prevent the problem before it happens, just because you organised your recursive code better.
By the way, the hash-style subroutine parameters can be replaced with whatever you normally do; they are only there for clarity. If you do want to try that out too, don't forget to enclose the parameter array (@_) in brackets when reloading it into a hash, or it won't work. The point at which you can check the number of open files is marked in subroutine "Open".
my %controlCentre = ();
$controlCentre{ FH } = [];
$controlCentre{ DH } = [];

Traverse( tree    => 'whatever-path',
          control => \%controlCentre );

sub Traverse {
    my %par = (@_);
    my $dh = OpenDir( dir     => $par{ tree },
                      control => $par{ control } );
    foreach my $file ( grep !/^\./, readdir $dh ) {
        my $path = "$par{tree}/$file";
        if ( -d $path ) {
            Traverse( tree    => $path,
                      control => $par{ control } );
        }
        else {
            ProcessFile( name    => $path,
                         control => $par{ control } );
        }
    }
    CloseDir( dh => $dh, control => $par{ control } );
}

sub ProcessFile {
    my %par = (@_);
    my $fh = Open( cmd     => "<$par{name}",
                   control => $par{ control } );
    # ...
    # process <$fh> here with any recursive calls
    # to ProcessFile that may be required
    # ...
    Close( fh      => $fh,
           control => $par{ control } );
}

sub Open {
    my %par = (@_);
    open my $fh, $par{ cmd } or die "Can't open $par{cmd}: $!";
    my $aref = $par{ control }->{ FH };
    push @$aref, $fh;
    # N.B. $#$aref is now one less than
    # the count of open files
    return $fh;
}

sub Close {
    my %par = (@_);
    close $par{ fh };
    my $aref = $par{ control }->{ FH };
    my $fh   = pop @$aref;
    ( $fh eq $par{ fh } )
        or die "fh stack was abused: "
             . "$fh does not match $par{ fh }";
}

sub OpenDir {
    my %par = (@_);
    opendir my $dh, $par{ dir } or die "Can't opendir $par{dir}: $!";
    my $aref = $par{ control }->{ DH };
    push @$aref, $dh;
    return $dh;
}

sub CloseDir {
    my %par = (@_);
    closedir $par{ dh };
    my $aref = $par{ control }->{ DH };
    my $dh   = pop @$aref;
    ( $dh eq $par{ dh } )
        or die "dh stack was abused: "
             . "$dh does not match $par{ dh }";
}
-S
Re: Counting open files?
by ambrus (Abbot) on Jun 22, 2005 at 12:59 UTC
You could try lowering the resource limit on open files at startup with ulimit or [gs]etrlimit (pity I don't know the perl interface for these), then, just before you reexec, raise the limit back to the original value. It'd probably be enough to lower the limit by a little, as you probably need only a few file descriptors to reexec the program.
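For the record, a perl interface does exist: the CPAN module BSD::Resource wraps getrlimit/setrlimit. A sketch, assuming that module is installed and that the chosen low limit of 64 is above whatever the program already has open:

```perl
# Sketch: shrink RLIMIT_NOFILE at startup so a descriptor leak fails fast,
# then restore the original limit just before re-exec'ing.
# Requires the CPAN module BSD::Resource.
use BSD::Resource qw(getrlimit setrlimit RLIMIT_NOFILE);

my ( $soft, $hard ) = getrlimit(RLIMIT_NOFILE);
setrlimit( RLIMIT_NOFILE, 64, $hard ) or die "setrlimit: $!";

# ... normal operation: open() now fails once 64 descriptors are in use ...

setrlimit( RLIMIT_NOFILE, $soft, $hard ) or die "setrlimit: $!";
# exec $^X, $0, @ARGV;   # re-exec with the original limit restored
```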
Re: Counting open files?
by robot_tourist (Hermit) on Jun 22, 2005 at 12:26 UTC
How about something outside the script to save the file handles of all the files opened? Although I don't know whether that would become locked. Best to close them when you're done with them, methinks; plus use some exception-handling type stuff so that they get closed before your script dies.
How can you feel when you're made of steel? I am made of steel. I am the Robot Tourist. Robot Tourist, by Ten Benson
Generally, if you're not afraid of 'outside', you can use the proc fs popular in many unices:
opendir my $d, "/proc/$$/fd" or die $!;
while ( defined( my $fd = readdir $d ) ) {
    next if $fd =~ /^\.\.?$/;
    print "I've got this descriptor opened: $fd\n";
}
closedir $d;
and then you could use BrowserUk's POSIX::close to close it. But this is not very portable: it works on linux with procfs mounted, and as soon as you move to win32 you're in trouble. I wanted something working from inside.
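Putting the two together (a sketch for linux with procfs mounted): enumerate /proc/self/fd, then hand everything above stderr to POSIX::close. The listing includes the fd the directory handle itself uses, but closedir disposes of that one before the loop runs, so the extra POSIX::close on it just fails harmlessly.

```perl
use POSIX ();

# Sketch: close every descriptor above stderr, driven by procfs (linux only).
opendir my $d, '/proc/self/fd' or die "no procfs? $!";
my @fds = grep { /^\d+$/ } readdir $d;   # numeric entries; includes $d's own fd
closedir $d;                             # that one is gone now

for my $fd ( grep { $_ > 2 } @fds ) {
    POSIX::close($fd);                   # fails harmlessly on $d's old fd
}
```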