http://www.perlmonks.org?node_id=1000458

rovf has asked for the wisdom of the Perl Monks concerning the following question:

Part of our application, which runs in a mixed Windows / Linux / Solaris environment, is a small Perl program which at some point executes:
    if (-e $rule_file) {
        if (open(my $rule_fh, '<', $rule_file)) {
            ....
        }
        else {
            die "Can not read file $rule_file : $!";
        }
    }
The variable $rule_file contains one of two values, depending on whether the script runs on Linux/Solaris or on Windows, but in both cases it points (as an absolute path) to the same file. On Linux and Solaris, the file is mounted via NFS. On Windows, the file is accessed via CIFS, and $rule_file is a UNC path.
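For illustration only (the server name, share name and file name are made up), the per-platform selection looks roughly like this:

    use strict;
    use warnings;

    # Hypothetical sketch; the real paths and selection logic differ.
    my $rule_file = $^O eq 'MSWin32'
        ? '\\\\fileserver\\share\\rules\\rules.cfg'   # UNC path, accessed via CIFS on Windows
        : '/net/fileserver/rules/rules.cfg';          # NFS mount on Linux/Solaris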

This small program is executed frequently. It might well be that at a certain point in time, a dozen processes on various Linux, Solaris and Windows machines execute the script and try to open the file. Since the file is only opened for reading, I believe that this should not be a problem.

From our logs, I see however that the open occasionally fails with

Can not read file ....... : Invalid argument

This happens rarely (maybe once in 1000-2000 executions), and when it happens, it is always on Windows, never on Linux or Solaris. It looks very much like a concurrency issue, but the wording of the error message, "invalid argument", surprises me.
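If it helps the diagnosis, I intend to log the numeric errno and Perl's $^E (which on Windows carries the native OS error) the next time it fails; a rough sketch, not the exact production code:

    # Sketch: report the numeric errno and the native Windows error alongside $!
    if (open(my $rule_fh, '<', $rule_file)) {
        # ... read and process the rules ...
        close $rule_fh;
    }
    else {
        my $errno = 0 + $!;   # numeric errno; EINVAL is typically 22
        die "Can not read file $rule_file : $! (errno=$errno, native error: $^E)";
    }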

Any idea what could be the reason? Maybe some Windows or CIFS quirk, when many processes try to open a file at the same time?
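As a stopgap I could of course retry the open a few times before giving up (a minimal sketch below, using Time::HiRes for a short delay), but I would rather understand the actual cause first.

    use Time::HiRes qw(usleep);

    # Possible retry wrapper, not the real code
    my $rule_fh;
    for my $attempt (1 .. 5) {
        last if open($rule_fh, '<', $rule_file);
        warn "open attempt $attempt for $rule_file failed: $!";
        usleep(200_000);   # wait 200 ms before trying again
    }
    die "Can not read file $rule_file : $!" unless $rule_fh;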

-- 
Ronald Fischer <ynnor@mm.st>