sandy1028 has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I am using the code below. A log file is created and all of the processes write to the same log file, but some of the lines get merged together. How can this be avoided? I cannot use lock($log), and if I add sleep(1) the performance becomes very slow.
    my $pm = Parallel::ForkManager->new($tc + 1);
    $pm->run_on_finish( sub {
        my ($pid, $exit_code, $ident) = @_;
        $tmpFiles[$ident] = undef;
    } );
    foreach my $i (0 .. $#tmp) {
        # Forks and returns the pid for the child:
        my $pid = $pm->start($i) and next;
        $SIG{INT} = 'DEFAULT';
        my $filename = $tmp[$i]->filename();
        my $file = IO::File->new("<$filename") or die "Can't open $filename\n";
        while (defined(my $line = $file->getline())) {
            chomp $line;
            my ($dir, $fname) = split(/\t/, $line);   # renamed from $file to avoid shadowing the filehandle
            # my $process = shift; is created above
            # Calling this from another file
            $process->($dir, $fname, $config, $log);
        }
        $pm->finish;    # Terminates the child process
    }
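One common way to keep lines from interleaving without lock($log) is to take an exclusive flock on the log filehandle around each write. A minimal sketch, assuming the log is opened in append mode in each child; log_line is a hypothetical helper name, not part of the code above:

```perl
use strict;
use warnings;
use Fcntl qw(:flock);
use IO::Handle;    # for ->flush

# Serialise writes from many processes to one log file.
# Each write takes an exclusive lock, seeks to end of file,
# prints, flushes, and unlocks, so a line is never split up
# by output from another process.
sub log_line {
    my ($log_fh, $msg) = @_;
    flock($log_fh, LOCK_EX) or die "flock: $!";
    seek($log_fh, 0, 2);       # re-seek to EOF in case others appended
    print {$log_fh} "$msg\n";
    $log_fh->flush;            # push the line out before releasing the lock
    flock($log_fh, LOCK_UN);
}
```

Because the lock is held only for the duration of one print, contention stays low and there is no need for sleep(1).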

Replies are listed 'Best First'.
Re: logging of process
by Anonymous Monk on Apr 28, 2009 at 07:08 UTC
Re: logging of process
by apl (Monsignor) on Apr 28, 2009 at 09:21 UTC
      I am using Linux.
      I have to create 5 processes, and there are 100 child processes in total; all of them should write to the log file in a synchronized way.
      How can I synchronize writing to the log file?
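      If locking each write is not acceptable, another approach (often suggested for Parallel::ForkManager setups like the one above) is to avoid sharing the file at all: each child writes to its own private file, and the parent concatenates them after all children have exited, so lines can never interleave. A sketch using only core fork/waitpid; the function and file names are hypothetical:

```perl
use strict;
use warnings;

# Run $njobs children; each gets a private log file, and the parent
# merges them into $final_log after waiting for all of them.
# $worker->($job_id, $log_fh) is the per-child work callback.
sub run_children_with_private_logs {
    my ($final_log, $njobs, $worker) = @_;
    my @pids;
    for my $i (1 .. $njobs) {
        my $pid = fork();
        die "fork failed: $!" unless defined $pid;
        if ($pid == 0) {                         # child: own log file, keyed by pid
            open my $fh, '>', "$final_log.$$" or die "open: $!";
            $worker->($i, $fh);
            close $fh;
            exit 0;
        }
        push @pids, $pid;
    }
    waitpid($_, 0) for @pids;                    # parent: wait, then merge
    open my $out, '>>', $final_log or die "open: $!";
    for my $part (glob "$final_log.*") {
        open my $in, '<', $part or die "open: $!";
        print {$out} $_ while <$in>;
        close $in;
        unlink $part;
    }
    close $out;
}
```

      The trade-off is that log lines are grouped per child rather than globally ordered by time; if chronological order matters, prefix each line with a timestamp and sort the merged file.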
Re: logging of process
by BrowserUk (Patriarch) on Apr 28, 2009 at 07:07 UTC