Re: Reading a file live!
by Corion (Patriarch) on Mar 07, 2014 at 08:39 UTC
How about just using File::Tail (which works under Unixish-enough OSes), or spawning a pipe from tail -f?
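A minimal sketch of both suggestions. The file name, the sentinel line, and the child writer are invented for the demo; File::Tail is a CPAN module (not core), so it appears only in a comment, while the pipe-from-`tail -f` route below needs no extra modules (Unixish only):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# File::Tail sketch (assumes the CPAN module is installed):
#
#   use File::Tail;
#   my $tail = File::Tail->new(name => "file.txt", maxinterval => 2);
#   while (defined(my $line = $tail->read)) { ... }
#
my $log = "demo_tail.txt";

open my $seed, '>', $log or die "Cannot create $log: $!";
print $seed "first line\n";
close $seed;

my $pid = fork();
die "fork failed: $!" unless defined $pid;
if ($pid == 0) {    # child: appends later, like the secondary script
    sleep 1;
    open my $out, '>>', $log or die $!;
    select((select($out), $| = 1)[0]);    # unbuffer the append handle
    print $out "second line\n";
    print $out "__DONE__\n";              # sentinel so the demo can stop
    close $out;
    exit 0;
}

# parent: read the file "live" through a pipe from tail -f
my $tailpid = open(my $fh, '-|', 'tail', '-n', '+1', '-f', $log)
    or die "Cannot spawn tail: $!";
my @lines;
while (my $line = <$fh>) {
    chomp $line;
    last if $line eq '__DONE__';
    push @lines, $line;    # handle each newly appended line here
}
kill 'TERM', $tailpid;     # tail -f never exits on its own
close $fh;
waitpid $pid, 0;
unlink $log;
print "$_\n" for @lines;
```

Reading from the pipe still blocks while no new data arrives, which is the usual trade-off of the `tail -f` approach.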
Hi Corion,
Thank you for the suggestion.
I know that with File::Tail it should work.
But I really want to make my life harder and try something else :).
Re: Reading a file live! (poll)
by Anonymous Monk on Mar 07, 2014 at 08:00 UTC
How can I do this without blocking my master script? See Proc::Background and Poll :). Non-blocking reading like this is called polling.
for example
use Proc::Background;
my $proc1 = Proc::Background->new($command, $arg1, $arg2);
my $oldsize = 0;
while( 1 ){
    sleep 1;
    my $newsize = -s $fhorfilename;
    # if( ( $newsize - $oldsize ) > $minimum ){
    if( ( $newsize - $oldsize ) > $buffersize ){
        my $bytes_read = read $fh, $buffer, $buffersize;
        doStuff( $bytes_read, $fh, $buffer, $buffersize );
        $oldsize = $newsize;
    }
    ...
    last if not $proc1->alive;
}
Hello,
Thank you for the suggestions.
But do you know another way to do it, without using these modules (Proc and Schedule)?
Thank you
Of course, you don't have to use Proc::Background, it's just a nicety and saves you dealing with all the low-level wrangling involved with using fork in a robust fashion. But by all means feel free to use fork natively if you would rather. TIMTOWTDI, after all.
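A hedged sketch of the "use fork natively" route: the child stands in for the secondary script and appends lines to a file, while the parent polls the file size with `-s` and reads only the new bytes. The file name and timings are invented for the demo:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(WNOHANG);

my $file = "demo_fork.txt";

open my $init, '>', $file or die "Cannot create $file: $!";
close $init;

my $pid = fork();
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {    # child: plays the role of the secondary script
    for my $i (1 .. 3) {
        open my $out, '>>', $file or die $!;
        print $out "line $i\n";
        close $out;    # closing flushes the line to disk
        sleep 1;
    }
    exit 0;
}

# parent: poll until the child exits, picking up new lines as they appear
my $offset = 0;
my @seen;
while (1) {
    my $done = waitpid($pid, WNOHANG) == $pid;
    my $size = (-s $file) || 0;
    if ($size > $offset) {
        open my $fh, '<', $file or die $!;
        seek $fh, $offset, 0 or die "seek: $!";    # 0 = SEEK_SET
        while (my $line = <$fh>) {
            chomp $line;
            push @seen, $line;    # "use" each new line here
        }
        $offset = tell $fh;
        close $fh;
    }
    last if $done;    # the read above already caught the tail end
    sleep 1;
}

unlink $file;
print "$_\n" for @seen;
```

Checking `waitpid` with WNOHANG before the final read guarantees the parent never misses bytes written just before the child exits.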
But do you know another way to do it, without using these modules (Proc and Schedule)?
Why? Maybe I'll have some ideas :)
But this is just about the lowest common denominator of portability and ease :)
Re: Reading a file live!
by Laurent_R (Canon) on Mar 07, 2014 at 07:48 UTC
In principle, it should not necessarily be blocking your main script, but this depends on many things: what your OS is, how your secondary script is writing into the file, whether it closes the file when it is done, whether your secondary script is permanent or temporary, whether you are waiting for any specific output in the file... too many things that we don't know.
I also wonder whether this is the best way of doing things. If you want to read the output of the secondary script in your main script, why do you need a secondary script at all? I am not saying it is a bad idea, I don't know enough, I am just questioning whether it is a good idea. Please tell us more about what you are trying to achieve.
I will try to explain better.
#main script
...
... # I am doing something
# here I call my other script and redirect its output into file.txt
system ("perl eck.pl $var > file.txt &");
My eck.pl script will run for some time, but in my master script I always need to use the newly added lines from the txt file. I need to read that file continuously. I can do that, but the problem is that it blocks my main script, so I can't use the new lines in my master script. In a way, I am trying to read that file "live", get the lines, and use them inside my master script.
Hope you understand what I am trying to say.
Re: Reading a file live!
by golux (Chaplain) on Mar 07, 2014 at 19:38 UTC
Hi negativ_m,
My suggestion would be to store the length of the file in a variable prior to the first append. Then just check the filesize; when it grows, you know you've got more text to read:
my $len = (-s "file.txt") || 0;
system ("perl eck.pl $var > file.txt &");

# ... later ....
my $newlen = (-s "file.txt") || 0;
if ($newlen > $len) {
    my @newlines = read_new_lines("file.txt", $len);
    $len = $newlen;    # Adjust length to new end of file
}
The subroutine read_new_lines() would look something like this:
sub read_new_lines {
    my ($fn, $offset) = @_;
    use IO::File;
    my $fh = IO::File->new($fn)
        or die "Failed to read '$fn' ($!)\n";
    seek($fh, $offset, 0) or die "Failed seek ($!)\n";    # 0 = SEEK_SET (absolute offset)
    chomp(my @lines = <$fh>);
    return @lines;
}
I use this trick a lot for webpages which need to monitor continuously updating logfiles.
say
substr+lc crypt(qw $i3 SI$),4,5
Re: Reading a file live!
by RedElk (Hermit) on Mar 07, 2014 at 15:46 UTC
Seems like you could define a subroutine that reads the file in question (and does whatever processing is required). Then strategically place calls to that subroutine within your master script. But then again, I may need more coffee.
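A minimal sketch of that idea; the file name and the process_line() hook are placeholders. The current read position lives in $offset, so repeated calls only see lines added since the previous call:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $file   = "demo_sub.txt";
my $offset = 0;

sub check_file {
    return unless -e $file;
    open my $fh, '<', $file or die "Cannot open $file: $!";
    seek $fh, $offset, 0 or die "seek: $!";    # 0 = SEEK_SET
    while (my $line = <$fh>) {
        process_line($line);
    }
    $offset = tell $fh;    # remember where we stopped
    close $fh;
}

sub process_line {
    my ($line) = @_;
    print "processing: $line";    # stand-in for the real work
}

# ... scattered through the master script ...
open my $out, '>', $file or die $!;
print $out "hello\nworld\n";
close $out;

check_file();    # picks up "hello" and "world"
check_file();    # nothing new this time, reads nothing
```

Calling check_file() at convenient points is the simplest form of polling; the master script stays in control between calls.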
Re: Reading a file live!
by Laurent_R (Canon) on Mar 08, 2014 at 16:29 UTC
At least on Unix and Unix-like systems, it seems to me that your master script should not be blocked. Here is a quick test. First, the subprogram, which turns off IO buffering and writes a number every 3 seconds:
# sub_program
use strict;
use warnings;

{
    my $fh = select STDOUT;    # making STDOUT "hot" (no IO buffering)
    $| = 1;
    select $fh;
}

for my $i (0..100) {
    print "$i \n";
    sleep 3;
}
Now the master program, which every 10 seconds reads the full file produced by the subprogram and prints out the last line:
# main program
use strict;
use warnings;

system ("perl sub_program.pl > file.txt &");

my $last_line;
for my $i (0..100) {
    sleep 10;
    local @ARGV = ("file.txt");
    $last_line = $_ while <>;
    print "iteration $i : $last_line";
}
Now running the master program:
$ perl test_sub.pl
iteration 0 : 3
iteration 1 : 6
iteration 2 : 9
iteration 3 : 13
iteration 4 : 16
...
As you can see, the master program ("test_sub.pl") is perfectly able to read the file generated by the subprogram, even though the subprogram is still running. Maybe your problem is that you did not think about turning off IO buffering.