PerlMonks  

make infinite loop for live directory

by GHMON (Novice)
on Oct 27, 2019 at 16:38 UTC ( #11108019=perlquestion )

GHMON has asked for the wisdom of the Perl Monks concerning the following question:

Hi

I want to loop over a directory and process all the files in it, but another process is constantly writing new files into that directory, and I don't want to process files I have already handled.

Here is my code:

foreach my $file ( glob('/root/fd/fg/*.txt') ) {
    open (MYFILE, "<$file") or die $! ;
    my $ad ;
    my @rt ;
    while (<MYFILE>) {
        if ($_ =~ m/ad/) {
            ($_ =~ tr/a-z,=//d ) ;
            $ad = $_ ;
        }
        if ($_ =~ m/rt /) {
            ($_ =~ tr/a-z,=//d) ;
            push (@rt , $_);
        }
    }
}

Replies are listed 'Best First'.
Re: make infinite loop for live directory
by Your Mother (Archbishop) on Oct 27, 2019 at 17:09 UTC

    Probably what you want is to do something like you've already got, run once, while also setting up a watcher to handle only new/changed things. I have used Linux::Inotify2, and there are other packages like File::ChangeNotify; I don't have direct experience with that one, but I use it often in Catalyst testing with its tools.
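    A rough File::ChangeNotify sketch of that idea (untested; the directory path comes from the original post, and reacting only to 'create' events is an assumption — drop that check if you also want modified files):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::ChangeNotify;

# Watch the directory from the original post for new/changed .txt files.
my $watcher = File::ChangeNotify->instantiate_watcher(
    directories => ['/root/fd/fg'],
    filter      => qr/\.txt$/,
);

sub watch_forever {
    # wait_for_events blocks until something happens, then returns
    # a list of event objects with ->path and ->type.
    while ( my @events = $watcher->wait_for_events ) {
        for my $event (@events) {
            next unless $event->type eq 'create';
            my $file = $event->path;
            # process $file here, as in the original loop
        }
    }
}

# watch_forever();   # uncomment to start watching
```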

Re: make infinite loop for live directory
by tybalt89 (Prior) on Oct 27, 2019 at 17:15 UTC

    Something like this? Untested !!

    #!/usr/bin/perl
    use strict; # https://perlmonks.org/?node_id=11108019
    use warnings;

    my %processed;

    while( 1 ) {
        my %files = map +($_ => 1), glob '/root/fd/fg/*.txt';
        for my $file ( grep !$processed{$_}, keys %files ) {
            # open and process $file here
        }
        %processed = %files; # clean up %processed
        sleep 10;           # or some useful sleep time
    }

    Warning - still Untested !!

Re: make infinite loop for live directory
by Laurent_R (Canon) on Oct 27, 2019 at 21:31 UTC
    You could keep track of the time stamp of your last file fetch and retrieve only the files whose last-modification date is more recent.

    Of course, if the directory is very dynamic, you might get files that are still being written (depending on your OS: for example, it wouldn't be possible to read a file that is being written under VMS, but it is entirely possible under most Unix/Linux flavors). That's not the question you asked, but it is often possible to find a delay that will prevent working on files that are still being written.
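    A minimal core-Perl sketch of that timestamp approach (untested against the poster's data; the newer_files and poll_forever names and the 10-second sleep are this sketch's choices, and the directory path comes from the original post):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Files matching $pattern whose mtime is at or after $since.
sub newer_files {
    my ( $pattern, $since ) = @_;
    return grep { ( ( stat $_ )[9] // 0 ) >= $since } glob $pattern;
}

# Endless poll: remember when each scan started and only pick up
# files modified since the previous scan began.
sub poll_forever {
    my $last_scan = 0;
    while (1) {
        my $scan_started = time;
        for my $file ( newer_files( '/root/fd/fg/*.txt', $last_scan ) ) {
            # process $file here
        }
        $last_scan = $scan_started;
        sleep 10;    # or some useful sleep time
    }
}

# poll_forever();   # uncomment to start watching
```

    Note that a file modified exactly on the boundary is picked up twice rather than missed; deduplicate downstream if that matters.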

Re: make infinite loop for live directory
by Marshall (Canon) on Oct 29, 2019 at 06:42 UTC
    One way to accomplish this is with a Perl module that will notify you when a directory changes.
    Perhaps: File::Monitor?
    There could be other modules for this.

    You could keep a list of files that have been processed.
    When a new file appears, process that new one.

    Adjust your list of processed files as each new one comes in.
    There can be some issues with this; please explain your application in more detail.
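    A sketch of what the File::Monitor route might look like (untested; the 10-second interval is arbitrary, and the assumption that a scan's delta objects expose new files via files_created is mine — check the module's documentation):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Monitor;

# Watch the directory from the original post, tracking individual files.
my $monitor = File::Monitor->new;
$monitor->watch( { name => '/root/fd/fg', files => 1 } );
$monitor->scan;    # first scan only records a baseline, reports nothing

sub watch_forever {
    while (1) {
        sleep 10;
        for my $change ( $monitor->scan ) {
            for my $file ( $change->files_created ) {
                next unless $file =~ /\.txt$/;
                # process $file here
            }
        }
    }
}

# watch_forever();   # uncomment to start watching
```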

Re: make infinite loop for live directory
by cavac (Curate) on Oct 28, 2019 at 15:15 UTC

    First of all, I took the liberty of beautifying your code, removing the use of $_ in favour of named variables, and adding the use strict; use warnings; stuff that makes your code easier to debug. I also took the liberty of introducing the magic of use Carp; and use English;. Hope you are not angry about that ;-)

    One way to mark files as done is to add a second, empty ".done" file and check for that. If some other program removes the .txt files, you can just as easily check for ".done" files without a corresponding ".txt" file and remove them.

    Here is a simple version of that (untested, since I only had a few minutes and couldn't be bothered to recreate your directory structure):

    #!/usr/bin/env perl
    use strict;
    use warnings;
    use Carp;
    use English;

    # Do some work
    foreach my $file ( glob('/root/fd/fg/*.txt') ) {
        next if(-f $file . '.done'); # Skip if '*.done' file exists

        open(my $ifh, '<', $file) or croak($ERRNO);
        my $ad;
        my @rt;
        while((my $line = <$ifh>)) {
            if($line =~ m/ad/) {
                $line =~ tr/a-z,=//d;
                $ad = $line;
            }
            if($line =~ m/rt /) {
                $line =~ tr/a-z,=//d;
                push(@rt, $line);
            }
        }
        close $ifh; # Need to close file after working on it.

        open(my $ofh, '>', $file . '.done') or croak($ERRNO); # Create '*.done' file
        close($ofh);
    }

    # Cleanup
    foreach my $file ( glob('/root/fd/fg/*.txt.done') ) {
        my $realfname = $file;
        $realfname =~ s/\.done$//;
        if(!-f $realfname) {
            unlink $file;
        }
    }

    If no other program works with the files after your program, you might as well just rename them.

    #!/usr/bin/env perl
    use strict;
    use warnings;
    use Carp;
    use English;
    use File::Copy;

    # Do some work
    foreach my $file ( glob('/root/fd/fg/*.txt') ) {
        open(my $ifh, '<', $file) or croak($ERRNO);
        my $ad;
        my @rt;
        while((my $line = <$ifh>)) {
            if($line =~ m/ad/) {
                $line =~ tr/a-z,=//d;
                $ad = $line;
            }
            if($line =~ m/rt /) {
                $line =~ tr/a-z,=//d;
                push(@rt, $line);
            }
        }
        close $ifh; # Need to close file after working on it.

        # Rename/move file
        move($file, $file . '.done');
    }

    There is potentially another problem. If the external program that generates these files writes to a file while you are still reading it, you might get an incomplete read, which is damn hard to debug and can give you a lot of grey hairs. One workaround hack is to get the file list, then wait a few seconds, and then process the list. Something like this should work:

    ...
    # Do some work
    my @files = glob('/root/fd/fg/*.txt');
    sleep(4);

    foreach my $file (@files) {
    ...
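    Building on that, a slightly sturdier variant (my own sketch, not from the thread; the is_settled name and the 2-second delay are arbitrary choices) is to treat a file as complete only once its size stops changing between two checks:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Treat a file as "settled" only if its size is unchanged across
# a short delay (in seconds). A still-growing file fails the check
# and gets picked up on a later pass instead.
sub is_settled {
    my ( $file, $delay ) = @_;
    my $before = -s $file;
    return 0 unless defined $before;
    sleep $delay;
    my $after = -s $file;
    return defined $after && $before == $after;
}

foreach my $file ( glob('/root/fd/fg/*.txt') ) {
    next unless is_settled( $file, 2 );
    # safe(r) to open and process $file here
}
```

    This is still a heuristic, not a guarantee; a writer that pauses longer than the delay will slip through, so pick a delay that fits how your producer writes.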

    perl -e 'use MIME::Base64; print decode_base64("4pmsIE5ldmVyIGdvbm5hIGdpdmUgeW91IHVwCiAgTmV2ZXIgZ29ubmEgbGV0IHlvdSBkb3duLi4uIOKZqwo=");'

Front-paged by Corion