Re^2: Segmentation fault: problem with perl threads

by katharnakh (Novice)
on Sep 15, 2008 at 12:08 UTC


in reply to Re: Segmentation fault: problem with perl threads
in thread Segmentation fault: problem with perl threads

Hi,

Thanks for the quick reply. I am actually using the following modules:

use Log::Log4perl qw(get_logger);
use Log::Log4perl::Appender;
use Log::Log4perl::Layout;
use XML::Simple;
use File::stat;
use FileHandle;
use threads;
use Net::SFTP;

As for trying an upgraded version of Perl, I will have to look into it.

Thanks very much,
katharnakh.

Re^3: Segmentation fault: problem with perl threads
by zentara (Archbishop) on Sep 15, 2008 at 13:33 UTC
    That is a large number of modules that you require to be "thread-safe". Threads get an exact copy of the parent thread at the time of creation, and if for some reason (and it usually happens) a thread doesn't clean itself up completely, it will hang around wasting memory, which then gets incorporated into the next thread. Since you are not sharing data between threads in real time, and are apparently just logging to a file, you could easily switch to a forked solution and save all the hassles you are experiencing with threads.

    I would be particularly worried about the thread-safety of Net::SFTP; a quick Google search for "Net::SFTP thread safety" indicates it is not safe for thread usage.
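    For what it's worth, a minimal sketch of the forked approach with Net::SFTP might look like the following; the host names, login details, and file paths here are placeholders, not anything from your post:

    use strict;
    use warnings;
    use Net::SFTP;

    # each child gets its own Net::SFTP connection,
    # so nothing needs to be thread-safe
    my @hosts = ('host1.example.com', 'host2.example.com');

    for my $host (@hosts) {
        my $pid = fork;
        die "fork failed: $!" unless defined $pid;
        next if $pid;    # parent: go start the next child

        # child: do the transfer, then exit without
        # falling back into the loop
        my $sftp = Net::SFTP->new($host,
            user     => 'someuser',
            password => 'secret');
        $sftp->get('/remote/app.log', "/local/$host-app.log");
        exit 0;
    }
    1 while wait() != -1;    # parent reaps all the children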


      Hi,

      Could you please point me to some reading on the forked way of doing parallel processing?

      Thanks,

      katharnakh.

        All you do is drop your thread code block into a forked child's code block. It is not recommended, but the simplest form is:
        if(fork() == 0){exec("command")}
        but that doesn't watch for zombies, or limit the number of forks at any one time.

        Usually, you do something like:

        #!/usr/bin/perl
        use strict;
        use warnings;

        # There is a limit to the number of child processes you can
        # have, or should want, so big jobs may require the kind of
        # throttling Parallel::ForkManager gives you.
        # The recipe for 100 processes:

        my @cmd = ('echo', 'hello');    # placeholder: put your real command here

        # avoid zombies
        $SIG{CHLD} = 'IGNORE';          # check if it works on your system

        for (1..100) {
            my $pid = fork;
            next if $pid;                           # in parent, go on
            warn($!), next if not defined $pid;     # in parent, fork errored out
            exec @cmd;                              # in child, go do @cmd
                                                    # and don't come back
        }
        There are a bunch of recipes around for limiting the number of forks running at any one time, but you are best off using Parallel::ForkManager. See: controlling child processes
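        Here is a short sketch of the usual Parallel::ForkManager pattern, with a list of numbered jobs standing in for whatever work you need to parallelize:

        use strict;
        use warnings;
        use Parallel::ForkManager;

        my $pm = Parallel::ForkManager->new(5);    # at most 5 children at once

        for my $job (1 .. 100) {
            $pm->start and next;    # forks; parent moves on to the next job

            # ... child's work goes here, e.g. one transfer or one command ...
            print "child $$ handling job $job\n";

            $pm->finish;            # child exits, freeing a slot
        }
        $pm->wait_all_children;     # parent waits for the stragglers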

