Forking and shelling out to curl

by Argel (Prior)
on Mar 26, 2019 at 01:50 UTC

Argel has asked for the wisdom of the Perl Monks concerning the following question:

Hello fellow Monks! I'm trying to write a daemon that runs a curl command.
use IO::Socket;
use POSIX qw(WNOHANG setsid);

sub daemonize {
    $SIG{CHLD} = 'IGNORE';                  # Configure to autoreap zombies
    die "Can't fork" unless defined( my $child = fork );   # FORK <<<<<<<<<<<<
    CORE::exit(0) if $child;                # Parent exits
    setsid();                               # Become session leader
    open( STDIN,  "</dev/null" );           # Detach STDIN from shell
    open( STDOUT, ">/dev/null" );           # Detach STDOUT from shell
    open( STDERR, ">&STDOUT" );             # Detach STDERR from shell
    chdir '/tmp';                           # Change working directory
    umask(0);                               # Reset umask
    $ENV{PATH} = '/bin:/sbin:/usr/sbin';    # Reset PATH
}
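
For context, the overall structure is roughly this (a simplified outline; run_checks is just a placeholder name for the sub that builds the curl command shown further down):

# Simplified outline of the daemon
daemonize();
while (1) {
    run_checks();   # placeholder: builds the curl command and calls cmd_wrapper()
    sleep 3;        # poll interval between runs
}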

Looks like my script is hanging when I daemonize it. I'm logging to a CSV file, and I finally added some tracing code to track down where it hangs, and it looks like it's when I shell out.

sub cmd_wrapper {
    my $cmd = shift;
    my @out;
    eval {
        local $SIG{ALRM} = sub { die "alarm\n" };
        alarm 15;
        ## (CSV output logged) ##
        @out = qx[$cmd];    #<<<<< Shell out
        ## (CSV output *NOT* logged) ##
        alarm 0;
    };
    return $@ if $@ ne "";
    return wantarray ? @out : join '', @out;
}
What's even stranger is that if I uncomment my $? == -1 check in the subroutine that calls cmd_wrapper, I see log entries indicating the command failed (even though I appear to be getting valid results back; I'm guessing $? doesn't work well when forking). But I'm at a loss to explain how code appearing after the shell-out affects the shell-out's behavior.
## Run the Curl command ##
my $restURL = $opt_ref->{restURL} . $opt_ref->{pagingURL};
say "DEBUG: \$restURL: \]$restURL\[" if $cfg{debug} or $cfg{test};
my $cmd = "$curlCommand -X $opt_ref->{method} $restURL";
my $results = cmd_wrapper( $cmd, $opt_ref, $info_ref, $cfh{csv_export}, \@rpt_hdr, 1 );
say 'DEBUG: $results: ', Dumper $results if $cfg{debug}; #or $cfg{test} > 1;
$info_ref->{name}       = $?;
$info_ref->{primary}    = $cmd;
$info_ref->{status_msg} = $results;

#=pod
# Uncommented I see results, Commented out cmd_wrapper seems to hang.
## CASE: Connection issue
if( $? == -1 ) {    # WARN: Will this still work now that I'm calling cmd_wrapper()???
    $info_ref->{entry_level} = 'DEBUG';
    $info_ref->{status}      = 'runCurl connection failed';
    csv_log( $opt_ref, $info_ref, $cfh{csv_export}, \@rpt_hdr, 1 );
    return;    #<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<<< RETURN
}
#=cut

Am I missing something simple? Something with buffering? Or hitting an optimization problem? Something to do with that return? Or....?

This is perl 5, version 16, subversion 3 (v5.16.3) built for x86_64-linux
Red Hat Enterprise Linux Server release 5.11 (Tikanga)

Elda Taluta; Sarks Sark; Ark Arks
My deviantART gallery

Re: Forking and shelling out to curl
by ikegami (Pope) on Mar 26, 2019 at 05:07 UTC

    Why not access libcurl via Net::Curl instead of via curl?
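
    Untested sketch of what that might look like, reusing the $restURL and $opt_ref->{method} variables from your code above (assumes a Net::Curl new enough to build against your libcurl):

        use Net::Curl::Easy qw(:constants);

        my $easy = Net::Curl::Easy->new();
        $easy->setopt( CURLOPT_URL,           $restURL );
        $easy->setopt( CURLOPT_CUSTOMREQUEST, $opt_ref->{method} );   # your -X value

        my $body = '';
        $easy->setopt( CURLOPT_WRITEFUNCTION, sub {
            my ( $e, $chunk ) = @_;
            $body .= $chunk;
            return length $chunk;          # tell libcurl we consumed the whole chunk
        } );

        eval { $easy->perform() };         # perform() dies on transport errors
        if ( $@ ) {
            warn "curl error: $@";
        }
        else {
            my $code = $easy->getinfo( CURLINFO_RESPONSE_CODE );
            # $body now holds the response -- no fork, no $? to worry about
        }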

      Thanks! We have a bug that we're trying to monitor for, and I'm trying to mash together a modified perl+curl script I inherited with a monitoring script I wrote over a decade ago. So ugly-but-working is the bar I'm trying to hit. I'll take a look at Net::Curl -- my call to curl is in one sub, so I may be able to drop this in without too much pain. I'll update you on how it goes!


        Looks like our libcurl is too old (3.x). I started looking into LWP::UserAgent, but the SSL headers are not on the server, which I assume rules out just about every other https module. :( :/

        So I went back to figuring out why shelling out was causing problems. It looks like a large part of my problem was a sleep 300 that was supposed to be a sleep 3 for testing, and the rest was my $? check, which doesn't work as expected when forking (or maybe it's tied to signal handling). The downside of not having a second pair of eyes to spot-check the code.
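
        For anyone who finds this later: my best guess (untested beyond this script) is that the $? weirdness comes from the $SIG{CHLD} = 'IGNORE' set in daemonize(). With CHLD ignored, the kernel reaps the child itself, so the waitpid inside qx// has nothing to collect and $? comes back as -1 even when the output is fine. Something like this around the shell-out in cmd_wrapper should restore it:

            {
                local $SIG{CHLD} = 'DEFAULT';   # let qx()'s own waitpid reap the curl child
                @out = qx[$cmd];
                # with CHLD back at its default, $? should hold curl's real exit status
            }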

        Thanks for the help though! Trying to get Net::Curl and then LWP::UserAgent working let me come back to the code with a fresh set of eyes.

        EDIT: For some reason, nothing gets logged the first time through the main while(1) loop, so the sleep made it look like the script was hanging, and I ended up chasing the wrong issue.

