Re^3: perl wkhtmltopdf Error: Unable to write to destination

by djlerman (Beadle)
on May 09, 2014 at 18:25 UTC ( [id://1085607] )


in reply to Re^2: perl wkhtmltopdf Error: Unable to write to destination
in thread perl wkhtmltopdf Error: Unable to write to destination

Here is the final working solution I came up with. If there is anything I missed, please let me know. I am not sure I am using gensym() correctly. The main issue turned out to be access rights (permissions) on the server side.

#!/usr/bin/perl
use warnings;
use strict;
use IPC::Open3;
use Symbol;

# Read HTML from STDIN and write the PDF to STDOUT ("- -")
my $cmd = '/usr/local/bin/wkhtmltopdf - -';

my $err = gensym();
my $in  = gensym();
my $out = gensym();
my $pdf = '';

my $pid = open3($in, $out, $err, $cmd)
    or die "could not run cmd : $cmd : $!\n";

my $string = '<html><head></head><body>Hello World!!!<br /><br /><br />
IMG: <img id="image" src="data:image/gif;base64,R0lGODlhFwAPAKEAAP///wAAAMzMzLi3tywAAAAAFwAPAAACQIyPqQjtD98RIVpJ66g3hgEYDdVhjThyXSA4aLq2rgp78hxlyY0/ICAIBhu/HrEEKIZUyk4R1Sz9RFEkaIHNFgAAOw==" />
</body></html>';

# Feed the HTML to wkhtmltopdf and close its STDIN so it starts rendering
print $in $string;
close($in);

# Collect the generated PDF from the child's STDOUT
while( <$out> ) { $pdf .= $_ }

# for troubleshooting
while( <$err> ) {
    # print "err-> $_<br />\n";
}

# for troubleshooting
waitpid($pid, 0) or die "$!\n";
my $retval = $?;
# print "retval-> $retval<br />\n";

# Send the PDF back to the browser as a download
print "Content-Disposition: attachment; filename='testPDF.pdf'\n";
print "Content-type: application/octet-stream\n\n";
print $pdf;
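On the gensym() question: as I understand the IPC::Open3 documentation, open3() will autovivify undefined lexical scalars for the child's STDIN and STDOUT handles, but the STDERR argument must already be a real glob, otherwise the child's STDERR is merged into its STDOUT. That is exactly what gensym() from Symbol provides, so using it for all three handles (as above) is harmless; the one that actually needs it is $err. A minimal sketch of just that piece:

use IPC::Open3;
use Symbol qw(gensym);

# $in and $out may be plain undefined scalars; open3() autovivifies them.
# $err must already be a glob, or the child's STDERR is merged into STDOUT.
my ($in, $out);
my $err = gensym();
my $pid = open3($in, $out, $err, '/usr/local/bin/wkhtmltopdf - -');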

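Since the real blocker was permissions on the server side, a small pre-flight check might have surfaced that faster. This is my own addition rather than part of the working script, and the path is simply the one assumed above:

#!/usr/bin/perl
use strict;
use warnings;

# Confirm the web server user can actually see and execute the binary
my $cmd_path = '/usr/local/bin/wkhtmltopdf';
die "wkhtmltopdf not found at $cmd_path\n"                unless -e $cmd_path;
die "wkhtmltopdf not executable by this user (uid $<)\n"  unless -x $cmd_path;
print "ok: $cmd_path is executable\n";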