Unless your FTP session has some means to unpack, say, a .tar file, I think you're kind of SOL. Perhaps your web admin would be receptive to installing the module for you, or at least to unpacking a .tar file? Bundle up the installed XML::DOM (or perhaps just a subset of your perl5 lib directory, making sure you include the other modules that XML::DOM requires).
If you're really, really desperate, you can probably build a perl lib directory on the server by hand with FTP commands, uploading each of the files that XML::DOM installed. This assumes the target OS is compatible with the source OS. You might be able to write a Perl script to do this for you, and even then there's no guarantee it'll work.
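For the desperate route, a mirroring script might look something like the sketch below. The hostname, login, and both paths are placeholders for your own setup, and it only helps if the server is binary-compatible (or the modules are pure Perl):

```perl
#!/usr/bin/perl
# Sketch: mirror a locally installed lib tree to the server over FTP.
# Hostname, credentials, and both root paths are placeholders.
use strict;
use warnings;
use File::Find;
use Net::FTP;

my $local_root  = '/usr/lib/perl5/site_perl';  # where XML::DOM landed locally
my $remote_root = 'perl-lib';                  # target directory on the server

my $ftp = Net::FTP->new('ftp.example.com') or die "connect: $@";
$ftp->login('user', 'password') or die "login: ", $ftp->message;
$ftp->binary;   # any .so or .bs files must go up as binary

find(sub {
    return unless -f $File::Find::name;
    (my $rel = $File::Find::name) =~ s{^\Q$local_root\E/}{};
    $ftp->mkdir("$remote_root/$1", 1) if $rel =~ m{^(.*)/};  # recursive mkdir
    $ftp->put($File::Find::name, "$remote_root/$rel")
        or warn "put $rel failed: ", $ftp->message;
}, $local_root);

$ftp->quit;
```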
Another option might be to use some form of RPC plus Storable with a third-party system that is more flexible about what it installs. Have your CGI execute a procedure on the remote system to do all the weird parsing stuff, and have that system return data structures representing the results. This is, in my opinion, a relatively horrible way to solve your problem, but it is a way.
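The CGI's side of such an arrangement could be as simple as the sketch below; the host, port, and one-line request format are all invented for illustration:

```perl
# Sketch of the client side: the CGI asks a friendlier remote box to do
# the XML parsing and ship back a plain data structure via Storable.
# The host, port, and request format here are made up for illustration.
use strict;
use warnings;
use IO::Socket::INET;
use Storable qw(thaw);

my $sock = IO::Socket::INET->new(
    PeerAddr => 'parser.example.com',   # the more flexible third-party system
    PeerPort => 9000,
    Proto    => 'tcp',
) or die "connect: $!";

print $sock "PARSE /data/feed.xml\n";   # ask the remote end to do the parsing
local $/;                               # slurp the frozen reply whole
my $result = thaw(<$sock>);             # back to a native Perl structure
close $sock;
```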
You could possibly have a CGI script for file uploads. If Archive::Tar is available on the system, it would make things easy; otherwise you would have to pipe the data through tar and gunzip with IPC::Open2 or, if that is unavailable, use temporary files. Once you have the archive unpacked, you could possibly (still within the upload CGI) run make within that directory. Then you could move the directory to where you want it and point your scripts at it with use lib.
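Assuming Archive::Tar really is available, the unpack-and-build step inside the upload CGI might look like this sketch; the archive name and directories are placeholders:

```perl
# Sketch of the unpack-and-build step inside the upload CGI, assuming
# Archive::Tar is installed; the archive name and paths are placeholders.
use strict;
use warnings;
use Archive::Tar;

my $upload  = '/home/mine/tmp/XML-DOM-1.44.tar.gz';  # the file just uploaded
my $workdir = '/home/mine/tmp/build';

chdir $workdir or die "chdir: $!";
Archive::Tar->extract_archive($upload)   # extracts into the current directory
    or die "extract failed: ", Archive::Tar->error;

chdir 'XML-DOM-1.44' or die "chdir: $!";
system($^X, 'Makefile.PL', 'PREFIX=/home/mine/perl-mods') == 0
    or die "Makefile.PL failed";
system('make')         == 0 or die "make failed";
system('make install') == 0 or die "make install failed";
```

This only works if the web server's user is allowed to run make at all, which varies from host to host.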
Well, I'll just point out that anything you can do from the shell, you can do from Perl with system(), backticks, or a piped open(). Whether that's a good solution to this problem, I can't say. :)
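For instance, here are all three shell escapes applied to the same job (the tarball name is just an example):

```perl
# Three ways to run the same shell work from Perl; the filename is an example.
use strict;
use warnings;

# 1. system(): run it, check the exit status.
system('tar', 'xzf', 'XML-DOM-1.44.tar.gz') == 0
    or die "tar failed: $?";

# 2. Backticks: capture the output in one go.
my $listing = `tar tzf XML-DOM-1.44.tar.gz`;

# 3. Piped open(): stream the output a line at a time.
open my $pipe, '-|', 'tar', 'tzf', 'XML-DOM-1.44.tar.gz'
    or die "open: $!";
while (my $entry = <$pipe>) { print $entry }
close $pipe;
```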
This was the shortest and imho the best answer of any of them (TMTOWTDI!). Yes, it definitely can be done. My node here shows my results.
For those who are reading after viewing that node, I will note that I spent a fair amount of time setting up directories on the FTP server for my web site and making the initial uploads of <CITE>pmake</CITE> and the <CITE>Make.pm</CITE> module. This isn't a quick fix, but once you get it set up it may well serve you.
Intrepid
I had to deal with something similar for a site I've worked on. This only works for compiled modules where your machine is binary-compatible with the server, though.
I made a directory under my FTP root called perl-mods. I installed the modules on my computer, then FTP'd up the ones I needed into my perl-mods directory. You must be careful to preserve the paths so they work right, of course. My computer ran Red Hat 6.2 and so did the server, so even the more complicated modules compiled the same. For pure-Perl modules this doesn't matter.
Then, at the top of all my scripts, I put a use lib '/home/mine/perl-mods' to use that directory. Of course, I know that is where my home is on the server because the admin was cordial enough to at least tell me that. If you don't know the path to your perl-mods directory, you have a harder time. I think you could put perl-mods underneath your cgi-bin (/cgi-bin/perl-mods/) and then do a use lib './perl-mods'. I think this would work since it's a relative path; I know it works with an absolute pathname, and it should work for a relative one too. Not quite the most secure, I guess, but it might be the best you can get.
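Putting that together, the top of a script using a private module directory might look like this (both paths are examples; substitute whatever your host gives you):

```perl
#!/usr/bin/perl
# Sketch: point a CGI script at a private module directory.
# Both paths below are examples for your own setup.
use strict;
use warnings;

# Absolute path, if the admin told you where home is:
use lib '/home/mine/perl-mods';

# Or relative to the CGI's working directory (less robust -- it depends
# on the server chdir'ing into cgi-bin before running the script):
use lib './perl-mods';

use XML::DOM;   # now found under perl-mods
```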
Something else that I've used - which is a little more work - is to install the module locally and then recreate the directory structure on the server using FTP. Obviously this is only manageable for reasonably simple modules and won't work at all for modules with an XS component.
--
<http://www.dave.org.uk>
"Perl makes the fun jobs fun
and the boring jobs bearable" - me