 
PerlMonks  

Automatic packaging of multiple perl-modules in a major bundle

by poulhs (Beadle)
on Nov 11, 2010 at 22:59 UTC ( [id://870948]=perlquestion )

poulhs has asked for the wisdom of the Perl Monks concerning the following question:

Hi

I am responsible for a Perl installation, including modules and best practices.
This perl is specific to our operational scripts (3000+ servers, several OSes).
We do automatic testing (with Hudson continuous integration) and packaging (.rpm, .pkg, .deb and .msi).

I intend to select a bunch of Perl modules, create a Bundle:: (or rather, I assume it will be a Task::) and create a single package (or rather: one for each architecture/OS). This task/bundle should be automatically rebuilt with the latest and greatest (but not installed unless all unit tests and other tests succeed).

I did something like this several years ago, using CPAN and a Bundle::, although it was done manually.
Things have evolved since, so I am looking into cpanp and Module::Install, something like: cpanp i Task Module::Install Module::AutoInstall followed by perl Makefile.PL && make && make test && make DESTDIR=/wrk/OS/ARCH/... install; with a Makefile.PL like:

    use inc::Module::Install;

    include 'Module::AutoInstall';

    requires 'DateTime'      => 0;
    requires 'Task::Moose'   => 0;
    requires 'Log::Log4perl' => 0;
    # ...

    auto_install();
    WriteAll;
(all with proper settings of PATH, PERL5LIB, DESTDIR and friends).
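The per-platform build-and-stage sequence described above can be sketched as a small shell helper. The /wrk/OS/ARCH layout and the function names are placeholders, not the poster's actual setup:

```shell
#!/bin/sh
# Hedged sketch of the per-platform build/stage loop the post describes.
# OS, ARCH and the /wrk/... layout are placeholders.

dest_for() {
    # staging root for one os/arch pair, handed to "make install" as DESTDIR
    printf '/wrk/%s/%s' "$1" "$2"
}

build_module() {
    os=$1; arch=$2; srcdir=$3
    cd "$srcdir" || return 1
    perl Makefile.PL \
        && make \
        && make test \
        && make DESTDIR="$(dest_for "$os" "$arch")" install
}
```

The DESTDIR staging step is what lets the result be wrapped afterwards into a native package (.rpm, .pkg, .deb, .msi) per platform.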

One major question: Is Module::Install a fair solution, or do you recommend other modules or ways of doing this?

  • I know I need to check out INSTALL/README for whatever I drag along, but that would be much easier when I know what has been chosen indirectly.
  • I also intend to collect all ./t/ directories for a complete master unit-test.

I'm looking for hints/ideas and experiences, as I can find no "best-practices" nor any "HOWTO" on this topic.

Any constructive feedback (and some of the destructive, too) will be very much appreciated!

Replies are listed 'Best First'.
Re: Automatic packaging of multiple perl-modules in a major bundle
by afoken (Chancellor) on Nov 12, 2010 at 13:24 UTC

    Would autobundle help?

    Alexander

    --
    Today I will gladly share my knowledge and experience, for there are no sweeter words than "I told you so". ;-)
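For context, autobundle in the CPAN shell snapshots every currently installed module into an auto-generated Bundle:: file. A sketch, with the actual invocation left as a comment (CPAN.pm may want to configure itself interactively on first use) and a hypothetical helper showing the typical Snapshot naming under ~/.cpan/Bundle:

```shell
#!/bin/sh
# Sketch: autobundle writes a Bundle:: module listing all installed modules,
# typically named Snapshot_YYYY_MM_DD_NN.pm under the cpan Bundle directory.
# snapshot_path is a hypothetical helper that just builds that path.

snapshot_path() {
    # $1 = YYYY_MM_DD date stamp, $2 = sequence number
    printf '%s/.cpan/Bundle/Snapshot_%s_%02d.pm' "$HOME" "$1" "$2"
}

# interactive use (CPAN.pm may prompt for configuration first):
#   perl -MCPAN -e 'CPAN::Shell->autobundle'
```

The generated file's "=head1 CONTENTS" pod section lists one installed module per line, which is exactly the inventory the poster wants for chasing down READMEs.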

      autobundle is very useful - it will give me a list of all installed modules, so I can check out the READMEs (and see what is installed). It doesn't actually help with the installation when prerequisites change, but it is what I did before, and what I may end up with this time.

      As mentioned, I've been looking into Module::Install, which may or may not be the way. Right now, I'm trying out a loo*ng cpanp i ... command line, and that seems feasible (except for the usual oddities that always show up and need digging).

Re: Automatic packaging of multiple perl-modules in a major bundle
by aquarium (Curate) on Nov 12, 2010 at 02:22 UTC
    a previous company i worked for provided one install/upgrade self-extracting binary for unix (all flavors) and another for windows server. all the custom-provided perl modules were cross-platform and deployed during install or upgrade into our main software directory, which was kept separate from the perl installation itself. the many perl programs/scripts all followed a standard of pushing the custom perl module directory onto @INC before any "use" statements. this minimizes the need to muck about with the system environment. we never made use of custom modules outside the standard supplied (proprietary build) perl that would require XS/binary code compiled for separate platforms.
    at some stage a proprietary (but fully scripted) patching system was also developed, to deliver small but important fixes between major upgrades. naturally there's plenty of sanity code there to make sure it does the right thing or it stops..and you have a full trace of all that was done to that point, and an automated prior backup.
    i'm not entirely certain that absolutely everything was best practice..but these systems generally fared well, and without saying figures, if i sold the system and kept the money for myself, i could live without working for life.
    all the perl code was structured in certain ways. the code was easy to read/develop/maintain, without any OO or other fancy code that could possibly work differently across platforms. years ago DBI was dropped as a database access layer from perl, as it was emerging that some bits worked differently on different platforms. a proprietary kind of DBI was established that produced same results across the platforms and the compatible DBs. quite different to SQL though. every command that accessed the database, ended up being a set of internal commands that a central API server processed and logged...be it the web interface, a staff interface, or data access through perl scripts.
    quite clever really....hope this helps provoke some thought regards system structure.
    the hardest line to type correctly is: stty erase ^H

      Thanks for the response, but the problem is the build process, not the packaging and distribution. The main question is how to do rebuilding very frequently with little manual intervention.

        you probably can't change mid-stream now, but my point was that because the custom modules sit separately and are architecture-agnostic(?), we had merely a tar archive of the custom modules, so we didn't need to make .msi, .deb, .rpm, etc.
        as for the main embedded perl distribution: that was also essentially a .tar archive, with a little bit of installation script that worked across all platforms, because on windows we had cygwin. All in all, for the server side of things, for install or major upgrade there was one package for linux, one for unix (only some internal differences from linux), and one for windows. But all of these followed the same pattern of using the least common denominator (e.g. tar) to achieve the desired outcomes. The install/upgrade script was exactly the same for all platforms. the only real difference between the distributions was that the install/upgrade couldn't assume cygwin & perl were installed, so for windows these were bootstrapped into the .exe. also the data api commands and server daemons were pre-compiled, so this /bin directory was different for linux/unix/windows.
        the hardest line to type correctly is: stty erase ^H
SUMMARY: Automatic packaging of multiple perl-modules in a major bundle
by poulhs (Beadle) on Dec 01, 2010 at 22:56 UTC

    I did not find a solution to this problem, but I guess that the reason is that there are too many modules which require too much interaction (they ask questions, don't compile cleanly on Solaris, require GNU tar, or the like).

    I did end up with a long list of modules to install, and a perl wrapper setting up the environment and doing iterative cpanp installs, with some helper methods (e.g. Net::SSLeay asks Do you want to run external tests?, and the helper answers yes\n ).
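A sketch of answering such prompts non-interactively. The environment variable trick assumes the prompts come from ExtUtils::MakeMaker-style prompt() calls, which honour PERL_MM_USE_DEFAULT; the poster's actual helper matches specific prompt texts instead. The cpanp invocation is guarded so the snippet is a no-op where CPANPLUS is not installed:

```shell
#!/bin/sh
# Sketch of forcing non-interactive module builds; an approximation of the
# poster's helper, not the real thing.
export PERL_MM_USE_DEFAULT=1       # MakeMaker-style prompts take their defaults

if command -v cpanp >/dev/null 2>&1; then
    printf 'yes\n' | cpanp i Net::SSLeay   # pipe the answer the module expects
fi
```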

    I will elaborate on this script, as I hope for automatic installation of the same bunch of modules on several different platforms, but for now I will have to settle for cpanp -o followed by a number of cpanp -i invocations.
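That closing workflow can be sketched as a tiny loop; the module list is hypothetical, and the whole thing is guarded so it is a no-op where cpanp is absent:

```shell
#!/bin/sh
# Sketch of "cpanp -o followed by a number of cpanp -i": report what is
# outdated, then (re)install a pinned set of modules. List is illustrative.
MODULES="DateTime Log::Log4perl Task::Moose"

if command -v cpanp >/dev/null 2>&1; then
    cpanp -o                   # list installed modules with newer CPAN releases
    for m in $MODULES; do
        cpanp -i "$m"          # install/update each module in turn
    done
fi
```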

    thanks

Node Type: perlquestion [id://870948]
Approved by Corion
Front-paged by Corion