$ vi my-application.pl
$ mkdir lib
$ mkdir lib/File
$ cp /usr/lib/perl5/site_perl/5.6.0/File/Tail.pm lib/File
$ tar cf - . | gzip >my-application-archive.tar.gz
As long as a module doesn't build itself in an architecture-specific way (note the lack of an architecture-specific directory in the library path), and it's safe to assume the users are all running the same version of Perl (File::Tail went into 5.6.0 here, but who knows, it might run fine under 5.005 too), you can copy the module into a local directory and add use lib 'lib'; to reference that directory. Then pack that directory into the archive you use to distribute your application. Voila! You now have a self-contained application that uses non-core Perl modules. I would assume this could easily be adapted for Win32 (using .zip files) as well.
If you were planning ahead, you would build the modules into your local lib directory ahead of time, and make sure all of their prerequisites made their way in there as well.
Is there something inherently wrong with this approach? This is what I always mean when I say it's usually trivial to extend an existing distribution mechanism to include the Perl modules that it needs.
In addition, if your application is going to be used on a multitude of Perl versions and platforms, carry your package to each of a set of sample systems it will run on and reinstall your modules there. They will all neatly go into their appropriate architecture- and version-specific directories, which will be all but ignored by architectures and versions that don't use them. Pack the whole thing up again and carry it to the next system, or repeat the process with all of the other Perl versions installed on that system that you'll need.