http://www.perlmonks.org?node_id=1009550


in reply to Redefine package subroutines

"LP needs to have its routines reloaded during the program's duration"

You could use Class::Unload to unload the LP package before loading it again.
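A minimal sketch of what that looks like (Class::Unload is a CPAN module; here LP is defined inline to stand in for your LP.pm, and the reload step at the end is commented because it assumes a real LP.pm file on disk):

```perl
use Class::Unload;

# Stand-in for a previously loaded LP.pm:
{ package LP; sub routine { "old behaviour" } }

my $before = LP->routine;           # "old behaviour"

Class::Unload->unload('LP');        # wipe LP's symbol table
my $gone = !LP->can('routine');     # true: the sub no longer exists

# In the real program LP lives in LP.pm, so to reload it:
#   delete $INC{'LP.pm'};   # make sure Perl forgets it was loaded
#   require LP;             # re-read the (possibly edited) source
```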

That said, the entire architecture you're describing sounds weird. I'm pretty sure object orientation, inheritance and dependency injection could solve your underlying problem without any of this FP/LP/GP stuff, but you've not described that problem in anywhere near enough detail.

"Another nice feature (although not essential) would be to allow the user NOT to specify the package name, and have the program default to FP::routine if it exists, then LP::routine if it exists, before finally defaulting to GP::routine (and returning an error if the routine doesn't exist in any package). I would also need the user to be able to over-ride this by explicitly specifying the package..."

Write all your functions as class methods. That is, functions which are called like:

Package->function(@args);   # like this
Package::function(@args);   # not this!

That way you can create an uber package like this:

{
    package AllP;
    use base qw( FP LP GP );
}

Then calling AllP->function will automatically search FP, LP and GP in that order.
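A self-contained sketch of that search order, plus the explicit-override and error-reporting requirements from the question (the sample methods and the dispatch() helper are my own illustration, not part of any module; can() is the standard way to ask whether a method is resolvable):

```perl
{ package GP;  sub hello { "GP" }  sub only_gp { "GP only" } }
{ package LP;  sub hello { "LP" } }
{ package FP;  }
{ package AllP; our @ISA = qw( FP LP GP ); }

my $x = AllP->hello;      # FP has no hello, so LP::hello wins -> "LP"
my $y = AllP->only_gp;    # falls all the way through to GP -> "GP only"

# Explicit override: let the user name a package directly, and use
# can() to report an error if the routine exists nowhere in the chain.
sub dispatch {
    my ($routine, $pkg, @args) = @_;
    $pkg ||= 'AllP';                       # default: search FP, LP, GP
    my $code = $pkg->can($routine)
        or die "No routine '$routine' in $pkg or its parents\n";
    return $pkg->$code(@args);
}

my $z = dispatch('hello', 'GP');          # user forces GP -> "GP"
```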

perl -E'sub Monkey::do{say$_,for@_,do{($monkey=[caller(0)]->[3])=~s{::}{ }and$monkey}}"Monkey say"->Monkey::do'

Re^2: Redefine package subroutines
by tobyink (Canon) on Dec 19, 2012 at 12:15 UTC

    OK, let's have a guess at the kind of thing you're doing. It probably differs in the details.

    You have a script that needs to process a large directory tree; perhaps to perform backups or some other automated process. Some directories contain large text log files, which should be compressed before backing up, and once they're backed up they're never going to be modified so this knowledge allows us to shortcut a lot of processing. Other directories contain just images; these need to be backed up by uploading them to Flickr. And so on.

    The way I'd approach this might be to keep a set of modules like this:

    {
        package Backup::General;

        sub backup_file {
            my ($class, $file) = @_;
            ...
        }

        sub backup_dir {
            my ($class, $dir) = @_;
            opendir my $d, $dir;
            while (readdir $d) {
                next if /^\./ or -d "$dir/$_";
                $class->backup_file("$dir/$_");
            }
        }
        ...
    }

    {
        package Backup::ImageDir;
        use parent 'Backup::General';

        sub backup_file {
            my ($class, $file) = @_;
            $class->upload_flickr($file) if $file =~ /\.jpeg$/i;
            $class->SUPER::backup_file($file);
        }

        sub upload_flickr { ... }
        ...;
    }

    {
        package Backup::LogDir;
        use parent 'Backup::General';
        sub backup_dir { ... }
        ...
    }

    Then each directory being backed up could contain a file backup.ini which looked something like this:

    backup_class = "Backup::ImageDir"
    

    My backup script would walk through the directories and for each:

    use Config::Tiny;
    use Module::Runtime qw( use_module );

    my $config = Config::Tiny->read("$dir/backup.ini") || {};
    my $class  = $config->{_}{backup_class} || 'Backup::General';
    use_module($class)->backup_dir($dir);

    That way, each directory decides for itself what module will handle its backups. There's no central list of backup modules; no limit to the number of different modules to choose from; the modules are loaded on demand as required.

    This seems to be the sort of thing you want: general behaviour which can be overridden locally, based on factors determined at runtime.
