
RE: RE (tilly) 1: package namespace manipulation

by knight (Friar)
on Nov 14, 2000 at 23:51 UTC ( [id://41631] )


in reply to RE (tilly) 1: package namespace manipulation
in thread package namespace manipulation

For a lot of reasons that I won't bore you with, I have to "do" a number of scripts in the same package, but protect them from stomping on each other. The logical flow is:
package my_package;

for my $script (@scripts) {
    do $script;
    $vars{$script} = NameSpace::save('my_package');
    NameSpace::delete('my_package');
}

# additional processing here

for my $script (@scripts) {
    NameSpace::restore('my_package', $vars{$script});
    # additional script-specific processing
}
I don't see how you'd accomplish this with local, but if you know of a way, I'm all ears.
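
For anyone who doesn't have the root node handy, here is a minimal sketch of the shape such routines can take (a guess for illustration, not the code from the root of this thread): a package's symbol table is just the hash %my_package::, so save/delete/restore amounts to copying that hash aside, emptying it, and copying it back.

package NameSpace;
use strict;

# Guessed-at illustration only; the real routines live in the root node.
sub save {
    my ($pkg) = @_;
    no strict 'refs';
    return { %{"${pkg}::"} };      # shallow copy of the package's stash
}

sub delete {
    my ($pkg) = @_;
    no strict 'refs';
    %{"${pkg}::"} = ();            # wipe every symbol in the package
}

sub restore {
    my ($pkg, $saved) = @_;
    no strict 'refs';
    %{"${pkg}::"} = %$saved;       # put the saved globs back
}

1;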

Replies are listed 'Best First'.
RE (tilly) 3: package namespace manipulation
by tilly (Archbishop) on Nov 15, 2000 at 06:23 UTC
    I thought it was as easy as:
    local *private::package::;
    # time passes
    $saved{$some_name} = \*private::package::;
    but I was wrong.

    As for the underlying question though, your description of what you are doing sets off warning flags for me. Why not scoop out the body of the scripts into modules, turn the scripts into wrappers, and then call the module in your code? What if a script makes an assumption and does something like call exit? Why do you need to have the globals in your current package? Can you redesign to not need that? If you have any control over what globals you need from the script, why not eval each script into its own package, then in your second loop alias just the globals that you need from the scripts?
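
    A rough sketch of that last idea (the Sandbox naming scheme and the @TARGETS variable are invented for illustration):

    my ($n, %pkg_for);
    for my $script (@scripts) {
        my $pkg = "Sandbox" . ++$n;                            # one private package per script
        $pkg_for{$script} = $pkg;
        my $code = do { local (@ARGV, $/) = ($script); <> };   # slurp the file
        eval "package $pkg;\n$code";                           # its globals land in $pkg
        die "$script: $@" if $@;
    }

    no strict 'refs';
    for my $script (@scripts) {
        my $pkg = $pkg_for{$script};
        # Alias only the globals you actually need back into my_package.
        *{"my_package::TARGETS"} = \@{"${pkg}::TARGETS"};
        # ...script-specific processing for $script...
    }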

    The exit question is a serious gotcha. The other suggestions are alternate ways of solving your likely problem. I have found valid uses for all of the other suggestions. The one with each getting their own package was for configuration loading - there are very few other places where I would even consider that. The idea of scooping things into modules is my first recommendation.

    Now there could be things I don't know that make all of these ideas bad. But my gut feeling is that you don't need to use your current design, and that using it you will run into further issues - it just smells like a bad hack.

      Okay, I guess I will bore you with the details... :-)

      This is for Cons, the project which takes up most of my Open Source time. Cons is a software construction utility (i.e. a substitute for make). Cons configuration files (think Makefiles) are actually Perl scripts that make function calls to list what's to be built, establish dependencies, etc. A Cons config file (a.k.a. a Perl script) can arrange for other config files to be included in the build. Sharing Perl variables between config files is controlled by explicit declarations (the Cons Export and Import functions). We enforce the sharing rules via the namespace manipulation routines in the root of this thread. (Actually, it used to be through an even more intrusive mechanism...)
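
      (Not the actual Cons code, just the mechanism in miniature: given the %vars hash of saved stashes from the code at the top of the thread, honoring an explicit "export $CFLAGS from script A to script B" declaration can be a single glob copy. The script choice and the variable name below are invented.)

      my ($exporting_script, $importing_script) = @scripts[0, 1];             # invented choice
      $vars{$importing_script}{'CFLAGS'} = $vars{$exporting_script}{'CFLAGS'}; # copy one saved glob

      Because stash entries are globs, the copied entry still refers to the same underlying variables, so once each namespace is restored both scripts see the same $CFLAGS.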

      Why not scoop out the body of the scripts into modules, turn the scripts into wrappers, and then call the module in your code?

      We don't control the scripts; they're "input files" written by the user.

      What if a script makes an assumption and does something like call exit?

      We exit, just like they said. If that's not what they want in their "Makefile," they fix it and re-run Cons.

      Why do you need to have the globals in your current package? Can you redesign to not need that?

      Changing the package name breaks some existing configurations out there that make assumptions about the well-known package name in which they're executed. (Cons has been around almost four years, and we take backwards compatibility pretty seriously--maybe more seriously than we should sometimes...)

      If you have any control over what globals you need from the script...

      We don't control them, but don't use the globals directly for anything anyway. Anything they want the build engine to do is communicated to Cons by function calls from the defined API. We have to restore the variables, though, because they can arrange for a file to be built by executing a snippet of Perl code, and they need the symbols in force at the point they supplied the snippet.
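
      To make that concrete, the restore has to happen before the snippet runs so that its unqualified globals resolve to the right symbols ($snippet below stands in for the user-supplied code, which isn't shown in this thread):

      NameSpace::restore('my_package', $vars{$script});
      {
          package my_package;
          eval $snippet;    # unqualified variables resolve in my_package again
          die "error in build snippet from $script: $@" if $@;
      }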

      ...why not eval each script into its own package, then in your second loop alias just the globals that you need from the scripts?

      Actually, we just experimented with this, executing each script in its own package with an internally-generated name. This was a lot cleaner, but broke some existing configurations as described.

      Now there could be things I don't know that make all of these ideas bad.

      Not bad, just not necessarily applicable to our unusual requirements. Believe me, I appreciate the warnings and additional food for thought; we've already considered many (though not all) of these issues before arriving at the current solution.
