I thought it was as easy as:
local *private::package::;
# time passes
$saved{$some_name} = \*private::package::;
but I was wrong.
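For anyone following along, here is a minimal sketch of the part that *does* work: local-izing a single glob saves and restores the variable for the enclosing scope. It's local-izing the whole stash glob (as in `local *private::package::;` above) that doesn't behave as hoped.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# local on one glob installs a fresh glob for the scope and
# restores the original at scope exit.
our $x = 'before';
{
    local *main::x;        # fresh glob for this scope
    $x = 'inside';
    print "$x\n";          # prints 'inside'
}
print "$x\n";              # prints 'before' -- the old glob is back
```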
As for the underlying question though, your description of
what you are doing sets off warning flags for me. Why not
scoop out the body of the scripts into modules, turn the
scripts into wrappers, and then call the module in your
code? What if a script makes an assumption and does
something like call exit? Why do you need to have the
globals in your current package? Can you redesign so you
don't need that? If you have any control over what globals you
need from the script, why not eval each script into its
own package, then in your second loop alias just the
globals that you need from the scripts?
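A rough sketch of what I mean (the package and variable names here are made up for illustration): eval each script into its own generated package, then in a second loop alias only the globals you care about into the current package.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @scripts = (
    'our $target = "prog1";',
    'our $target = "prog2";',
);

# First loop: run each script in its own generated package.
my $seq = 0;
my @pkgs;
for my $code (@scripts) {
    my $pkg = 'Script' . ++$seq;          # internally generated name
    eval "package $pkg; $code; 1" or die $@;
    push @pkgs, $pkg;
}

# Second loop: alias just the one global we need from each script.
{
    no strict 'refs';
    no warnings 'once';
    *{"main::${_}_target"} = \${"${_}::target"} for @pkgs;
    print "$main::Script1_target $main::Script2_target\n";  # prog1 prog2
}
```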
The exit question is a serious gotcha. The other
suggestions are alternative ways of solving your likely
problem. I have found valid uses for all of the other
suggestions. The one with each getting their own package
was for configuration loading - there are very few other
places where I would even consider that. The idea of
scooping things into modules is my first recommendation.
Now there could be things I don't know that make all of
these ideas bad. But my gut feeling is that you don't
need to use your current design, and that using it you will
run into further issues - it just smells like a bad hack.
Okay, I guess I will bore you with the details... :-)
This is for
Cons,
the project which takes up most of my Open Source time.
Cons is a software construction utility
(i.e. a substitute for make).
Cons configuration files (think Makefiles)
are actually Perl scripts
that make function calls
to list what's to be built,
establish dependencies, etc.
A Cons config file (a.k.a. a Perl script)
can arrange for other config files to be
included in the build.
Sharing Perl variables between
config files is controlled
by explicit declarations
(the Cons Export and Import functions).
We enforce the sharing rules
via the namespace manipulation
routines in the root of this thread.
(Actually, it used to be
through an even more intrusive mechanism...)
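To give a feel for the idea (this is an illustrative sketch, not Cons's actual implementation, and the names are invented): explicit sharing can be enforced by aliasing only the declared names between packages, so anything not exported stays invisible.

```perl
#!/usr/bin/perl
use strict;
use warnings;
no warnings 'once';

# Alias only the declared names from one package into another,
# roughly in the spirit of an Export/Import pair.
sub share_vars {
    my ($from, $to, @names) = @_;
    no strict 'refs';
    for my $name (@names) {
        *{"${to}::$name"} = *{"${from}::$name"};   # alias the whole glob
    }
}

$Parent::CC     = 'gcc';
$Parent::SECRET = 'hidden';

share_vars('Parent', 'Child', 'CC');   # only CC is declared shared

print "$Child::CC\n";                                        # gcc
print defined $Child::SECRET ? "leaked\n" : "not shared\n";  # not shared
```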
Why not scoop out the body of the scripts into modules,
turn the scripts into wrappers,
and then call the module in your code?
We don't control the scripts,
they're "input files" written by the user.
What if a script makes an assumption
and does something like call exit?
We exit, just like they said.
If that's not what they want in their "Makefile,"
they fix it and re-run Cons.
Why do you need to have the globals
in your current package,
can you redesign not to need that?
Changing the package name breaks
some existing configurations out there
that make assumptions about the well-known
package name in which they're executed.
(Cons has been around almost four years,
and we take backwards compatibility
pretty seriously--maybe more seriously than we should
sometimes...)
If you have any control over what globals
you need from the script...
We don't control them,
but don't use the globals directly for anything anyway.
Anything they want the build engine to do is
communicated to Cons
by function calls from the defined API.
We have to restore the variables, though,
because they can arrange for a file to be built
by executing a snippet of Perl code,
and they need the symbols in force
at the point they supplied the snippet.
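The shape of that requirement, sketched loosely (again, invented names, not Cons's real code): remember which package a user's snippet came from, and eval it later in that package so the globals that were in force when it was supplied still resolve.

```perl
#!/usr/bin/perl
use strict;
use warnings;

my %snippets;

sub register_snippet {
    my ($pkg, $code) = @_;
    push @{ $snippets{$pkg} }, $code;
}

# A "config file" runs in its own package and hands us a snippet:
eval 'package CfgA; our $greeting = "hello from A"; 1' or die $@;
register_snippet('CfgA', '$ran = "snippet saw: $greeting"; print "$ran\n";');

# Much later, run each snippet with its package's symbols in force:
for my $pkg (sort keys %snippets) {
    for my $code (@{ $snippets{$pkg} }) {
        eval "package $pkg; no strict 'vars'; $code; 1" or die $@;
    }
}
# prints: snippet saw: hello from A
```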
...why not eval each script into its own package,
then in your second loop alias just
the globals that you need from the scripts?
Actually, we just experimented with this,
executing each script in its own
package with an internally-generated name.
This was a lot cleaner,
but broke some existing configurations as described.
Now there could be things I don't know
that make all of these ideas bad.
Not bad, just not necessarily applicable to our
unusual requirements.
Believe me, I appreciate the warnings and additional
food for thought; we've already considered many
(though not all) of these
issues before arriving at the current solution.