PerlMonks  

Re: Re: Question of safe data passing...

by Rhandom (Curate)
on Apr 27, 2001 at 15:17 UTC ( [id://76136]=note: print w/replies, xml ) Need Help??


in reply to Re: Question of safe data passing...
in thread Question of safe data passing...

You're on to something here...

Instead of just storing the DBI object, make a DBI wrapper object that, on every method call, checks whether $0 still matches the copy it stored away at construction time. If it doesn't match, it dies. That way you wouldn't be able to spoof the name of the script you're running under.
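A minimal sketch of that idea (the package and method names here are illustrative, not from the thread): the wrapper remembers $0 when it is built, and an AUTOLOAD sub refuses to delegate any method to the wrapped handle once $0 has changed.

```perl
package Safe::DBIWrapper;
# Hypothetical wrapper sketch: remember $0 at construction, and refuse
# to delegate any method call if $0 no longer matches that copy.
use strict;
use warnings;

sub new {
    my ($class, $dbh) = @_;
    # Stash the wrapped handle and a private copy of the script name.
    my $self = { dbh => $dbh, script => $0 };
    return bless $self, $class;
}

our $AUTOLOAD;
sub AUTOLOAD {
    my $self = shift;
    (my $method = $AUTOLOAD) =~ s/.*:://;
    return if $method eq 'DESTROY';    # don't delegate destruction

    # The check Rhandom describes: die if $0 has been changed.
    die "script name changed - refusing to run '$method'\n"
        unless $0 eq $self->{script};

    # Otherwise pass the call straight through to the real handle.
    return $self->{dbh}->$method(@_);
}

1;
```

Usage would look like `my $w = Safe::DBIWrapper->new($dbh); $w->prepare(...)` — every call goes through AUTOLOAD and gets the $0 check for free.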

Er... uh.. will Storable cache a DBI object and allow you to reconnect at a later point?

my @a=qw(random brilliant braindead); print $a[rand(@a)];

Replies are listed 'Best First'.
Re: Re: Re: Question of safe data passing...
by lindex (Friar) on Apr 27, 2001 at 15:25 UTC

    Ahh, you can't use $0, because then you could just exec the DSN wrapper with the name of a valid script and BAM, you have the "frozen" DBI object.

    The DSN wrapper must find the name of its caller on its own, and it must get that information from data the user cannot corrupt. So the idea of passing the DSN wrapper a pid and having the wrapper check /proc to make sure the pid matches an allowable script name is also out of the question.
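    lindex's objection can be seen directly in Perl: $0 is an ordinary writable variable, so any code in the process can forge it before the wrapper ever looks. A minimal demonstration (the "trusted" name below is made up):

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    # $0 is a plain writable variable -- nothing stops a caller from
    # forging it before handing control to the wrapper.
    print "before: $0\n";
    $0 = 'trusted_script.pl';    # spoofed name (illustrative)
    print "after:  $0\n";

    # Likewise, exec's indirect-object form lets a caller pick argv[0]
    # for the program it launches, e.g.:
    #   exec { '/usr/bin/perl' } 'trusted_script.pl', 'wrapper.pl';
    # so the exec'd wrapper would see $0 as "trusted_script.pl".
    ```

    Which is exactly why a $0 (or argv[0]) check cannot serve as an authentication boundary here.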




    lindex
    /****************************/ jason@gost.net, wh@ckz.org http://jason.gost.net /*****************************/
