PerlMonks
Data Security in Perl

by Anonymous Monk
on Sep 04, 2002 at 10:34 UTC ( [id://195030] )

Anonymous Monk has asked for the wisdom of the Perl Monks concerning the following question:

Hi, I'm creating a system that allows a webmaster to create users and databases, and then assign which users can access which databases. There are other tools too, but they basically involve the scripts creating directories, new files, etc. Users can also write to the databases when given access. I want to know the best layout for this.
Currently my file structure is,
/cgi-bin/scripts.pl
/databases/dbase1/temp.dat
/databases/dbase2/temp.dat etc.
/users/A.dat
/users/B.dat (each letter stores all users beginning with it along with their passwords and email addresses)

The problem I seem to have is...
1. I don't want anyone being able to just grab the files that store all my users with their details and passwords. It seems that unless I have /users set to 0777 I can't add new files to it via my script. It is the same with /databases: I want to be able to create new databases inside AND let users write to the databases, but I don't want someone just finding /databases/dbase1/temp.dat and taking the data. I was thinking .htaccess, but then do I need to give each user their own?
I've read around and I can't seem to find anywhere that suggests a good way of doing this.
Would really appreciate some help from you clever bunch of people. Thanks, Adam
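(Editor's note: the 0777 problem in point 1 usually goes away if the script itself creates the subdirectories, since they then belong to whatever user the script runs as. A minimal sketch, assuming the layout above; the name check and $root parameter are illustrative:)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Create a new database directory from within the CGI script itself.
# Because the web-server user ("nobody" on many shared hosts) owns
# what it creates, the parent need not be world-writable (0777).
sub create_database {
    my ($root, $name) = @_;
    die "bad database name\n" unless $name =~ /^\w+$/;   # reject ../ etc.
    my $dir = "$root/$name";
    mkdir $dir, 0700 or die "mkdir $dir: $!\n";
    chmod 0700, $dir or die "chmod $dir: $!\n";          # umask-proof
    return $dir;
}
```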

Replies are listed 'Best First'.
Re: Data Security in Perl
by zigdon (Deacon) on Sep 04, 2002 at 10:49 UTC
    Are you worried about people accessing the data from a web browser, or about a local user accessing it via the filesystem? .htaccess can help with blocking browser access to directories, but unless users need to access their files directly (and not through the database or scripts), you could just move the whole directory tree outside the docroot for the server. Then it will not be accessible at all via a browser.

    If it's filesystem access you're worried about, just chown the whole tree to apache, and chmod it 700 - then apache (and any scripts it runs) can read and write to the tree, but no local users (unless they hack apache) will have access to it.
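    (Editor's note: Dan's chmod-the-tree advice can be scripted with File::Find; a hedged sketch — the chown step needs root, so it is left commented out, and $apache_uid/$apache_gid are placeholders:)

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Tighten an entire tree: 0700 on directories (the execute bit is
# needed to descend into them), 0600 on plain files, so only the
# owning user (ideally the web-server user) can read or write.
sub lock_down_tree {
    my ($root) = @_;
    find(sub {
        my $mode = -d $_ ? 0700 : 0600;
        chmod $mode, $_ or warn "chmod $File::Find::name: $!\n";
        # chown $apache_uid, $apache_gid, $_;   # requires root
    }, $root);
}
```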

    -- Dan

      Thanks for that, Dan. I'm worried about the data being accessed from the browser. Basically, I am a simple user on an ISP (Supanames). I have cgi-bin access and can run scripts off their server. They tell me that any files I make from a script are created as the owner "nobody". Is this their fault? That's a great idea with chmod 700, but if the owner is nobody I can't do anything, and apparently (I've got to ask the ISP more) I can't do a chown command unless I am the root user. I'll try the 700 thing =)) Thanks, Adam
        Yes, this is "their fault". It's a poor design on a shared host since, as you're finding, it causes users to have to do strange things WRT permissions. It can also make it a royal pain to delete those files, since they will normally be chmod'ed 644 and chown'ed to nobody. Then you have to run a CGI script to chmod them to 666 so your user account can delete them (since CGI runs as nobody, it is the only user-level account that has write access to these files). If time/money/whatever aren't an issue, you might shop for a new ISP who runs Apache CGI as the actual user instead of nobody. This protects your data both from external browser viewers as well as from other users on your shared host.

        The suggestion about storing these files outside the docroot for your CGI/web directory is also a good one, otherwise anyone who can guess the URL can see these files unless you take pains to change the permissions to something like 600.

        FWIW, I would be just as paranoid about access by other users of the shared host as I would by the web at large. Unless you know all those people, you have no idea what they'll do with the data in your directories.
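        (Editor's note: the delete-workaround described above — chmod the nobody-owned files to 666 from a CGI script so your shell account can remove them — comes down to a few lines; the glob path is a placeholder:)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# One-off cleanup helper: loosen files that the "nobody" CGI user
# created, so the real shell account can delete them afterwards.
# Returns the files it managed to chmod.
sub loosen_for_delete {
    my (@files) = @_;
    my @done;
    for my $f (@files) {
        if (chmod 0666, $f) { push @done, $f }
        else                { warn "chmod $f: $!\n" }
    }
    return @done;
}

# Run as a CGI script it might be driven like this (placeholder path):
# print "Content-type: text/plain\n\n";
# print "$_\n" for loosen_for_delete(glob '/path/to/databases/*/*.dat');
```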

Re: Data Security in Perl
by gryphon (Abbot) on Sep 04, 2002 at 15:46 UTC

    Greetings,

    For learning more about CGI security, check out Ovid's "Web Programming Using Perl" Course. It's quite good. Lesson Three is a good overview of security. One thing that Ovid points out is that you can't really trust Apache (via .htaccess files) to handle all the security for you.

    For what you're talking about, though, it seems that you've got a series of pseudo database files that are read by your scripts. You just want these files to be inaccessible to the public via a URL, right? In such cases, I typically build two directories from whatever root directory I'm in. Assuming a fairly default Apache conf, /var/www/html is the root dir for the Web server files. I'll usually mkdir /var/www/hidden or something similar, then put all my pseudo database files in there. The scripts in /var/www/html or /var/www/perl can get to the "hidden" dir easily, but unless Apache is misconfigured, it'll be really difficult for an end-user to get there without reasonably significant hacking.
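    (Editor's note: in code, gryphon's layout might look like this; the /var/www/hidden path is his example, and the file-name check is an illustrative extra:)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Read a pseudo-database file from a directory that sits beside the
# docroot rather than under it, so no URL ever maps to the data.
sub read_record {
    my ($data_dir, $file) = @_;          # e.g. '/var/www/hidden'
    die "bad file name\n"
        if $file !~ /^[\w.-]+$/ or $file =~ /\.\./;   # block ../ paths
    open my $fh, '<', "$data_dir/$file" or die "open $file: $!\n";
    local $/;                            # slurp the whole file
    my $data = <$fh>;
    close $fh;
    return $data;
}
```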

    gryphon
    code('Perl') || die;

    Update: Hey, cool! This is my 100th post.

Re: Data Security in Perl
by Anonymous Monk on Sep 04, 2002 at 11:28 UTC
    Your webmaster is probably going to use htaccess with Basic Access in setting up accounts, making it trivial for anyone with a sniffer to get the password and shortly afterwards an account on the box. Worrying about file protection mechanisms after that is like closing the barn door after the horses have escaped.

    But if you want to do this useless thing, then use the sudo utility (which needs to be configured) to allow the modification script to run as root. Then you can have the trappings of security, if none of the substance. (Figuring out how to set up https will go a long way towards fixing that.)

      You're right. The main problem is making sure nobody can access the database and user data from their browser by just typing domain.com/databases/data.dat and saving it, BUT at the same time allowing logged-in users to edit it AND the webmaster to create new files and delete from it. Thanks, Adam
Re: Data Security in Perl
by Jeppe (Monk) on Sep 04, 2002 at 15:15 UTC
    I believe you can put a .htaccess file that rejects all browser requests in /databases and /users . The /cgi-bin scripts run in the system environment and are not restricted by the .htaccess files once they are executing.

    The .htaccess file will look something like

    Order deny,allow
    Deny from all
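(Editor's note: since on the poster's host everything has to be created by the CGI script itself, the .htaccess file can be written into each data directory by the script; a sketch using the directives Jeppe quotes. Note this only works if the host's Apache config has AllowOverride enabled for those directories:)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Write a deny-all .htaccess into a data directory so Apache refuses
# direct browser requests for anything inside it.
sub protect_dir {
    my ($dir) = @_;
    open my $fh, '>', "$dir/.htaccess"
        or die "open $dir/.htaccess: $!\n";
    print $fh "Order deny,allow\nDeny from all\n";
    close $fh or die "close: $!\n";
}
```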
CLARIFICATION
by Anonymous Monk on Sep 04, 2002 at 11:52 UTC
    Just to clarify for any other replies: I'm just a user on an ISP called www.supanames.co.uk. All I have is a cgi-bin and access to run Perl scripts. Supanames give me no help whatsoever when I ask questions. My scripts are being created with the plan to release them as a small piece of software that other users can download. That's why I'm concerned about how I'm storing all this data in a standard environment where a user only has a cgi-bin and Perl access themselves.

Node Type: perlquestion [id://195030]
Approved by rob_au