PerlMonks  

RE: Re: dealing with attachments (again)

by BBQ (Curate)
on May 25, 2000 at 19:22 UTC [id://14850]



in reply to Re: dealing with attachments (again)
in thread dealing with attachments (again)

I second that. You would improve performance a great deal just by finding a clever way to keep the attachments directly on the filesystem. I assume you have a user primary key and a message primary key, correct? If so, you could try a directory tree like:
    /users/
        /user1/
            /attachmenta
            /attachmentb
            /attachmentc
        /user2/
            /attachmenta
            /attachmentb
            /attachmentc
Where userx would be the user pk and attachmentx would be the attachment pk (doh!). On the downside, you would probably want to develop a tool or cron job to keep track of orphaned attachments, old files, and so forth...
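The pk-to-path scheme above can be sketched in a few lines of Perl. This is a minimal illustration, not code from the thread: the base directory, function names, and the `user`/`attachment` prefixes are all assumptions matching the example tree.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Path qw(make_path);

# Map a user pk and an attachment pk to a filesystem path
# (hypothetical naming, mirroring the /users/user1/attachmenta example).
sub attachment_path {
    my ($base, $user_pk, $att_pk) = @_;
    return "$base/user$user_pk/attachment$att_pk";
}

# Write attachment data to its place in the tree, creating the
# per-user directory on first use.
sub save_attachment {
    my ($base, $user_pk, $att_pk, $data) = @_;
    my $path = attachment_path($base, $user_pk, $att_pk);
    my ($dir) = $path =~ m{^(.*)/[^/]+$};
    make_path($dir) unless -d $dir;
    open my $fh, '>', $path or die "open $path: $!";
    binmode $fh;
    print {$fh} $data;
    close $fh;
    return $path;
}
```

Because the path is derived purely from the two primary keys, the database row only needs to store the keys, never the blob itself.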

#!/home/bbq/bin/perl
# Trust no1!

Replies are listed 'Best First'.
RE: RE: Re: dealing with attachments (again)
by Punto (Scribe) on May 26, 2000 at 05:10 UTC
    Well, my main concerns were:

    - I need a fast and easy way to figure out the amount of disk space a user is using.
    - There is a security problem with saving the attachments in a public_html directory.
    - (this is the most important) I'd like to have all the database in one place.
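On the first concern, per-user usage is cheap to compute under the directory-per-user layout: walk the user's directory and sum file sizes. A minimal sketch, assuming the tree from the parent post (the function name and layout are illustrative):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Find;

# Total bytes consumed by one user's attachment directory,
# assuming a /users/userN/ layout as in the parent post.
sub user_disk_usage {
    my ($user_dir) = @_;
    my $bytes = 0;
    # find() calls the sub once per entry with cwd set to the
    # entry's directory; -s gives the size of plain files.
    find(sub { $bytes += -s $_ if -f $_ }, $user_dir);
    return $bytes;
}
```

A cron job could cache these totals if walking the tree on every request proves too slow.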

    So is there no way to send the content of a _big_ file to MySQL without using a lot of RAM?

    Thanks!
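On the RAM question: the file can be read in fixed-size chunks rather than slurped whole. Below is a sketch of a chunked reader; the DBI fragment in the trailing comment shows one common (untested here) approach of appending each chunk to a BLOB column with `CONCAT`, so memory use stays at one chunk. Table and column names are hypothetical.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Read a file in fixed-size chunks, invoking $cb once per chunk,
# so the whole attachment never sits in memory at once.
sub stream_file {
    my ($path, $chunk_size, $cb) = @_;
    open my $fh, '<', $path or die "open $path: $!";
    binmode $fh;
    while ((my $n = read $fh, my $buf, $chunk_size)) {
        $cb->($buf);
    }
    close $fh;
}

# With DBI against MySQL (sketch only, schema is hypothetical), each
# chunk could be appended to a BLOB column:
#   my $sth = $dbh->prepare(
#       'UPDATE attachments SET data = CONCAT(data, ?) WHERE id = ?');
#   stream_file($path, 64 * 1024, sub { $sth->execute($_[0], $id) });
```

The trade-off is many small UPDATEs instead of one big INSERT, which is slower but bounds memory.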
