I don't really see where this is a better solution than what's been suggested. There's a lot of overhead involved in copying, not to mention the extra code you'll need to clean up whatever directory you're temporarily storing the files in. And if the files are left in temp for any length of time, you're running the risk of unauthorized users being able to access them.
In the situation I have at work, we have to verify that users are allowed to access files. We went through several ideas before settling on a CGI that takes advantage of Content-Disposition.
- Idea #1 - Put the file storage into web space.
  - Requires some nasty messing with .htaccess files and the like.
  - Inherent security issues of putting things in web space that don't belong there.
- Idea #2 - Copy each file into a temp area as it's requested and create a link on the fly.
  - Aforementioned security issues.
  - Our files are large. The I/O overhead of copying multi-megabyte files all over the file system wasn't very attractive.
  - Collision. We have files with the same name spread out over different projects. Sure, we could add numbers and such to the copied file names, but our users have told us that's not a good solution.
- Idea #3 - Use a CGI process to access files in the filesystem and send them directly to the browser.
  - Quick, painless, and practically free of the problems mentioned above.

Guildenstern
Negaterd character class uber alles!
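To make Idea #3 concrete, here's a minimal sketch of that kind of handler in Python. The `user_may_access` check, the paths, and the chunk size are all hypothetical placeholders — the point is just that the file stays outside web space, access is verified per request, and Content-Disposition supplies the original filename so collisions never matter:

```python
import os

def build_headers(path):
    """Build response headers that force a download under the file's own name."""
    name = os.path.basename(path)
    return {
        "Content-Type": "application/octet-stream",
        "Content-Disposition": f'attachment; filename="{name}"',
        "Content-Length": str(os.path.getsize(path)),
    }

def serve_file(path, out, user_may_access, chunk_size=64 * 1024):
    """Verify access, then stream the file to `out` in chunks so
    multi-megabyte files are never copied or held in memory whole."""
    if not user_may_access(path):          # hypothetical per-user check
        raise PermissionError(path)
    headers = build_headers(path)
    for key, value in headers.items():
        out.write(f"{key}: {value}\r\n".encode())
    out.write(b"\r\n")
    with open(path, "rb") as f:
        while data := f.read(chunk_size):
            out.write(data)
    return headers
```

Because the browser takes the filename from the Content-Disposition header rather than from a URL, two projects can each have a `report.dat` without any renaming, and nothing is ever left lying around in a temp directory.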