safe file content collector over the net
by iaw4 (Monk) on Jun 27, 2013 at 23:33 UTC
iaw4 has asked for the wisdom of the Perl Monks concerning the following question:
dear perl experts:
I want to write one perl script that collects some data on a client machine and another perl script that receives it on my server. basically, clients run a benchmark and submit their results. the first perl script is not trusted (clients can try to be malicious); the second is trusted (it sits on my web server and only I can change it). it needs to take some precautions, such as truncating what it stores to avoid overflows. more or less, the server script records each submission under a timestamp in a temporary directory, and if there is a timestamp collision, it just delays recording it. I can analyze the contents of the submitted files later.
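the server-side storing logic described above (truncate, write under a timestamped name, delay on collision) can be sketched in a few lines of perl. this is a minimal sketch, not a full cgi script; the spool directory, the size limit, and the function name are all made up. O_EXCL makes file creation atomic, so a timestamp collision simply fails and we retry after a delay:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(O_WRONLY O_CREAT O_EXCL);

# hypothetical limits -- adjust to taste
my $MAX_BYTES = 64 * 1024;          # truncate anything larger than this
my $SPOOL_DIR = '/tmp/submissions'; # temporary directory for submissions

# store $content in $dir under a timestamped name; returns the filename.
# O_EXCL makes creation atomic, so a name collision fails cleanly and we
# just wait a second for the clock to move on, then retry.
sub store_submission {
    my ($dir, $content) = @_;
    $content = substr($content, 0, $MAX_BYTES);   # guard against overflows
    for my $try (1 .. 5) {
        my $name = sprintf '%s/%d.txt', $dir, time();
        if (sysopen my $fh, $name, O_WRONLY | O_CREAT | O_EXCL) {
            print {$fh} $content;
            close $fh;
            return $name;
        }
        sleep 1;    # timestamp collision: delay and try again
    }
    die "could not find a free slot in $dir\n";
}
```

a cgi wrapper would read the POST body (up to CONTENT_LENGTH bytes) from STDIN and hand it to this function.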
first, I tried to do this with wget on the client side and a cgi script on the server side. it was painful and did not work. I do not want to use curl, because it is not installed by default on ubuntu linux systems.
but thinking about it now, maybe the web is a stupid way to go about it to begin with. maybe I should just use sockets? or ftp? or something else? or a perl-ish solution from cpan? (I don't think I want to run a full ftp daemon for this; too many security issues in the long run.)
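one perl-ish option that sidesteps the curl objection: HTTP::Tiny ships with the perl core since 5.14, so the client needs no external tools at all and the existing web-server/cgi approach can stay. a minimal sketch, assuming a made-up endpoint URL and plain-text submissions:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Tiny;    # in the perl core since 5.14 -- no curl or wget needed

# hypothetical endpoint -- replace with your real cgi url
my $DEFAULT_URL = 'http://example.com/cgi-bin/collect.cgi';

# POST the benchmark results as a plain-text request body.
# returns HTTP::Tiny's response hash ({success, status, reason, content, ...}).
sub submit_results {
    my ($url, $data) = @_;
    return HTTP::Tiny->new(timeout => 30)->post(
        $url,
        {
            content => $data,
            headers => { 'Content-Type' => 'text/plain' },
        },
    );
}

# usage: slurp the results file given on the command line and send it
if (@ARGV) {
    my $file = shift @ARGV;
    open my $fh, '<', $file or die "cannot read $file: $!\n";
    my $data = do { local $/; <$fh> };
    close $fh;
    my $res = submit_results($DEFAULT_URL, $data);
    die "upload failed: $res->{status} $res->{reason}\n" unless $res->{success};
    print "submitted OK\n";
}
```

HTTP::Tiny never throws on network errors; it returns a response hash with success false (status 599 for internal failures), so the client can retry or report cleanly.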
so, sockets? other common solutions? recommendations appreciated.