I used your search engine and nothing since 1999 included all four keywords in either the text or title.
I used the following to create and send a CSV file from a specialized CGI script.
This code works perfectly as long as the CSV file is less than approximately 152 kB! Alas, if the CSV file is larger than about 152 kB, only the first 152 kB is sent and the rest is discarded, so the user sees a truncated file. NB: I used "-attachment=>'data.csv'" so that the client browser would have a name to assign to the received file, as the data comes from a SQL query rather than a real file.
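The post's actual script is not shown, so the sketch below is only a reconstruction of the row-by-row approach described: headers first, then one CSV line per database row. Here sample_rows() is a hypothetical stand-in for the real SQL query, and the headers are spelled out by hand where the real code used CGI.pm's header() with -attachment=>'data.csv'.

```perl
#!/usr/bin/perl
# Hedged reconstruction -- the post's real code is not shown.
# sample_rows() stands in for iterating $sth->fetchrow_array on a
# real DBI statement handle.
use strict;
use warnings;

sub sample_rows {
    # Placeholder data in place of the real SQL result set.
    return ( [ 'id', 'name' ], [ 1, 'alpha' ], [ 2, 'beta' ] );
}

# Quote a field only when it contains a comma, quote, or newline.
sub csv_line {
    my @fields;
    for my $f (@_) {
        my $v = defined $f ? "$f" : '';
        if ( $v =~ /[",\n]/ ) {
            $v =~ s/"/""/g;    # double any embedded quotes
            $v = qq{"$v"};
        }
        push @fields, $v;
    }
    return join( ',', @fields ) . "\n";
}

# Headers first, then stream one row at a time; with no Content-Length,
# the end of the response is marked by the connection closing.
print "Content-Type: text/csv\r\n";
print qq{Content-Disposition: attachment; filename="data.csv"\r\n\r\n};
print csv_line(@$_) for sample_rows();
```

The point of streaming row by row is that the whole result set never has to fit in memory at once.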
I tried adding "-Content_length=>$len," to the header section. Instead of printing the rows as I iterated through them, I put the content into one really long string, $s, and used length($s) to get a value for the content length. Again, it all works perfectly as long as the CSV file is less than about 152 kB, but in this case the file is never sent at all if it is larger than that.
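The Content-Length variant just described might look like the sketch below. The helper name csv_response() and the sample rows are illustrative, not the post's real code; note also that length() counts characters, which equals bytes only for single-byte data.

```perl
#!/usr/bin/perl
# Sketch of the second attempt: accumulate the whole body in one string,
# measure it, and emit a Content-Length header. All names here are
# hypothetical -- the post's real code is not shown.
use strict;
use warnings;

sub csv_response {
    my @rows = @_;

    # Build the entire body in memory instead of streaming it.
    my $body = '';
    $body .= join( ',', @$_ ) . "\n" for @rows;

    # For plain ASCII data, character count equals byte count.
    my $len = length $body;

    return "Content-Type: text/csv\r\n"
         . "Content-Length: $len\r\n"
         . qq{Content-Disposition: attachment; filename="data.csv"\r\n\r\n}
         . $body;
}

print csv_response( [ 'id', 'name' ], [ 1, 'alpha' ] );
```

The trade-off is memory: the full result set is held in $body before a single byte reaches the client.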
Where is the 152 kB limit coming from, and how can I override it, if that is possible? Or do I have to actually create the file somewhere in the document root directory tree and issue a redirect or forward? If I have to resort to the latter, how do I ensure that the file is deleted once the client has it, and that no one else can see the file?
I am almost at the point of either placing a limit on the amount of data that can be requested, or putting the data into a zip archive, with the best available compression, and sending the archive instead of the CSV file. That smells like a bad kludge, though, especially since the user would no longer have the option of simply opening the CSV file in his spreadsheet software once his browser has it.
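If the zip route were taken anyway, one core-module way to do it (IO::Compress::Zip has shipped with Perl since 5.10) is sketched below; the CSV content and filenames are sample values, not the real query's output.

```perl
#!/usr/bin/perl
# Sketch of the zip-archive fallback using the core IO::Compress::Zip
# module. $csv holds sample data standing in for the real query result.
use strict;
use warnings;
use IO::Compress::Zip qw(zip $ZipError);

my $csv = "id,name\n1,alpha\n2,beta\n";

# Compress the in-memory CSV into an in-memory zip: data.csv is the
# member name the user will see, Level 9 is maximum compression.
my $zipped;
zip \$csv => \$zipped, Name => 'data.csv', Level => 9
    or die "zip failed: $ZipError";

print "Content-Type: application/zip\r\n";
print qq{Content-Disposition: attachment; filename="data.zip"\r\n\r\n};
binmode STDOUT;    # the archive is binary data
print $zipped;
```

Everything stays in memory, so no temporary file has to be created, secured, or cleaned up afterwards.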