http://www.perlmonks.org?node_id=1061620


in reply to csv file download size limits?

From a quick look at your code, my first guess is that the Apache process is running out of memory, because of the way you're getting data from the database. The selectall_arrayref call is going to suck up lots of memory, since it pulls the entire result set into a Perl data structure at once.

A better method would be to get the data, and then print it, just one row at a time.
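Something along these lines, using prepare/execute and fetchrow_arrayref -- the connection details, table, and column names here are just placeholders, not your actual schema:

    use strict;
    use warnings;
    use DBI;

    # Placeholder connection details -- substitute your own.
    my ( $user, $pass ) = ( 'user', 'secret' );
    my $dbh = DBI->connect( 'dbi:mysql:database=mydb', $user, $pass,
        { RaiseError => 1 } );

    my $sth = $dbh->prepare('SELECT col1, col2, col3 FROM some_table');
    $sth->execute;

    print "Content-Type: text/csv\n\n";

    # Fetch and print one row at a time, so only a single row is ever
    # held in memory (unlike selectall_arrayref, which holds them all).
    while ( my $row = $sth->fetchrow_arrayref ) {
        # Naive CSV output; a real script would use Text::CSV to handle
        # quoting and embedded commas.
        print join( ',', @$row ), "\n";
    }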

Alex / talexb / Toronto

Thanks PJ. We owe you so much. Groklaw -- RIP -- 2003 to 2013.

Re^2: csv file download size limits?
by ted.byers (Monk) on Nov 07, 2013 at 21:44 UTC

    Thanks Alex

    Alas, though plausible at first, it turns out that that wasn't it. I modified the script in question to get the data one row at a time, and the same limit is hit. The file actually received is precisely the same as the one I get when I use selectall_arrayref.

    In retrospect, if Apache is able to use all free memory on the server, there should be no such limit. After all, there are several GB of RAM available, and even a ridiculously large recordset from the DB would only be a few MB. We're talking about a 64-bit Windows server here, with a 64-bit build of Apache's web server and a 64-bit build of MySQL. If any of those server products has trouble with even a few MB of data, then there is a serious problem with that product. And the Perl we're using (v5.16) is a 64-bit build, so there ought not be a problem there either. But the limit is so consistent that it must be imposed somewhere: but where do I look?

    Thanks again.

    Ted

      but where do I look?

      In the Apache error log. If there's nothing there, start peppering your script with debugging statements until you track it down.

      Also look in the Apache access log, which will show the total size of the response body. It may be that the web server is sending the data fine, but something else in the network (or on the Windows box itself) is cutting it off.

        Thanks

        Using debug statements is something I usually do, especially with CGI programs, where I have not found a viable way to step through the code. Alas, this time, they did not help.

        I also usually look at the logs, both error and access, and this time they didn't tell me anything I didn't already know. The Perl script simply stopped writing out the data once it hit a limit of about 152 kB.

        But I have a solution, as it were. I used MySQL to output the data directly to a file. I then used IO to slurp the data from that file (in this use case, typically about 1 MB) and wrote it out. This works fine, which makes the cause of the original problem even more puzzling, as clearly Apache, as it is presently configured, is capable of sending a lot more data than it did before.
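
        Roughly, the workaround looks like the sketch below; the query, output path, and column names are illustrative placeholders rather than the exact code from my script.

            use strict;
            use warnings;
            use DBI;

            # Placeholder connection details -- not the real ones.
            my ( $user, $pass ) = ( 'user', 'secret' );
            my $dbh = DBI->connect( 'dbi:mysql:database=mydb', $user, $pass,
                { RaiseError => 1 } );

            # Have MySQL write the result set straight to a CSV file on the
            # server (requires the FILE privilege, and the file must not
            # already exist).
            my $outfile = 'C:/temp/report.csv';
            $dbh->do(
                "SELECT col1, col2, col3
                   INTO OUTFILE '$outfile'
                   FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'
                   FROM some_table"
            );

            # Slurp the whole file and send it as the response body.
            open my $fh, '<', $outfile or die "Can't open $outfile: $!";
            my $csv = do { local $/; <$fh> };
            close $fh;

            print "Content-Type: text/csv\n\n";
            print $csv;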

        Thanks again

        Ted