Re^2: csv file download size limits?

by ted.byers (Scribe)
on Nov 07, 2013 at 21:44 UTC ( #1061631=note )


in reply to Re: csv file download size limits?
in thread csv file download size limits?

Thanks, Alex.

Alas, though plausible at first, it turns out that wasn't it. I modified the script in question to fetch the data one row at a time, and the same limit is hit. The file actually received is precisely the same as the one I get when I use selectall_arrayref.
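For reference, the row-at-a-time version was essentially this (DSN, credentials, query, and column names are placeholders, not the real ones, and a real script would use Text::CSV rather than a bare join for quoting):

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Placeholder connection details and query.
    my $dbh = DBI->connect('DBI:mysql:database=testdb;host=localhost',
                           'user', 'password', { RaiseError => 1 });

    my $sth = $dbh->prepare('SELECT col_a, col_b, col_c FROM some_table');
    $sth->execute;

    print "Content-Type: text/csv\r\n";
    print "Content-Disposition: attachment; filename=export.csv\r\n\r\n";

    # One row at a time instead of selectall_arrayref, so only a
    # single row is ever held in memory -- yet the same limit is
    # hit either way.
    while (my $row = $sth->fetchrow_arrayref) {
        print join(',', @$row), "\n";
    }

    $sth->finish;
    $dbh->disconnect;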

In retrospect, if apache is able to use all free memory on the server, there should be no such limit. After all, there are several GB of RAM available, and even a ridiculously large recordset from the DB would amount to only a few MB. We're talking about a 64 bit Windows server here, with 64 bit builds of both Apache's web server and MySQL. If any of those server products had trouble with even a few MB of data, there would be a serious problem with it. And the perl we're using (v5.16) is a 64 bit build, so there ought not be a problem there either. But the limit is so consistent that it must be a constraint imposed somewhere: where do I look?

Thanks again.

Ted


Re^3: csv file download size limits?
by hippo (Curate) on Nov 07, 2013 at 22:23 UTC
    but where do I look?

    In the apache error log. If there's nothing there, start peppering your script with debugging statements until you track it down.
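    For example, wrap your output loop with a byte counter and warn() at intervals, since warn output lands in the apache error log (the interval here is arbitrary, and $sth is the statement handle from your existing script):

        # Instrument the fetch loop: warn() writes to the apache error log.
        my ($rows, $bytes) = (0, 0);
        while (my $row = $sth->fetchrow_arrayref) {
            my $line = join(',', @$row) . "\n";
            $bytes += length $line;
            print $line;
            warn "emitted $rows rows, $bytes bytes\n" unless ++$rows % 1000;
        }
        warn "done: $rows rows, $bytes bytes in total\n";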

    Also look in the apache access log which will show the total size of the response body. It may be the case that the web server is sending the data fine, but something else in the network (or on the Windows box itself) is cutting it off.

      Thanks

      Using debug statements is something I usually do, especially with CGI programs, where I have not found a viable way to step through the code. Alas, this time they did not help.

      I also usually look at the logs, both error and access, and this time they didn't tell me anything I didn't already know. The Perl script simply stopped writing out the data once it hit a limit of about 152 kB.

      But I have a solution, as it were. I used MySQL to output the data directly to a file, then used file IO to slurp that file (in this use case, typically about 1 MB) and write it out. This works fine, which makes the cause of the original problem even more puzzling: clearly Apache, as it is presently configured, is capable of sending far more data than it was.
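      The workaround looks roughly like this (query, columns, and output path are placeholders, not the real ones; note that INTO OUTFILE writes on the database host, requires the FILE privilege, and fails if the file already exists):

          # Have MySQL write the CSV itself; placeholder query and path.
          my $outfile = '/tmp/export.csv';
          $dbh->do(qq{
              SELECT col_a, col_b, col_c
              INTO OUTFILE '$outfile'
              FIELDS TERMINATED BY ',' LINES TERMINATED BY '\\n'
              FROM some_table
          });

          # Slurp the whole file and send it as the response body.
          open my $fh, '<', $outfile or die "open $outfile: $!";
          my $csv = do { local $/; <$fh> };
          close $fh;

          print "Content-Type: text/csv\r\n";
          print "Content-Disposition: attachment; filename=export.csv\r\n\r\n";
          print $csv;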

      Thanks again

      Ted
