in reply to Using large result sets from a database
Tough problem.
You did not say which database you are using. Postgres lets you put a LIMIT and an OFFSET on the query, so you can reissue the query and get only the next set of results (assuming nothing has changed underneath you). I believe MySQL has a similar feature. You can do it in Oracle too, but it gets very ugly. So if you want to restrict yourself to standard SQL, you are out of luck on the DB side.
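As a rough sketch of what that looks like from DBI (the table and column names here are made up for illustration, and this is Postgres-style LIMIT/OFFSET syntax; note you want an ORDER BY so the windows are stable between queries):

    # Fetch one "page" of rows using LIMIT/OFFSET.
    use DBI;

    my ( $user, $pass ) = ( 'someuser', 'somepass' );
    my $dbh = DBI->connect( "dbi:Pg:dbname=mydb", $user, $pass,
                            { RaiseError => 1 } );

    my $page_size = 50;
    my $page      = 10;                    # zero-based page number
    my $offset    = $page * $page_size;

    my $sth = $dbh->prepare(
        "SELECT col1, col2 FROM mytable ORDER BY col1 LIMIT ? OFFSET ?"
    );
    $sth->execute( $page_size, $offset );

    while ( my ( $col1, $col2 ) = $sth->fetchrow_array() ) {
        # Only the requested window of rows ever comes back from the database.
    }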
On the server side you have more control, but it is still pretty ugly. Are you using any particular framework (Mason, mod_perl, ...) on your server side? If you have persistent server-side state, you can fetch the whole dataset the first time, bung it into your state, and then paginate it for the user as they ask for it. This avoids problems where the data changes underneath you, since you just page through a static view, but your data can become stale. It is cheaper if you expect your users to look at most of the data; if they are only going to look at a page or two, it gets expensive. Plus, you then have to clean up the state sometime, and deciding when is tricky.
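A rough sketch of that stash-and-paginate idea, assuming some kind of persistent per-session store (here just a hash keyed on a session id, standing in for whatever your framework actually gives you under mod_perl or the like):

    # Fetch once, then serve pages out of server-side state.
    # %cache is a stand-in for a real persistent store; names are illustrative.
    my %cache;

    sub get_page {
        my ( $session_id, $dbh, $page, $page_size ) = @_;

        # First request: run the full query once and stash every row.
        unless ( $cache{$session_id} ) {
            my $sth = $dbh->prepare(
                "SELECT col1, col2 FROM mytable ORDER BY col1"
            );
            $sth->execute();
            $cache{$session_id} = $sth->fetchall_arrayref();
        }

        # Later requests: slice the requested page out of the stash.
        my $rows  = $cache{$session_id};
        my $start = $page * $page_size;
        my $end   = $start + $page_size - 1;
        $end = $#{$rows} if $end > $#{$rows};
        return $start > $#{$rows} ? [] : [ @{$rows}[ $start .. $end ] ];
    }

    # Eventually the stash has to be thrown away; picking when is the hard part.
    sub expire_session { delete $cache{ $_[0] } }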
What I do in this case is to reissue the query each time and loop through the results, only keeping the ones I care about:

    my $first_row   = 500;
    my $window_size = 50;
    my $location    = 1;
    while ( my ( $col1, $col2 ) = $sth->fetchrow_array() ) {
        next if $location < $first_row;
        last if $location >= $first_row + $window_size;
        # Do whatever I need to do with the data
    }
    continue {
        $location++;
    }

This gets pricey if they plan on looking at all of the results, but I have hundreds of thousands to millions of results to show (in some cases), so I can't slurp them over to my side, and in most cases they only want to see a page or two.
-ben
Replies are listed 'Best First'.
Re: Re: Using large result sets from a database by MrCromeDome (Deacon) on May 09, 2001 at 19:05 UTC
by hackmare (Pilgrim) on Jun 07, 2001 at 20:44 UTC