in reply to Re: large perl module in thread large perl module
Well, this is supposed to be a 'lookup on demand' system
providing lookup services to other servers.
Since it's doing this all the time, there is no point in loading the data over and over again from external storage.
The data used to live in data files (many of them, actually), but that kept the disk too busy... It seems these data files were not being cached at the OS level.
For every 10k lookups, Perl had to read approximately 2 GB
of files. 10k lookups (executed in a loop) were taking
35 s. Now, with the .pm loaded, it takes 10 s.
The server has 8 GB of RAM and is using (according to top)
only 3 GB.
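Roughly, that setup amounts to building the lookup table inside the module, so it is loaded once and every request is a plain in-memory hash access. A minimal sketch of that approach; the module name, key format and data are made up for illustration:

    # LookupData.pm -- hypothetical module holding the preloaded table
    package LookupData;
    use strict;
    use warnings;

    # The big hash is built when the module is loaded, so each lookup
    # afterwards is an in-memory hash access with no disk I/O.
    our %TABLE = (
        'key0001' => 'value for key0001',
        'key0002' => 'value for key0002',
        # ... many generated entries ...
    );

    sub lookup {
        my ($key) = @_;
        return $TABLE{$key};
    }

    1;

The trade-off is the one raised in the replies below: the data is compiled into the process, and any change to it means regenerating and reloading the module.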
Re^3: large perl module
by CountZero (Bishop) on Mar 05, 2010 at 07:11 UTC
"providing lookup services to other servers" - a database server does exactly that, and much more efficiently than a "converted" web server. Indexing, caching, ... are all highly optimised in a good database server.
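For the record, a lookup service backed by a database can be only a few lines of DBI. This is just a sketch under assumed names: a SQLite file lookup.db with a table lookups indexed on lookup_key, neither of which comes from the thread:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # Connect once at server start-up; the database does the indexing and caching.
    my $dbh = DBI->connect('dbi:SQLite:dbname=lookup.db', '', '',
                           { RaiseError => 1, AutoCommit => 1 });

    # Prepare once, execute per request.
    my $sth = $dbh->prepare('SELECT value FROM lookups WHERE lookup_key = ?');

    sub lookup {
        my ($key) = @_;
        $sth->execute($key);
        my ($value) = $sth->fetchrow_array;
        $sth->finish;
        return $value;
    }

    print lookup('key0001') // 'not found', "\n";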
CountZero
"A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James
Re^3: large perl module
by jrsimmon (Hermit) on Mar 04, 2010 at 22:26 UTC
Moving the data to a data file doesn't mean loading it every time a request comes in. As pointed out below, having all the data in your .pm at best means you have it duplicated (loaded into memory in the .pm, and again in whatever variable holds it). Worse, any changes to the data require changes to your source code, which is a design no-no. A sketch of the alternative follows.
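One way to get both (data outside the source, but loaded only once) is to keep the table in a serialised file and pull it into memory on first use. A minimal sketch, assuming a Storable file whose path is made up here; JSON, a DBM file or a real database would work just as well:

    # LookupStore.pm -- hypothetical module; the data lives outside the source
    package LookupStore;
    use strict;
    use warnings;
    use Storable qw(retrieve);

    my $table;    # filled on the first lookup, reused afterwards

    sub lookup {
        my ($key) = @_;
        # Read the frozen hash from disk once; later calls only touch memory.
        $table //= retrieve('/path/to/lookup_table.stor');
        return $table->{$key};
    }

    1;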