PerlMonks

Re: large perl module

by jrsimmon (Hermit)
on Mar 04, 2010 at 21:31 UTC ( [id://826821] )


in reply to large perl module

While I suspect your claim that "this is the fastest way" wouldn't hold up under heavy scrutiny, if you have enough system resources and it's working, what's the concern?

You could always move the data to a data file (better design), but unless you put it in a database or some other "lookup on demand" solution, you'll still end up with the large hash and it won't really make a difference.
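As a minimal sketch of such a "lookup on demand" alternative, a hash can be tied to an on-disk DBM file so keys are fetched as needed instead of held entirely in memory (SDBM_File ships with core Perl; the file path here is hypothetical):

```perl
use strict;
use warnings;
use Fcntl;
use SDBM_File;

# Tie a hash to an on-disk DBM file; lookups go to disk (and the OS
# page cache) on demand rather than to one giant in-memory hash.
tie my %lookup, 'SDBM_File', '/tmp/lookup_demo', O_RDWR | O_CREAT, 0666
    or die "Cannot tie DBM file: $!";

$lookup{answer} = '42';        # written through to disk
print $lookup{answer}, "\n";   # fetched back on demand

untie %lookup;
```

SDBM has small per-record size limits; for large values, DB_File or a real database server is the more usual choice.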

Replies are listed 'Best First'.
Re^2: large perl module
by minek (Novice) on Mar 04, 2010 at 22:09 UTC
    Well, this is supposed to be a 'lookup on demand' system providing lookup services to other servers.
    Since it handles lookups continuously, there is no point in loading the data over and over again from external storage.
    The data was previously in data files (many of them, actually), but that kept the disk too busy. It seems these data files were not being cached at the OS level.
    For every 10k lookups, perl had to read roughly 2GB of files; 10k lookups executed in a loop took 35s. Now, with the .pm loaded, they take 10s.
    The server has 8GB of RAM and is using (according to top) only 3GB.
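The setup described above amounts to something like the following sketch: a module that carries the whole table as a package-level hash, built once at compile time (the module name and keys here are hypothetical stand-ins):

```perl
package BigLookup;    # hypothetical module name
use strict;
use warnings;

# The entire table lives in the source and is built once, when the
# module is use'd; every subsequent lookup hits RAM only.
our %DATA = (
    'host-a' => '10.0.0.1',
    'host-b' => '10.0.0.2',
    # ... millions of pairs in the real module
);

sub lookup { return $DATA{ $_[0] } }

1;
```

This is fast at lookup time, but the table can only be updated by editing and reloading the source, which is the design concern raised below.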
      providing lookup services to other servers
      A database server does exactly that, and much more efficiently than a "converted" web server. Indexing, caching, and the like are all highly optimised in a good database server.

      CountZero

      "A program should be light and agile, its subroutines connected like a string of pearls. The spirit and intent of the program should be retained throughout. There should be neither too little nor too much, neither needless loops nor useless variables, neither lack of structure nor overwhelming rigidity." - The Tao of Programming, 4.1 - Geoffrey James

      Moving the data to a data file doesn't mean loading it every time a request comes in. As pointed out below, having all the data in your .pm at best means you have it duplicated (loaded into memory in the .pm, and in whatever variable holds it). Worse, any changes to the data require changes to your source code, which is a design no-no.
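One common way to get both properties, data outside the source and only a single load per server lifetime, is to serialize the hash with the core Storable module and read it back once at startup (a sketch; the file path and keys are hypothetical):

```perl
use strict;
use warnings;
use Storable qw(store retrieve);

# One-time export step, rerun whenever the data changes:
my %table = ( 'host-a' => '10.0.0.1' );   # stand-in for the real data
store \%table, '/tmp/lookup.storable';

# At server start: load once, then reuse this single in-memory copy
# for every request. No duplicate copy lives in the source code.
my $data = retrieve('/tmp/lookup.storable');
print $data->{'host-a'}, "\n";
```

Because the data file is separate from the code, updating the table is a data deploy, not a source change.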
