in reply to Re: Tracking Memory Leaks in thread Tracking Memory Leaks
Heh, well, we do all those things, but they should be controlled.
- Yes, in various places, but they should all be limited in one way or another (either they go out of scope at various points or they only have a limited number of elements in them).
- We do have one module that uses circular references, but using Weakref, that shouldn't be a problem.
- Yes, scoping on the project is very tight.
- This is the thing that I'm worried about. We're constantly loading new classes, dynamically created from database information. Now, theoretically, we have a finite number of classes, so the memory usage from loading these things should itself be finite (and that limit should be hit rather quickly). However, I'm not positive this is the case. Anyone know much about the finer points of this?
- No.
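On the circular-reference point above: here is a minimal sketch of how a weak back-reference (via Scalar::Util's weaken, which WeakRef-style modules wrap) lets a reference cycle be collected. The parent/child structure is invented for illustration:

```perl
use strict;
use warnings;
use Scalar::Util qw(weaken);

# Two structures that point at each other: a reference cycle that
# Perl's reference counting cannot reclaim on its own.
my $parent = { name => 'parent' };
my $child  = { name => 'child', parent => $parent };
$parent->{child} = $child;

# Weaken the back-reference so it no longer counts toward $parent's
# refcount; when the last strong reference to $parent goes away, the
# weak reference in $child->{parent} is automatically set to undef.
weaken( $child->{parent} );
```

With the back-reference weakened, letting `$parent` go out of scope (or undef'ing it) frees the parent structure immediately instead of leaking the cycle.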
Thanks for the help.
Re: Re: Re: Tracking Memory Leaks
by dragonchild (Archbishop) on Aug 15, 2001 at 02:19 UTC
How does the interpreter determine whether a lexical is no longer "used"? Do you have to explicitly set it to undef? That doesn't seem like much of a "feature". So my memory usage won't really top out until all my functions have been called and all my variables used, is that correct?
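For what it's worth, Perl does keep a sub's lexical storage allocated between calls so it can be reused; one way to release a large lexical's storage early is an explicit undef. A sketch (the sub and data are invented for illustration):

```perl
use strict;
use warnings;

sub process_big_data {
    my @lines = (1 .. 1_000_000);   # stand-in for a large data set

    my $count = scalar @lines;

    # Release the array's storage now, rather than leaving it held in
    # the sub's pad for reuse on the next call.
    undef @lines;

    return $count;
}
```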
I didn't know that about AUTOLOAD(). What do you mean by 'allocate additional memory every time'? Does that mean that every time I effectively call AUTOLOAD() (in my case, once for each undeclared function, as I use AUTOLOAD() to then declare the function), the interpreter allocates a chunk of memory, or just that every time I load a module with an AUTOLOAD() in it, it will allocate a larger chunk of memory than it would for a regular module?
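A common shape for the "declare the function on first use" pattern described above is to have AUTOLOAD install the generated sub into the symbol table, so the allocation happens once per name and later calls bypass AUTOLOAD entirely. A sketch (the Dynamic package and its subs are invented for illustration):

```perl
package Dynamic;
use strict;
use warnings;

our $AUTOLOAD;
our %generated;   # count how many times we actually build each sub

sub AUTOLOAD {
    my $name = $AUTOLOAD;
    $name =~ s/.*:://;              # strip the package qualifier
    return if $name eq 'DESTROY';   # don't generate a destructor

    $generated{$name}++;

    # Build the sub once and install it in the symbol table; every
    # later call of Dynamic::foo() goes straight to the installed sub.
    no strict 'refs';
    *{"Dynamic::$name"} = sub { return "called $name" };
    goto &{"Dynamic::$name"};
}

package main;
Dynamic::foo();   # first call: AUTOLOAD fires and installs foo
Dynamic::foo();   # second call: the installed sub runs directly
```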
I know I miss out on compile-time optimizations for the system, but do those optimizations involve the re-use of resources, as opposed to the allocation of resources (which I know they involve)? I would expect that resource re-use would be a function of the running system, not of the compile-time optimizations. If Perl allocates memory for eval'd statements and AUTOLOAD()ed subroutines and then doesn't re-use it, that sounds like a pretty serious issue.
I don't know what exactly dragonchild was referring to, except that if you create methods or classes on the fly, that's going to consume memory. No more so than if you'd put the same definitions in a .pm file and use'd it, though. I don't think AUTOLOAD or eval("string") themselves leak memory. And I don't know what "compile time optimizations" he was referring to; eval is as much compile time as what you get when you load your main program or use/require'd modules.
Re: Re: Re: Tracking Memory Leaks
by bikeNomad (Priest) on Aug 15, 2001 at 20:47 UTC
We're constantly loading new classes, dynamically created from database information. Now, theoretically, we have a finite number of classes, so the memory usage from loading these things should itself be finite (and that limit should be hit rather quickly). However, I'm not positive this is the case. Anyone know much about the finer points of this?
Are you cleaning up the symbol tables after you're done using these new classes, or are they being re-used? We found in Class::Prototyped that each new class takes up around 1.5-2K of memory; more (of course) with methods. You can clean up the symbol table when you're done with code like this (assumes $package does not contain '::'):

    no strict 'refs';
    foreach my $key ( keys %{"$package\::"} ) {
        delete ${"$package\::"}{$key};
    }
    # this only works because we're not a multi-level package:
    delete( $main::{"$package\::"} );
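Putting the cleanup above into context, here is a self-contained sketch: a throwaway single-level class is built at runtime (roughly as the database-driven classes would be; TempClass is an invented name), used, and then its stash is emptied and removed:

```perl
use strict;
use warnings;

# Build a throwaway single-level class at runtime.
my $package = 'TempClass';
eval "package $package; sub greet { 'hello' } 1;" or die $@;

my $greeting = TempClass::greet();   # the class works while it exists

# When we're done with it, empty and remove its stash so the memory
# can be reclaimed instead of accumulating per generated class.
{
    no strict 'refs';
    foreach my $key ( keys %{"$package\::"} ) {
        delete ${"$package\::"}{$key};
    }
    delete $main::{"$package\::"};
}
```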