PerlMonks
[Solved] Perl with Swig C module not releasing memory
by megaframe (Novice)
on May 10, 2013 at 17:48 UTC ( [id://1032999]=perlquestion )
megaframe has asked for the wisdom of the Perl Monks concerning the following question:

I have some C code that's malloc'ing a bunch of memory. Using SWIG I created some simple Perl bindings and call out to this function from Perl. I also have a memory-free block in the C code. When I compile and run just the C code under valgrind, there are no issues: everything gets created and freed. But when I use it from Perl and call the destructor, it never actually releases the memory. To test this I wrote up a simple test case of what I was doing.

C code, lib.h
C code, lib.c
swig binding, lib.i
Perl code
Compiling/Running
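The code blocks above were not preserved in this copy of the post, but the shape of the C side can be sketched as a minimal allocate/destroy pair. This is a hypothetical reconstruction, not the author's actual lib.c; the names make_data and destroy_data are illustrative.

```c
/* Minimal sketch of the kind of C library described above (hypothetical
 * names, not the original code): one function that mallocs a chunk of
 * memory and one that frees it, as a SWIG-wrapped constructor/destructor
 * pair would. */
#include <stdlib.h>
#include <string.h>

typedef struct {
    size_t len;
    char  *buf;
} Data;

/* Allocate a Data holding `len` zeroed bytes; returns NULL on failure. */
Data *make_data(size_t len) {
    Data *d = malloc(sizeof *d);
    if (!d) return NULL;
    d->buf = malloc(len);
    if (!d->buf) {
        free(d);
        return NULL;
    }
    memset(d->buf, 0, len);
    d->len = len;
    return d;
}

/* Free everything make_data allocated; safe to call with NULL. */
void destroy_data(Data *d) {
    if (!d) return;
    free(d->buf);
    free(d);
}
```

Under valgrind a paired make_data/destroy_data run reports no leaks, matching what the post describes for the pure-C test.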
Is there something else I should be doing to get the .so to release this memory?

UPDATE: It seems the free is working, but it's releasing the memory back to the Perl process and not to the system. I'm consuming some 40G of memory in the tool I was using, across threads. The safest solution was to fork() and exit when the operation was complete; the only other option would have been to reuse the memory in another thread. If I'm wrong and it should go back to the system, let me know.

UPDATE 2: Found my issue. The original code I'm wrapping uses a single malloc for a giant data series (multiple gigabytes). I can't do that, because I don't know the size of the data until I'm done loading it, so I used a linked list to dynamically allocate the space as the data is fed in from Perl. These mallocs are small, so glibc doesn't release them back to the system even when several gigabytes have been freed. To fix the issue I added "#include &lt;malloc.h&gt;" to the header and a malloc_trim(0) call to the destroy function.
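The fix in UPDATE 2 can be sketched as follows. This is a minimal illustration, not the author's wrapped code: many small linked-list allocations are freed, but glibc's allocator keeps the freed chunks in its heap arena rather than returning pages to the kernel; the glibc-specific malloc_trim(0) call asks it to release the trimmed pages back to the OS.

```c
/* Sketch of the UPDATE 2 fix (illustrative names): a linked list of many
 * small mallocs, freed in the destroy path, followed by malloc_trim(0)
 * so glibc returns the now-free heap pages to the kernel.  malloc_trim
 * is glibc-specific (declared in <malloc.h>). */
#include <malloc.h>   /* malloc_trim() - glibc extension */
#include <stdlib.h>

struct node {
    struct node *next;
    char payload[64];   /* small allocation, stays in the main arena */
};

static struct node *head = NULL;

/* Append one small chunk, as data is fed in piece by piece. */
void push_chunk(void) {
    struct node *n = malloc(sizeof *n);
    if (!n) return;
    n->next = head;
    head = n;
}

/* Destroy path: free every node, then trim the heap. */
void destroy_all(void) {
    while (head) {
        struct node *n = head;
        head = n->next;
        free(n);
    }
    /* Without this, glibc keeps the freed small chunks cached in the
     * process heap; with it, free pages go back to the system. */
    malloc_trim(0);
}
```

A single multi-gigabyte malloc would typically be served by mmap and returned to the OS on free automatically; it is the many-small-allocations pattern that needs the explicit trim (or tuning via mallopt's trim threshold).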
Back to Seekers of Perl Wisdom