PerlMonks |
Re^4: Multithreading leading to Out of Memory error by joemaniaci (Sexton)
on Jun 07, 2013 at 21:30 UTC ( id://1037761 )
Well, it's nice to see I wasn't going completely down the wrong path. I already removed Switch, and ParseExcel is actually the last thing done, when all the threads have already been destroyed. I usually only come across 2-3 .xls files, so that part is single-threaded. When it comes to DBI and DBD::ODBC, I tried...
inside the methods that need it, instead of...
...at the very top, but the behavior was very erratic. Ironically enough, I discovered this bug while getting ready to work on feeding the queue as soon as I found files with the correct extensions, rather than collecting them all first, but I wanted to resolve this before attempting that. I also thought about creating a single subroutine to handle each file type, but the issue is that one file type is always 1 KB, so its thread would be done in seconds and then sit idle afterwards, while other files are gargantuan. The goal was that small/medium files (which are the majority) could be handled while the bigger files were being processed over time. This is how I grab all the files I need, and it is only done once at the very beginning, before threads are created.
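A minimal sketch of the pattern described above, assuming a threads-enabled perl: filenames are collected once up front and fed into a Thread::Queue, and DBI is loaded with a deferred `require` inside the worker sub rather than a top-level `use DBI;`, so the parent interpreter never clones the module into every thread. The `*.dat` extension, the thread count of 4, and the commented-out DSN are all invented for illustration, not taken from the poster's code.

```perl
#!/usr/bin/perl
# Sketch only, not the poster's actual code.
use strict;
use warnings;
use threads;
use Thread::Queue;

my $queue = Thread::Queue->new;

sub worker {
    while (defined(my $file = $queue->dequeue)) {
        # Deferred, per-thread load instead of a top-level "use DBI;"
        # that every thread would inherit a copy of.
        eval {
            require DBI;
            # my $dbh = DBI->connect('dbi:ODBC:MyDSN', $user, $pass);  # hypothetical DSN
            1;
        } or warn "DBI unavailable, skipping DB step for $file: $@";
        print "processed $file\n";
    }
}

# Collect the work once, up front, before any threads exist.
$queue->enqueue($_) for glob '*.dat';   # '*.dat' is a made-up extension
$queue->end;                            # signal that no more items are coming

my @threads = map { threads->create(\&worker) } 1 .. 4;
$_->join for @threads;
```

Feeding the queue as files are discovered (the change the poster was planning) would simply mean moving the `enqueue`/`end` calls into the file-scanning loop after the workers are started; `dequeue` blocks until an item arrives or `end` is called.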
Outside of that, it's pretty much all file checks: making sure a line has the right number of items, bounds checks, and so on and so forth. Nothing complicated, mostly simple regex stuff.
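The kind of per-line validation described might look like the sketch below; the comma delimiter, the expected field count of 5, and the numeric/bounds rules on the first two fields are all invented for illustration.

```perl
use strict;
use warnings;

# Returns 1 if a line passes the (hypothetical) sanity checks, 0 otherwise.
sub line_ok {
    my ($line) = @_;
    chomp $line;
    my @fields = split /,/, $line, -1;        # -1 keeps trailing empty fields
    return 0 unless @fields == 5;             # right number of items
    return 0 unless $fields[0] =~ /^\d+$/;    # first field must be numeric
    return 0 unless $fields[1] =~ /^\d+$/
                 && $fields[1] <= 100;        # bounds check on second field
    return 1;
}

print line_ok("42,55,a,b,c")  ? "ok\n" : "bad\n";   # ok
print line_ok("42,999,a,b,c") ? "ok\n" : "bad\n";   # bad: out of bounds
print line_ok("42,55,a,b")    ? "ok\n" : "bad\n";   # bad: only 4 fields
```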
In Section
Seekers of Perl Wisdom