mark4 has asked for the wisdom of the Perl Monks concerning the following question:
I have a program that has been crashing perl (5.8.8) for about 3 years now.
I FINALLY got around to debugging it.
I looked and looked and I can't see anything wrong with my code. Originally I thought this was a problem with running out of memory, because it will only crash if the database it is working with is large. The system I am running this on is a Windows Server 2008 Enterprise box with 48GB installed. See the line(s) below marked "# IT CRASHES SOMEWHERE IN HERE....." I also show a small output from the run.
if ($sort_dirs) {
    my @tmp_1;
    my @tmp_2;
    my @tmp_3;
    my @tmp_4;
    my @tmp_5;
    my @tmp_6;
    my $void;
    print "Sorting directories...\n";
    for ($i = 0; $i <= $last_dir; $i++) {
        #printf ("sort_dirs_debug1: %5d fp: %5d, lp: %5d %s\n", $i, $first_pointer[$i], $last_pointer[$i], $the_dir[$i]);
        #define sort order
        #printf ("%s\n", $the_dir[$i]);
        #printf ("%s\n", uenc_path_file($the_dir[$i]));
        $tmp_1[$i] = join ('"|'
            , $deleted_file[$i]
            , uenc_path_file($the_dir[$i])
            , $the_dir[$i]
            , $first_pointer[$i]
            , $last_pointer[$i]
            , $has_subdir[$i]
            , $has_iden[$i]
            );
    }
    @tmp_2 = sort { lc($a) cmp lc($b) } @tmp_1;
    print "Putting sorted list back...\n";
    for ($i = 0; $i <= $last_dir; $i++) {
        (
            $deleted_file[$i]
            , $void
            , $the_dir[$i]
            , $first_pointer[$i]
            , $last_pointer[$i]
            , $has_subdir[$i]
            , $has_iden[$i]
        ) = split (/\"\|/, $tmp_2[$i], 7);
        #printf ("sort_dirs_debug2: %5d fp: %5d, lp: %5d %s\n", $i, $first_pointer[$i], $last_pointer[$i], $the_dir[$i]);
    }
    # This just makes the file list sequential with the sorted directories.
    print "Starting full file sort...\n";
    $i = 0;
    for ($dir_pointer = 0; $dir_pointer <= $last_dir; $dir_pointer++) {
        print "i:$i, dir:$dir_pointer $first_pointer[$dir_pointer] $last_pointer[$dir_pointer]\n";
        $start = $first_pointer[$dir_pointer];
        $first_pointer[$dir_pointer] = $i;
        for ($file_pointer = $start; $file_pointer <= $last_pointer[$dir_pointer]; $file_pointer++) {
            $tmp_1[$i] = $uenc_file_name[$file_pointer]; # IT CRASHES SOMEWHERE IN HERE.....
            $tmp_2[$i] = $dir_and_file_name[$file_pointer];
            $tmp_3[$i] = $date_time_mod[$file_pointer];
            $tmp_4[$i] = $file_size[$file_pointer];
            $tmp_5[$i] = $encpd[$file_pointer];
            $tmp_6[$i] = $deleted_file[$file_pointer];
            $tmp_7[$i] = $has_iden[$file_pointer];
            $i++;
        }
        $last_pointer[$dir_pointer] = $i - 1;
    }
    print "Putting full file sort back...\n";
    for ($i = 0; $i <= $last_file; $i++) {
        $uenc_file_name[$i]    = $tmp_1[$i];
        $dir_and_file_name[$i] = $tmp_2[$i];
        $date_time_mod[$i]     = $tmp_3[$i];
        $file_size[$i]         = $tmp_4[$i];
        $encpd[$i]             = $tmp_5[$i];
        $deleted_file[$i]      = $tmp_6[$i];
        $has_iden[$i]          = $tmp_7[$i];
        $finf_valid[$i]        = 0;
    }
}
# for ($i = 0; $i <= $last_dir; $i++) {
#     printf ("debug4: %5d fp: %5d, lp: %5d %s\n", $i, $first_pointer[$i], $last_pointer[$i], $the_dir[$i]);
# }
print "Exiting get files...\n";
}
program output:
......
i:1539990, dir:1111 1768215 1768216
i:1539992, dir:1112 1422703 1422702
i:1539992, dir:1113 1768217 1768217
i:1539993, dir:1114 1768218 1768218
i:1539994, dir:1115 1768219 1768219
i:1539995, dir:1116 1768220 1768220
i:1539996, dir:1117 1142520 1142535
i:1540012, dir:1118 1140894 1140925
i:1540044, dir:1119 1140926 1140940
i:1540059, dir:1120 1142370 1142399
i:1540089, dir:1121 1142400 1142519
i:1540209, dir:1122 1358128 1358218
i:1540300, dir:1123 1358064 1358127
i:1540364, dir:1124 1347195 1358063
<then crash>
Problem signature:
Problem Event Name: APPCRASH
Application Name: perl.exe
Application Version: 5.8.8.820
Application Timestamp: 45b6a114
Fault Module Name: perl58.dll
Fault Module Version: 5.8.8.820
Fault Module Timestamp: 45b6a113
Exception Code: c0000005
Exception Offset: 00085bc1
OS Version: 6.1.7600.2.0.0.274.10
Locale ID: 1033
Additional Information 1: f538
Additional Information 2: f538d60ae007f756c6454955fe93e7d0
Additional Information 3: 24d2
Additional Information 4: 24d2d8331230585cafa9b0f2f2190f63
Re: I can crash perl
by ww (Archbishop) on Mar 29, 2015 at 20:27 UTC
and "crashes" means (more precisely, please) what ?
- blue screen of death
- 'puter hangs forever and ever, world without end
- some crash, error or warning data (beyond what you (rightly: ++) included)
or,
- in the words of a long-gone, less-than-revered penitent, "shat blue flames*1"
... and, yes, this is a serious question, despite what's intended to be a humorous tone.
Update: *1 Monsignor MidLifeXis points out that "less than revered" might, depending on your reading, be less appropriate than "IT comedy gold ;-)" and gave me the link, My computer broked down after perl install, (which also reminds me that I misquoted the original by interpolating "blue" into the phrase.)
@deleted_file ----- bit flag: 1 or 0
@uenc_file_name --- string, just filename.ext
@dir_and_file_name- full path to file,
                    like "c:\user\mark\adir\anotherdir\file.ext"
@date_time_mod ---- 32 bit number (or 64?)
@file_size -------- 32 bit number (or 64?)
@encpd ------------ bit flag: 1 or 0
@has_iden --------- bit flag: 1 or 0
For each directory it stores:
@deleted_file[$i] --- bit 1 or 0
uenc_path_file ------ string, same length as @the_dir
@the_dir[$i] -------- string like "c:\users\mark\adir\"
@first_pointer[$i] -- pointer into the above list for the
                      location of the first file in this dir
@last_pointer[$i] --- pointer .... of the last file
@has_subdir[$i] ----- bit 1 or 0
@has_iden[$i] ------- bit 1 or 0
In this instance, the number of files (entries) in the array is: 1776655
The number of directories is: 2105
So OVERESTIMATING at say 256 bytes per entry: 1,776,655 * 256 = 454,823,680 bytes, and for the directories, 2,105 * 256 = 538,880 bytes.
So that brings us to ~500 Megabytes. Now times 4, because I use 4 different sets of variables to store the database in, so the GRAND TOTAL IS: ~2GB. (Wow, coincidence? Is there a cut-off of 2GB virtual memory with Windows?) As I said, there is 48GB installed. I don't know how efficient perl is at using memory.
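Doing that arithmetic in code (a sketch using the counts quoted above; the 256 bytes/entry figure is my own overestimate, and the real per-scalar overhead of a 64-bit perl can be substantial too). Worth noting: if this is a 32-bit perl build, the process is normally limited to 2GB of user address space on Windows no matter how much RAM is installed, so bumping into that limit is plausible.

```perl
use strict;
use warnings;

# Back-of-the-envelope version of the estimate above (counts from this thread;
# 256 bytes/entry is an assumed overestimate).
my $files = 1_776_655;
my $dirs  = 2_105;
my $bytes = 256;

my $per_copy = ($files + $dirs) * $bytes;   # one copy of the database
my $total    = 4 * $per_copy;               # four parallel copies during the sort

printf "per copy: ~%.0f MB, total: ~%.2f GB\n", $per_copy / 2**20, $total / 2**30;
# prints: per copy: ~434 MB, total: ~1.70 GB
```

So the peak is right around the 2GB mark that a 32-bit process can address.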
Re: I can crash perl
by frozenwithjoy (Priest) on Mar 29, 2015 at 19:48 UTC
You say that this happens when the database is large. How large is large? Are you slurping in a bunch of data and your OS kills the script because it is using too much memory?
Re: I can crash perl
by ikegami (Patriarch) on Mar 29, 2015 at 22:47 UTC
5.8.8 is ancient, and it's not even the latest 5.8. What is it you hope to accomplish?
I can update perl.
I am of the mindset "If it's not broken, don't update it". Now it seems broken, so I am willing to update.
What ActivePerl release is most stable? Any suggestions?
I have ActivePerl-5.16.3.1604 downloaded and installed on one puter. It went "ok" (a few problems) but I am willing to install this on the server now. Do you recommend another version?
What am I trying to accomplish? 1. Understand why the program stops working (I like to root-cause things) and 2. Once 1. is answered, re-write the program so it won't crash. Now I am working on making it more efficient with memory. I am currently not having any luck with "undef <variable>", sigh... If you mean at a higher level "What am I trying to accomplish": without going into a lot of detail, I am sorting two structures into alphanumeric sequence, and I need to keep all the file pointers in the directory structure "not broken". I use 4 sets of variables to do this (i.e. causing me to use 4X the database size). Now I am trying to reduce the number of variables I need. Will "undef <variable>" re-claim un-used memory? It seems that it does not.
Re: I can crash perl
by QM (Parson) on Mar 30, 2015 at 08:06 UTC
Have you checked memory usage near the time of the crash? Does it start disk swapping sometime before the crash?
If it's a memory issue, you'll have to decide if you need everything "available" in memory at the same time. There are a number of ways to use disk as an extension, all of them slower. For instance, use files instead of the large arrays, and read/write them as needed, with some care. Or tie them. Or something like DBM::Deep.
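As a minimal sketch of the tie route: the core module Tie::File maps an array's elements to the lines of a file, so the bulk of the data lives on disk rather than in RAM (the temp file here is purely for illustration):

```perl
use strict;
use warnings;
use Tie::File;
use File::Temp qw(tempfile);

# Create a scratch file to back the array (illustrative only).
my ($fh, $tmpfile) = tempfile(UNLINK => 1);
close $fh;

tie my @lines, 'Tie::File', $tmpfile
    or die "Cannot tie $tmpfile: $!";

# Elements written to @lines are stored as lines in the file, not in memory.
push @lines, "c:/users/mark/adir/file$_.ext" for 1 .. 5;

print scalar(@lines), " records on disk\n";   # prints: 5 records on disk

untie @lines;
```

The trade-off is exactly what QM says: every element access becomes file I/O, so it is much slower than an in-memory array.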
-QM
--
Quantum Mechanics: The dreams stuff is made of
Re: I can crash perl
by Jenda (Abbot) on Mar 30, 2015 at 13:02 UTC
my @tmp_1;
my @tmp_2;
my @tmp_3;
my @tmp_4;
my @tmp_5;
my @tmp_6;
Are you kidding me?!? If someone under my command turned in this kind of code, he'd get it thrown on his head and would not leave the office until he gets it rewritten.
Jenda
Enoch was right!
Enjoy the last years of Rome.
Furthermore, I just noticed that @tmp_7 is used in the code further down, but not declared with the others. I think I speak for everyone when I say I want to know why @tmp_7 is so special. Enquiring minds want to know. I want to know!
I'm a hardware engineer. I never learneded about programing. :) so be nice. :)
And yes, I know learnded is not a word
for ($i = 0; $i <= $last_dir; $i++) {
    (
        $deleted_file[$i]
        , $void
        , $the_dir[$i]
        , $first_pointer[$i]
        , $last_pointer[$i]
        , $has_subdir[$i]
        , $has_iden[$i]
    ) = split (/\"\|/, $tmp_2[$i], 7);
    #printf ("sort_dirs_debug2: %5d fp: %5d, lp: %5d %s\n", $i, $first_pointer[$i], $last_pointer[$i], $the_dir[$i]);
}
something like
for ($i = 0; $i <= $last_dir; $i++) {
    @{$data[$i]}{qw(deleted_file void the_dir first_pointer last_pointer has_subdir has_iden)}
        = split (/\"\|/, $tmp_2[$i], 7);
    #printf ("sort_dirs_debug2: %5d fp: %5d, lp: %5d %s\n", $i, $data[$i]{first_pointer}, $data[$i]{last_pointer}, $data[$i]{the_dir});
}
or
for ($i = 0; $i <= $last_dir; $i++) {
    ($data[$i]{deleted_file}, $void, $data[$i]{the_dir}, $data[$i]{first_pointer},
     $data[$i]{last_pointer}, $data[$i]{has_subdir}, $data[$i]{has_iden})
        = split (/\"\|/, $tmp_2[$i], 7);
    #printf ("sort_dirs_debug2: %5d fp: %5d, lp: %5d %s\n", $i, $data[$i]{first_pointer}, $data[$i]{last_pointer}, $data[$i]{the_dir});
}
Jenda
Enoch was right!
Enjoy the last years of Rome.
tmp_7 seems to be a typo. It should have been declared with the others.
I am putting on my thinking cap. I am going to re-write this to be more efficient with memory. I have already made it more memory efficient (a little more) and now it doesn't crash (i.e. run out of memory, if that is in fact what's happening). I need to learn how to use hash tables (if that's even the correct term). I think it's time.
Just so ya-all know where I was coming from, I thought using temp variables would allow me to get memory back after I was done with them (i.e. undef @tmp_1). Yes I know, peak memory would still be an issue.
If anyone wants to give me some guidance on how to go about re-writing this I would accept it.
Re: I can crash perl
by RichardK (Parson) on Mar 30, 2015 at 12:38 UTC
Maybe you should stop copying so much data around, you're storing multiple copies and doing far too much work -- IMHO :) If you stored your info in a hash then you could just sort a list of the keys into the order you require. see perldata and the data structures cookbook perldsc for some guidance.
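For instance, a sketch of that layout (names invented for illustration): one hash keyed by directory path, where each value carries that directory's fields plus its own list of files, so there are no first/last pointer arrays to patch up after a sort:

```perl
use strict;
use warnings;

# One hash keyed by directory path; each directory owns its file list,
# so sorting never invalidates any pointers.
my %dir = (
    'c:/users/mark/bdir/' => { has_subdir => 0, files => [ 'b1.ext', 'b2.ext' ] },
    'c:/users/mark/Adir/' => { has_subdir => 1, files => [ 'a1.ext' ] },
);

# Sorting is just sorting the keys into the order you require.
for my $path (sort { lc($a) cmp lc($b) } keys %dir) {
    print "$path: @{ $dir{$path}{files} }\n";
}
# prints:
# c:/users/mark/Adir/: a1.ext
# c:/users/mark/bdir/: b1.ext b2.ext
```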