Re: Reinventing wheels based on bad benchmarks
by t0mas (Priest) on Jan 10, 2003 at 08:23 UTC
I'm not surprised.
If you read my post carefully you'll note that I wrote: "On Linux it depended on when the regexp was evaluated. If I put it before -f, it performed better than if I put it after."
I experimented a lot with this issue before making this post, and I had experimented a lot before making my original post, which caused so much debate.
The file stat has no _significant_ effect on the benchmark; no "major impact", as you put it.
You can try it yourself!
The impact it has is on the first run only, since subsequent runs read from the OS file cache and become very inexpensive.
I re-ran the benchmark today (my current box is a Pentium 1000, Windows 2000, with perl v5.6.1 built for MSWin32-x86-multi-thread) and included a find sub with no -f at all (test3), hitting 1250 files:
test1: 42 wallclock secs ( 9.21 usr + 31.04 sys = 40.26 CPU)
test2: 52 wallclock secs (13.48 usr + 34.29 sys = 47.77 CPU)
test3: 51 wallclock secs (13.14 usr + 34.33 sys = 47.47 CPU)
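For anyone who wants to try this themselves, here is a minimal sketch of the kind of comparison above, using Benchmark and File::Find. The original node's exact wanted subs aren't shown here, so the three callbacks (regexp before -f, -f before regexp, no -f) are my reconstruction, and it builds its own scratch directory, so the numbers won't match the 1250-file run:

```perl
use strict;
use warnings;
use File::Find;
use File::Temp qw(tempdir);
use Benchmark qw(timethese);

# Scratch directory with a few hundred files so find() has real work to do.
my $dir = tempdir(CLEANUP => 1);
for my $i (1 .. 500) {
    open my $fh, '>', "$dir/file$i.txt" or die "open: $!";
    close $fh;
}

my $count;
timethese(50, {
    # test1: regexp first, so -f (a stat) runs only on matching names
    regexp_first => sub {
        $count = 0;
        find(sub { $count++ if /\.txt$/ && -f }, $dir);
    },
    # test2: -f first, so every directory entry gets stat'ed
    stat_first => sub {
        $count = 0;
        find(sub { $count++ if -f && /\.txt$/ }, $dir);
    },
    # test3: no -f at all, name match only
    no_stat => sub {
        $count = 0;
        find(sub { $count++ if /\.txt$/ }, $dir);
    },
});
```

As in the post above, run it more than once: the first pass pays for cold stat data, while later passes hit the file cache.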
I think the decision to reinvent or not (in this case) depends on whether 4-6% is important or not. If a 4-6% speed gain makes your program meet the specifications, and it fails otherwise, what then?