http://www.perlmonks.org?node_id=940279


in reply to Re^2: perlbug: seekdir/readdir broken on win32 on 5.008009, 5.012002, 5.014001
in thread perlbug: seekdir/readdir broken on win32 on 5.008009, 5.012002, 5.014001

I am curious about this function (seekdir).
It just sounds like a "solution in search of a problem" to me.

I do most of my programming on Windows, but some on *nix. Even so, how many directory entries would one need for this kind of function to be worthwhile in terms of performance or memory? 1K? 10K? 50K? My mind boggles at directories of such sizes.
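
For reference, a minimal sketch of what telldir/seekdir are supposed to do: remember a position in a directory stream and jump back to it later. The current directory here is just an arbitrary example, and this round trip is presumably the sort of thing the bug report says misbehaves on Win32:

    #!/usr/bin/perl
    use strict;
    use warnings;

    opendir(my $dh, '.') or die "opendir: $!";

    my $first = readdir($dh);      # read one entry
    my $pos   = telldir($dh);      # remember where the stream is now
    my @rest  = readdir($dh);      # consume everything after that point

    seekdir($dh, $pos);            # jump back to the remembered position
    my @again = readdir($dh);      # should return the same list as @rest

    print "resumed at the same point\n" if "@rest" eq "@again";

    closedir($dh);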


Replies are listed 'Best First'.
Re^4: perlbug: seekdir/readdir broken on win32 on 5.008009, 5.012002, 5.014001
by Anonymous Monk on Nov 27, 2011 at 20:59 UTC

    You can bet graphical file explorers make use of it.

    And you can bet they have been using it since 640K of memory was plenty (or however the saying goes).

      A graphical explorer wouldn't use this. Not now, and not in the days of text-based windowing systems. It needs to load the entire directory into memory in order to sort the file names.

        A graphical explorer wouldn't use this. Not now, and not in the days of text-based windowing systems. It needs to load the entire directory into memory in order to sort the file names.

        Sure, whatever you say; you know the implementation of every explorer by heart, and you can't sort data unless it's all in memory at the same time.
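
        For what it's worth, sorting a huge listing does not require holding every name in memory at once. Below is a rough sketch of the classic spill-and-merge approach; the chunk size and directory are arbitrary choices, and nothing here is claimed about how any real explorer is implemented:

        #!/usr/bin/perl
        # Rough sketch: sort a very large directory listing without keeping
        # every name in memory at once. Sort fixed-size chunks, spill each
        # chunk to a temp file, then merge the sorted spills.
        use strict;
        use warnings;
        use File::Temp qw(tempfile);

        my $dir        = '.';       # hypothetical: some very large directory
        my $chunk_size = 10_000;    # names held in memory at any one time

        opendir(my $dh, $dir) or die "opendir $dir: $!";

        # Pass 1: sort chunks and spill them to temp files.
        my (@chunk, @spills);
        my $spill = sub {
            return unless @chunk;
            my ($fh, $name) = tempfile(UNLINK => 1);
            print {$fh} "$_\n" for sort @chunk;
            close $fh or die "close: $!";
            push @spills, $name;
            @chunk = ();
        };
        while (defined(my $entry = readdir($dh))) {
            push @chunk, $entry;
            $spill->() if @chunk >= $chunk_size;
        }
        $spill->();
        closedir($dh);

        # Pass 2: k-way merge of the sorted spill files.
        my @fhs   = map { open my $fh, '<', $_ or die "open $_: $!"; $fh } @spills;
        my @heads = map { scalar readline($_) } @fhs;
        while (grep { defined } @heads) {
            my ($i) = sort { $heads[$a] cmp $heads[$b] }
                      grep { defined $heads[$_] } 0 .. $#heads;
            print $heads[$i];                    # emit next name in sorted order
            $heads[$i] = readline($fhs[$i]);     # advance that spill file
        }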

Re^4: perlbug: seekdir/readdir broken on win32 on 5.008009, 5.012002, 5.014001
by patcat88 (Deacon) on Nov 29, 2011 at 18:17 UTC