PerlMonks  

Re^2: A better, more powerful fileglob?

by CountZero (Bishop)
on Feb 15, 2007 at 20:29 UTC


in reply to Re: A better, more powerful fileglob?
in thread A better, more powerful fileglob?

One could of course read in the whole filesystem below the top-level directory you intend to use, turn it into an XML structure, and then apply XPath rules to it.

Something tells me it will not be very efficient ...
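For concreteness, the approach described above can be sketched as follows (in Python here, purely as a self-contained illustration; the `<dir>`/`<file>` element names and the directory layout are invented for the example). It walks a directory tree, serializes it into an in-memory XML document, and then queries it with the limited XPath subset that the standard library's ElementTree supports:

```python
import os
import tempfile
import xml.etree.ElementTree as ET

def tree_to_xml(root_dir):
    """Serialize a directory tree into <dir>/<file> elements."""
    def build(path):
        elem = ET.Element("dir", name=os.path.basename(path) or path)
        for entry in sorted(os.listdir(path)):
            full = os.path.join(path, entry)
            if os.path.isdir(full):
                elem.append(build(full))
            else:
                elem.append(ET.Element("file", name=entry))
        return elem
    return ET.ElementTree(build(root_dir))

# Build a small throwaway tree to demonstrate.
top = tempfile.mkdtemp()
os.makedirs(os.path.join(top, "src", "lib"))
open(os.path.join(top, "src", "main.pl"), "w").close()
open(os.path.join(top, "src", "lib", "Util.pm"), "w").close()

doc = tree_to_xml(top)
# "Find every file named Util.pm anywhere below the top level."
matches = doc.findall(".//file[@name='Util.pm']")
print([m.get("name") for m in matches])
```

Note that the entire tree must be materialized in memory before the first query runs, which is exactly the inefficiency being anticipated here.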

CountZero

"If you have four groups working on a compiler, you'll get a 4-pass compiler." - Conway's Law


Replies are listed 'Best First'.
Re^3: A better, more powerful fileglob?
by ikegami (Patriarch) on Feb 15, 2007 at 20:40 UTC

    Something tells me it will not be very efficient ...

    The particular implementation you mentioned, serializing the file system tree into an XML document and then deserializing it into an XML tree, would indeed be inefficient.

    However, XPath can be applied to tree structures (such as a file system) in an efficient manner. It was designed to do just that.
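    ikegami's point can be sketched with a toy example (again in Python for illustration; this is not a real XPath engine, just the moral equivalent of a single descendant step such as `//file[@name='...']`). A query can be evaluated lazily against the file system itself, yielding matches as they are found, without ever serializing the tree into a document:

```python
import fnmatch
import os
import tempfile

def descendant_files(root, name_glob):
    # Lazily yield files below root whose names match name_glob --
    # evaluated directly against the file system, not a document.
    for dirpath, _dirnames, filenames in os.walk(root):
        for fname in filenames:
            if fnmatch.fnmatch(fname, name_glob):
                yield os.path.join(dirpath, fname)

# Build a small throwaway tree to demonstrate (names are invented).
top = tempfile.mkdtemp()
os.makedirs(os.path.join(top, "src", "lib"))
for rel in ("src/main.pl", "src/lib/Util.pm", "src/lib/More.pm"):
    open(os.path.join(top, *rel.split("/")), "w").close()

found = sorted(descendant_files(top, "*.pm"))
print(found)
```

    Because the generator walks the tree on demand, it visits only what the query needs and never holds more than one directory listing at a time, which is why an XPath-style engine over a file system can be efficient.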

      Thanks. I'll do some reading up on XPath then (it has been several years since I last used it). I seem to have been under the wrong impression that it needed XML to work on.

      CountZero

