Well... the easy way would be to use something like wget, which
supports recursive downloads, then search the mirrored copy locally...
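For instance, a sketch of that workflow (the URL, directory names, and file contents below are placeholders I made up; the wget flags are the standard recursive-mirror ones). The download step needs network access, so it is shown commented out, and a stand-in mirror is created so the search step actually runs:

```shell
# Mirror the target site (needs network, so commented out here;
# http://example.com/docs/ is a placeholder URL).
# wget --recursive --level=2 --no-parent --convert-links \
#      --directory-prefix=mirror http://example.com/docs/

# Stand-in mirror so the search step below is runnable offline.
mkdir -p mirror/docs
printf '<html><body>perl spider tutorial</body></html>\n' > mirror/docs/a.html
printf '<html><body>nothing relevant here</body></html>\n' > mirror/docs/b.html

# Search the local copy: -r recurse, -i case-insensitive, -l list matches.
grep -ril 'spider' mirror
# → mirror/docs/a.html
```

Once the mirror exists, every search is just a local grep; you only pay the network cost when you re-mirror.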
in reply to search a foreign directory
Or you could write a web spider of your own in Perl using LWP
and search the pages on each fetch (or make a local copy, as with wget).
It would probably be a good idea to cache the pages locally for a while and search
them there, then rebuild the link and send the user to the actual site.
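A minimal sketch of the cache-for-a-while idea, assuming a hypothetical `cache_demo` directory and a one-day freshness window (both invented for illustration): re-mirror only when the cached copy is missing or stale, otherwise search the copy you already have.

```shell
# Hypothetical cache dir; created fresh here so the check reports "fresh".
mkdir -p cache_demo
touch cache_demo

# Refresh only when the cache is missing or older than a day (1440 min);
# find -mmin -1440 prints the dir only if it was modified within that window.
if [ -z "$(find cache_demo -maxdepth 0 -mmin -1440 2>/dev/null)" ]; then
    echo "stale: re-run the wget mirror, then search"
else
    echo "fresh: grep the cached copy directly"
fi
# → fresh: grep the cached copy directly
```

The same check works for individual pages: key each cached file by URL and compare its mtime before deciding whether to hit the network again.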
Of course, Google and AltaVista have an option to search within
a specific domain, so if the pages are indexed there you could just use that :)
Update: BTW, to use domain searching in AltaVista and Google, go to their Advanced Search pages.