Bukowski has asked for the wisdom of the Perl Monks concerning the following question:

Hello brother Monks,

I have just been approached with a question - someone has asked me if it's possible to display the structure of a web site's links graphically, i.e. to produce a graphical representation of a web spider/crawler's output.

I have Googled extensively, but I cannot find an appropriate solution, and he is looking for a Perl solution to this problem. Is anyone aware of any modules that might be appropriate, or even modules that could help him build his own application?

I imagine such a thing must exist, but maybe success is eluding my search skills on a long, cold, wet Friday afternoon.

Thanks in anticipation of your wisdom.

Bukowski - aka Dan (dcs@black.hole-in-the.net)
"Coffee for the mind, Pizza for the body, Sushi for the soul" -Userfriendly

Replies are listed 'Best First'.
Re: Graphical overview of web site linking
by jkahn (Friar) on Nov 22, 2002 at 18:06 UTC
    This is a neat idea.

    I don't know of any pre-packaged solution, but I imagine that whipping up Graph objects from crawler output wouldn't be too hard.

    Then you can use Graph::ReadWrite (specifically, the Graph::Writer::Dot module) to dump them to a .dot file, for use with the dot tool from http://www.graphviz.org. This gives pretty graphical output, though the output can get big for large sites.

    It never ceases to amaze me how much more impressed the pointy-heads are when you have pretty pictures.
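    The crawl-then-dump approach above can be sketched roughly as follows. This is a minimal, untested sketch assuming LWP::UserAgent, HTML::LinkExtor, URI, Graph, and Graph::Writer::Dot are all installed from CPAN; the starting URL and the page limit are placeholders you would replace, and a real crawler would also want robots.txt handling, politeness delays, and better URL normalization.

    ```perl
    #!/usr/bin/perl
    use strict;
    use warnings;

    use LWP::UserAgent;
    use HTML::LinkExtor;
    use URI;
    use Graph;
    use Graph::Writer::Dot;

    my $start     = 'http://www.example.com/';  # placeholder start page
    my $max_pages = 20;                         # keep the crawl (and graph) small

    my $ua    = LWP::UserAgent->new;
    my $graph = Graph->new(directed => 1);

    my @queue = ($start);
    my %seen;

    while (@queue and keys %seen < $max_pages) {
        my $url = shift @queue;
        next if $seen{$url}++;

        my $resp = $ua->get($url);
        next unless $resp->is_success;

        # Collect <a href="..."> links; passing a base URL makes
        # HTML::LinkExtor return them as absolute URIs.
        my $extor = HTML::LinkExtor->new(undef, $resp->base);
        $extor->parse($resp->decoded_content);

        for my $link ($extor->links) {
            my ($tag, %attrs) = @$link;
            next unless $tag eq 'a' and $attrs{href};
            my $target = URI->new($attrs{href})->canonical->as_string;
            $graph->add_edge($url, $target);    # page -> link edge
            # Only follow links on the same host, so the crawl stays on-site
            push @queue, $target
                if URI->new($target)->host eq URI->new($start)->host;
        }
    }

    # Dump the link graph as a .dot file for Graphviz's dot tool,
    # e.g.: dot -Tpng site.dot -o site.png
    Graph::Writer::Dot->new->write_graph($graph, 'site.dot');
    ```

    The Graph object doesn't care that the vertices are URLs, so the same dump step works no matter how the edges were collected.
    
    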

Re: Graphical overview of web site linking
by traveler (Parson) on Nov 22, 2002 at 18:06 UTC
    There are lots of tools for this. I do not know which are Perl and which are not; some appear to be Java. Here are a couple of pointers (pun intended):

    One tool is webtracer; cybergeography has links to lots of other tools.

    HTH, --traveler