Re: Graph your Perl inheritance structure

by ZZamboni (Curate)
on Jun 11, 2001 at 02:02 UTC


in reply to Graph your Perl inheritance structure

This is very neat. I have made a patch (see below) that:
  • Makes it ignore POD lines.
  • Makes it deal with multi-line @ISA declarations such as:
    @ISA=qw(class1
            class2);
    
    Of course, the only foolproof way to get @ISA would be to load the package and examine its value, but this seems to work OK.
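    For comparison, here is a minimal sketch of that "load the package" approach: compile the module and read its @ISA at run time instead of scanning the source text. My::Class is only a placeholder module name, and the helper isa_of is hypothetical, not part of ingraph.pl.

        #!/usr/bin/perl
        # Sketch of the run-time approach: require the module, then read
        # the package's @ISA directly via a symbolic reference.
        use strict;
        use warnings;

        sub isa_of {
            my ($class) = @_;
            (my $file = $class) =~ s{::}{/}g;   # Foo::Bar -> Foo/Bar.pm
            require "$file.pm";                 # actually compiles the package
            no strict 'refs';
            return @{"${class}::ISA"};          # the package's real @ISA
        }

        # My::Class is a placeholder; use any module findable via @INC.
        print join(', ', isa_of('My::Class')), "\n";

    The trade-off is that this runs the module's top-level code, which is why scanning the source, as the patch does, is usually good enough for drawing a quick graph.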
For an example of a graph generated by this program, see this graph, which corresponds to the implementation of this project. :-)

--ZZamboni

Here's the patch:

--- ingraph.pl.orig     Sun Jun 10 16:52:08 2001
+++ ingraph.pl  Sun Jun 10 17:00:00 2001
@@ -86,16 +86,31 @@
     STDERR->print("processing $file\n") if $opts{v};
     my $f = IO::File->new($file)
       or warn "can't open $file: $!\n", next;
     my ($package, @isa);
+    my $pod=0;
     while (<$f>) {
+        if (/^=cut/) {
+            $pod=0;
+            next;
+        }
+        if (/^=[a-zA-Z]+/) {
+            $pod=1;
+            next;
+        }
+        next if $pod;
         if (/^\s*package\s+([[:word:]:]+)\s*;/) {
             $package = $1;
             next;
         }
-        if (/@(?:([[:word:]:]+)::)?ISA\s*=\s*(.*)\s*;/)
+        if (/@(?:([[:word:]:]+)::)?ISA\s*=\s*(.*)\s*/)
         {
-            @isa = eval $2;
+            my $tmp=$2;
+            while (!/;/) {
+                $_=<$f>;
+                $tmp.=$_;
+            }
+            @isa = eval $tmp;
             if ($@) { warn "Unparseable \@ISA line: $_"; next }
             $package = $1 if defined($1);
             STDERR->print("package=$package, \@ISA=", join(',', @isa), "\n") if $opts{v};
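If you want to try the new parsing logic on its own before patching, here is a self-contained sketch of what the patched loop does; it is not part of ingraph.pl, and the extra defined() check is only there to stop the multi-line scan at end of file.

    #!/usr/bin/perl
    # Standalone sketch of the patched loop: skip POD blocks, and keep
    # accumulating a multi-line @ISA assignment until the ';' is seen.
    use strict;
    use warnings;
    use IO::File;

    my $file = shift or die "usage: $0 Module.pm\n";
    my $f = IO::File->new($file) or die "can't open $file: $!\n";

    my ($package, @isa, $pod);
    while (<$f>) {
        if (/^=cut/)       { $pod = 0; next }    # leaving a POD block
        if (/^=[a-zA-Z]+/) { $pod = 1; next }    # entering a POD block
        next if $pod;                            # ignore everything inside POD
        if (/^\s*package\s+([[:word:]:]+)\s*;/) { $package = $1; next }
        if (/\@(?:([[:word:]:]+)::)?ISA\s*=\s*(.*)\s*/) {
            my $tmp = $2;
            while (!/;/) {                       # @ISA spans several lines
                last unless defined($_ = <$f>);  # guard against a missing ';'
                $tmp .= $_;
            }
            @isa = eval $tmp;
            warn "Unparseable \@ISA line: $_" if $@;
            $package = $1 if defined $1;
            print "package=$package, \@ISA=", join(',', @isa), "\n";
        }
    }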
