PerlMonks  

RE: (Guildenstern) Personal Monk Stats plot creator

by Guildenstern (Deacon)
on Aug 29, 2000 at 19:12 UTC ( [id://30192]=note: print w/replies, xml ) Need Help??


in reply to Personal PerlMonks Stats plot creator

Nice work! I like being able to see this info in a graphical context. I only have a couple of suggestions for improving it, which came up through personal trial and error while trying to get this to run for myself.
  1. Suggest using something other than PerlMonksChat for getting a user's page. It gave me a lot of problems, and I ended up throwing it away in favor of a request that looks like this:
    my $url = "$pmsite?user=$username&"
            . "passwd=$password&op=login&node=perl+monks+user+search&"
            . "usersearch=$username&orderby=createtime%20desc&count=$i";

    (Code shamelessly stolen from jcwren's luke_repwalker.pl.) This brings up point 2:
  2. Some people (not myself..yet) have more than 50 writeups, so the URL query will have to be repeated multiple times in order to get all writeups. Granted, your code will give a good synopsis, but I imagine there are people out there who want all the info at once. For an example of how to do multiple fetches to get all of the writeups, look again at luke_repwalker.pl.
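To sketch what point 2 might look like, here's a minimal loop that reuses the URL from point 1 and keeps fetching in chunks of 50 until a page comes back empty. This is an illustration only: the helper names, the page size of 50, and the assumption that `count` acts as a paging offset are mine, not verified against the live site (luke_repwalker.pl is the authoritative example).

```perl
use strict;
use warnings;

# Build the user-search URL for one page, starting at writeup $offset.
# (Same query string as the snippet above; $offset stands in for $i.)
sub writeup_url {
    my ($pmsite, $username, $password, $offset) = @_;
    return "$pmsite?user=$username&"
         . "passwd=$password&op=login&node=perl+monks+user+search&"
         . "usersearch=$username&orderby=createtime%20desc&count=$offset";
}

# Fetch pages 50 writeups at a time until one comes back with nothing new.
sub fetch_all_writeups {
    my ($pmsite, $username, $password) = @_;
    require LWP::UserAgent;    # loaded lazily; the URL helper works without it
    my $ua = LWP::UserAgent->new;
    my @pages;
    for (my $offset = 0; ; $offset += 50) {
        my $resp = $ua->get(writeup_url($pmsite, $username, $password, $offset));
        last unless $resp->is_success;
        # A real version would parse out the writeup rows here and
        # stop once a page contains none; this is a crude stand-in.
        last if $resp->decoded_content !~ /writeup/i;
        push @pages, $resp->decoded_content;
    }
    return @pages;
}
```

The accumulated pages can then be parsed the same way the single-page version does, just in a loop.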
Other than those suggestions, I would say your code works great for me.
Update (2): Fixed the code wrap. Thnx nuance.
Guildenstern
Negaterd character class uber alles!
