Re^2: RSS feeds to most of perlmonks.org

by EvdB (Deacon)
on Feb 27, 2005 at 12:16 UTC [id://434896]



in reply to Re: RSS feeds to most of perlmonks.org
in thread RSS feeds to most of perlmonks.org

Where do I sign up? I'd gladly get this working as part of perlmonks.org rather than the way it is done at the moment.

As for hammering the server, though: I have taken steps to avoid doing this. My daemon does fetch every new node, but because it stores them locally it never needs to refer back to perlmonks.org to generate the RSS. This should reduce the load on perlmonks once the number of RSS feed fetches exceeds the daily number of new nodes, and I think that level has already been reached.
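In rough outline, the daemon side looks something like the sketch below. Every concrete name in it (the ticker URL, the cache file, the XML element layout) is an assumption made for illustration; the thread does not show the daemon's actual code.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use LWP::Simple qw(get);
    use Storable qw(store retrieve);

    # Placeholder endpoint and cache path -- not the daemon's real ones.
    my $ticker_url = 'http://perlmonks.org/EXAMPLE-newest-nodes.xml';
    my $cache_file = 'nodes.db';

    # Load the local node cache (keyed by node id), if one exists.
    my $nodes = -e $cache_file ? retrieve($cache_file) : {};

    # One fetch per polling interval pulls the list of new nodes; each
    # node is stored locally, so generating the RSS later never has to
    # touch perlmonks.org again.
    my $xml = get($ticker_url) or die "ticker fetch failed\n";

    # Assumed element layout for the sketch: <node id="...">title</node>
    while ($xml =~ m{<node\s+id="(\d+)">([^<]+)</node>}g) {
        my ($id, $title) = ($1, $2);
        next if exists $nodes->{$id};    # already mirrored
        $nodes->{$id} = { title => $title, fetched => time };
    }
    store($nodes, $cache_file);

With the fetcher on a cron schedule, the daily cost to perlmonks is bounded by the number of new nodes, no matter how many readers poll the feeds.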

What would also be useful for keeping the copy current is a 'recently updated' page, similar to Newest Nodes.

Still, the best thing would be for this to happen on perlmonks itself, so who do I talk to about becoming a pmdev?

--tidiness is the memory loss of environmental mnemonics

Re^3: RSS feeds to most of perlmonks.org
by demerphq (Chancellor) on Feb 28, 2005 at 07:55 UTC

      I am already using the XML pages, but I was unaware of the 'flat' option, so thank you for that.

      As mentioned, I generate all feeds from a local copy, so in effect I am mirroring the site and not adding any (considerable) load (see the serving-side sketch below).

      --tidiness is the memory loss of environmental mnemonics
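A matching sketch of the serving side described above, with the same assumed cache layout: the feed is built entirely from the local copy, so reader fetches never reach perlmonks.org.

    #!/usr/bin/perl
    use strict;
    use warnings;
    use XML::RSS;
    use Storable qw(retrieve);

    # Read the cache written by the fetcher; no network access here.
    my $nodes = retrieve('nodes.db');

    my $rss = XML::RSS->new(version => '1.0');
    $rss->channel(
        title       => 'PerlMonks newest nodes (local mirror)',
        link        => 'http://perlmonks.org/',
        description => 'Feed generated from a locally stored copy of the nodes',
    );

    # Newest 20 cached nodes, most recent first.
    my @ids = sort { $nodes->{$b}{fetched} <=> $nodes->{$a}{fetched} }
              keys %$nodes;
    splice @ids, 20 if @ids > 20;

    for my $id (@ids) {
        $rss->add_item(
            title => $nodes->{$id}{title},
            link  => "http://perlmonks.org/?node_id=$id",
        );
    }
    print $rss->as_string;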
