This is just a quick note to make you aware of a little project I have been working on. At the moment it lets you subscribe to a number of RSS feeds for various parts of the site, such as the latest questions asked or the latest posts by a certain user. You can also subscribe to a post and get a feed with all of its replies. Please see the project page for more information.

As a general rule, use a URL of that form, replacing 123456 with the id of the user or node you want to watch.

If you wanted to subscribe to this node, you would use:

I should stress that this is a work in progress, so I would be grateful for any feedback or requests, either here or by email.

Hope you find it useful.

--tidiness is the memory loss of environmental mnemonics

Re: RSS feeds to most of
by theorbtwo (Prior) on Feb 27, 2005 at 16:56 UTC

    If you want RSS, I strongly suggest you work with us, as a pmdevil, to get PM itself to produce it. That way, you don't need to hammer the server, and the data can always be up-to-date.

    Warning: Unless otherwise stated, code is untested. Do not use without understanding. Code is posted in the hopes it is useful, but without warranty. All copyrights are relinquished into the public domain unless otherwise stated. I am not an angel. I am capable of error, and err on a fairly regular basis. If I made a mistake, please let me know (such as by replying to this node).

      Where do I sign up? I'd gladly get this working as part of the site rather than the way it is done at the moment.

      As for hammering the server, I have taken steps to prevent this. My daemon does fetch every new node, but by storing them locally it does not need to refer to the site again to generate the RSS. This should reduce the load on perlmonks once the number of RSS feed fetches exceeds the daily number of new nodes. I think that level has already been reached.
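The fetch-once-then-serve-locally idea described above can be sketched as follows. This is an illustrative sketch in Python, not the author's Perl daemon; the table layout and node data are hypothetical.

```python
import sqlite3
from xml.sax.saxutils import escape

# Hypothetical local store the daemon fills: each node is fetched from the
# site exactly once and kept here.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE nodes (id INTEGER PRIMARY KEY, title TEXT, author TEXT, body TEXT)")
db.execute("INSERT INTO nodes VALUES (123456, 'Example node', 'EvdB', 'Node text')")

def rss_item_for_node(node_id):
    """Build an RSS <item> from the local copy -- no second request to the site."""
    title, author, body = db.execute(
        "SELECT title, author, body FROM nodes WHERE id = ?", (node_id,)
    ).fetchone()
    return (
        "<item>"
        f"<title>{escape(title)}</title>"
        f"<author>{escape(author)}</author>"
        f"<description>{escape(body)}</description>"
        "</item>"
    )

print(rss_item_for_node(123456))
```

Every feed request after the first is answered from the database, so the load on the site is bounded by the number of new nodes, not the number of feed readers.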

      What would be useful for keeping the copy current would be a 'recently updated' page similar to Newest Nodes.

      Still, best would be for this to happen on perlmonks, so who do I talk to to become a pmdev?


Re: RSS feeds to most of
by jdporter (Canon) on Feb 27, 2005 at 16:49 UTC
Re: RSS feeds to most of
by esskar (Deacon) on Feb 26, 2005 at 20:46 UTC
    Great job, I like it. Is it done on the fly?
    How about adding a code tag which includes the code of the item, or a raw tag which includes the item as raw CDATA?

      It is not done on the fly: a daemon checks for new nodes and adds them to a database. This is the only way to do it without hammering the site. It can mean that the nodes get out of date or end up in the wrong section; hopefully I'll find a solution to this soon.

      As for adding the code etc., I'm keen to keep it all quite simple and to send people back here for the actual nodes. I'm not aiming to replace perlmonks, just to add features that make it more useful.

      Glad you like it.


        In my ideal world, there would be such a link between the RSS feeds and PM that they'd always be within a few minutes of up-to-date, if not actually always up-to-date. And there'd be a link in the header of the html code which Firefox could use to find out about the RSS feed.

        In my slightly less ideal world, we'd just get the link in the HTML code to this RSS feed (the one for the current node only, whatever that current node is, although some supernodes, such as the comment-on node, may not need it). This would still require a bit of work from the PM developers, although I would hope not much. Something like:

        <link rel="alternate" title="PerlMonks RSS" href="/rss/$nodeid.xml" type="application/rss+xml" />
        would need to be added to the page header.

        In my real-as-in-now world, I'd like to express my appreciation of such a service! I just need to figure out how to get this somewhat automated :-)

        Good stuff - well done. I have added this as a live bookmark straight into Firefox.

        You mentioned that your process uses a daemon that checks for new nodes. This means that, firstly, you need to run a daemon and, secondly, you are polling PM on a regular basis.

        You could simplify the model by caching the RSS for a particular page and interrogating the cache each time you wanted to serve a page. A cached page could time out after a short period of time (e.g. 10 mins). A cache miss (or timed out page) would initiate a request to the monastery. The result would be cached for next time. This means that when nobody was using the feed, PM wouldn't be hit.

        Caching can be implemented using a simple file cache with timestamp checking, or something more involved using a database. Either way, you periodically need to clean the cache of expired documents. You would also want to guard against an attack where a malicious user tried to access every node as a feed and therefore used up lots of cache space.
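The file-cache-with-timestamp idea above can be sketched like this. It is a minimal Python illustration, not the actual service; `fetch_from_monastery` is a hypothetical stand-in for the real HTTP request, and the 10-minute TTL is the timeout suggested above.

```python
import os
import time
import tempfile

CACHE_DIR = tempfile.mkdtemp()   # stand-in for a real cache directory
TTL = 600                        # seconds; the 10-minute timeout suggested above

def fetch_from_monastery(node_id):
    # Placeholder for the real request to PM.
    return f"<rss>feed for node {node_id}</rss>"

def cached_feed(node_id):
    """Serve from the file cache; on a miss or an expired entry, re-fetch and re-cache."""
    path = os.path.join(CACHE_DIR, f"{node_id}.xml")
    # Cache hit: the file exists and is younger than the TTL, so PM is not touched.
    if os.path.exists(path) and time.time() - os.path.getmtime(path) < TTL:
        with open(path) as f:
            return f.read()
    # Miss or timed out: one request to the site, then cache the result.
    feed = fetch_from_monastery(node_id)
    with open(path, "w") as f:
        f.write(feed)
    return feed
```

With this model, a feed nobody reads generates no traffic to PM at all, and a popular feed generates at most one request per TTL window.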

        This article may be of interest with respect to the database solution.

Re: RSS feeds to most of
by jdporter (Canon) on Mar 05, 2005 at 15:21 UTC
    btw - on another note - perhaps you'd like to consider hosting your perlmonks-related stuff on, which is sort of the de facto place for monks to put their bits that can't go on perlmonks itself. Not that you have to, of course; but it is generally the first place people expect to find such things.
Re: RSS feeds to most of
by jdporter (Canon) on Feb 22, 2006 at 05:21 UTC

    In the "to do" section it says, "Provide the RSS feeds in HTML form". I hope you're still working on that, because it's really necessary.

    In the meantime, I've created a wrapper script which converts your RSS to HTML.
    I use it in this fun Free Nodelet Hack: 'Newest Nodes' Menubar.
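An RSS-to-HTML wrapper of the kind mentioned above could look something like this. This is a hedged sketch in Python, not jdporter's actual script, and the sample feed is invented for illustration.

```python
import xml.etree.ElementTree as ET
from html import escape

def rss_to_html(rss_text):
    """Turn the <item> titles and links of an RSS 2.0 feed into an HTML list."""
    root = ET.fromstring(rss_text)
    items = []
    for item in root.iter("item"):
        title = escape(item.findtext("title", ""))
        link = escape(item.findtext("link", ""), quote=True)
        items.append(f'<li><a href="{link}">{title}</a></li>')
    return "<ul>" + "".join(items) + "</ul>"

# Hypothetical sample feed for demonstration.
sample = """<rss version="2.0"><channel>
  <item><title>Newest Nodes</title><link>http://example.org/1</link></item>
</channel></rss>"""

print(rss_to_html(sample))
```

The resulting fragment can be dropped straight into a nodelet or any other HTML container.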

    We're building the house of the future together.
Re: RSS feeds to most of
by jdporter (Canon) on Jan 06, 2009 at 18:19 UTC

    I regret to inform you that this service has been down for over two years and EvdB has been unresponsive, so the project should be considered defunct.