PerlMonks
Hi folks.

Performance reasons mandated a change to the user nodes XML ticker: users with many posts were causing undue load on the DB server, so we had to change it. The changes are as follows:

  • The ticker returns no more than 100 records at a time. You can use the limit parameter to request fewer than this number per fetch if you like.
  • New arguments for paging through the results are available. You may use the offset parameter to control how far into the data set you want to start.
  • Additionally, you can control the result set ordering by using the order parameter, which accepts the arguments 'asc' or 'desc'.
  • Lastly, you can request all new nodes since a particular node id by using the fromid parameter. This is the recommended mode of use, as it enables you to write a client that queries in an incremental fashion, without refetching everything it has already seen.
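The fromid mode described above can be sketched roughly as follows. This is a minimal sketch, not a definitive client: the base URL shown is a placeholder assumption, and you would substitute the real generator node for your account.

```perl
use strict;
use warnings;

# Hypothetical base URL for the user nodes XML ticker; adjust to the
# real ticker node for your account.
my $base = 'https://www.perlmonks.org/?node=user+nodes+info+xml+generator';

# Build a ticker URL from the parameters described above
# (limit, offset, order, fromid). Unset parameters are omitted.
sub ticker_url {
    my (%opt) = @_;
    my $url = $base;
    for my $p (qw(limit offset order fromid)) {
        $url .= ";$p=$opt{$p}" if defined $opt{$p};
    }
    return $url;
}

# Given the node ids seen in one fetch, compute the fromid to use on
# the next fetch, so already-seen nodes are not requeried.
sub next_fromid {
    my (@ids) = @_;
    my $max = 0;
    $max = $_ > $max ? $_ : $max for @ids;
    return $max;
}
```

An incremental client would fetch ticker_url(fromid => $last, limit => 100), extract the node ids from the returned XML, and feed the largest one back in as the next fromid.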

It's well known to the gods that the primary existing use of the user nodes ticker is to monitor reputation changes on a user's posted nodes. We would prefer that people stop doing this, as it requires fetching an awful lot of nodes. To that end we have created the noderep XML ticker, which is designed to provide data on only those nodes whose reputation has changed since the last fetch. Please read the following list to see how it works:

  • On the first fetch the ticker will return the results of the last 24 hours' voting.
  • On subsequent fetches the ticker returns any changes since the last fetch or over the last 24 hours, whichever is more recent. The ticker stores the time of the last fetch by a given user internally. This means you cannot have two clients consuming data from it simultaneously: they won't get the same results. This is a deliberate design decision, so don't expect it to change.
  • You may not fetch from the ticker more frequently than every 600 seconds. Attempts to fetch more frequently will result in an XML error message being returned. This restriction is partly to keep the DB load down, but mostly to discourage people from writing "who downvoted me" tools. If we feel such tools are being produced, the fetch period may be extended. Client writers should use the data in the 'info' tag of the result set to automatically control the interval between fetches.
  • In order to make designing new clients for this ticker easier, the two restrictions above are relaxed somewhat for the first 50 fetches. During this period you may request the data set whenever you like, and use the clear option to reset the internal state of the ticker to "never fetched before". This should allow client writers sufficient opportunity to get their clients right before the polling frequency restriction kicks in. (On request the gods may reset the counter if you can justify why.)
  • Important: as this is a new release, we reserve the right to make any changes necessary to the feed as circumstances arise. IOW, if it turns out that this ticker is as much of a DB pig as the user nodes ticker used to be, then the ticker will be modified or even disabled. Hopefully nothing like this will be necessary.
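A client can honor the fetch interval by reading it out of the 'info' tag, as recommended above. The sketch below is an assumption-laden illustration: the attribute name min_poll_interval and the sample XML shape are hypothetical, so check the actual ticker output for the real names.

```perl
use strict;
use warnings;

# Extract the polling interval advertised in the ticker's 'info' tag.
# The attribute name 'min_poll_interval' is hypothetical; fall back to
# the documented 600-second floor if it is absent.
sub poll_interval {
    my ($xml) = @_;
    if ($xml =~ /<info\b[^>]*\bmin_poll_interval="(\d+)"/) {
        return $1;
    }
    return 600;    # documented minimum between fetches
}

# A well-behaved polling loop would then look roughly like:
#   while (1) {
#       my $xml = fetch_noderep_ticker();   # hypothetical HTTP fetch
#       handle_changes($xml);
#       sleep poll_interval($xml);
#   }
```

Deriving the sleep from the feed itself means your client keeps working if the gods extend the fetch period later.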

So now all you XP whores can use the ticker to watch your nodereps in something close to real time. Enjoy. ;-)

---
$world=~s/war/peace/g


In reply to Changes to the User Nodes ticker and introducing the NodeRep XML ticker by demerphq
