Re^5: WWW::Mechanize - error on geting non-existing page

by GrandFather (Saint)
on Nov 06, 2011 at 20:14 UTC


in reply to Re^4: WWW::Mechanize - error on geting non-existing page
in thread WWW::Mechanize - error on geting non-existing page

I hear where you are coming from, but it sounds a little like "We didn't have warnings in the past, we always checked using defined. Why should we switch to using them now?". To me the difference is between having to check everywhere and maybe getting bitten badly if you missed something, versus getting a noisy failure if something unanticipated happens.

In most cases I'd rather have a noisy failure and deal with it than a quiet failure that may just plough on and destroy the world.
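To make the contrast concrete, here is a minimal sketch of the two failure modes (the URL is just a placeholder, and it assumes a WWW::Mechanize recent enough to have the autocheck constructor option):

    use strict;
    use warnings;
    use WWW::Mechanize;

    # Noisy failure: with autocheck on, a failed get() dies on the spot,
    # so the problem cannot be silently ignored.
    my $noisy = WWW::Mechanize->new( autocheck => 1 );
    eval { $noisy->get('http://example.com/no-such-page') };
    warn "noisy get failed: $@" if $@;

    # Quiet failure: with autocheck off, get() just returns; if you forget
    # to check for success, the script ploughs on with whatever it got.
    my $quiet = WWW::Mechanize->new( autocheck => 0 );
    $quiet->get('http://example.com/no-such-page');
    print 'status was ', $quiet->status, "\n";    # the check you must not miss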

True laziness is hard work

Replies are listed 'Best First'.
Re^6: WWW::Mechanize - error on geting non-existing page
by Your Mother (Archbishop) on Nov 06, 2011 at 21:50 UTC

    Let me put it differently. Mech is a browser emulator. How sensible would it be for a browser to quit on 404s? Mech is a subclass of LWP::UserAgent. How sensible is it to change default behavior radically? Both are obvious mistakes.

    This particular change was supposed to help new users, one imagines, but I’ve seen this question come up constantly since it was made: What the heck is going on? My script just quits!? And for adept users it broke all existing scripts and forced a new line of code into everything. I love the functionality of being able to pass in your own code ref, but having it be the default is a mistake: conceptually, historically, and expectation-wise.
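    The code-ref hook mentioned above is the onerror constructor option that sits alongside autocheck in WWW::Mechanize's constructor. A minimal sketch of using it (the warn-instead-of-die handler is just my choice of example, not anything the module mandates):

        use strict;
        use warnings;
        use Carp ();
        use WWW::Mechanize;

        # Keep the checking, but decide for yourself what "failure" does:
        # here a warn instead of the default croak, so the script carries on.
        my $mech = WWW::Mechanize->new(
            autocheck => 1,
            onerror   => \&Carp::carp,   # any code ref called with the error message
        );

        # Or opt back into the old LWP::UserAgent-style behaviour entirely
        # and do your own checking after each request:
        my $lwp_like = WWW::Mechanize->new( autocheck => 0 );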

      Ah, yes I agree that changing default behaviour in that way is just a bad nasty horrible idea!

      True laziness is hard work

      "On moving forward and breaking compatibility" is a gentle rant about precisely this. We had about 2 years of "crap, my recovery script I run once every forever is dying and $APPLICATION is dead in the water! HALP!" emails at $PREVIOUS_WORKPLACE because of this change. The old behavior allowed these scripts to work because autocheck was off and the scripts were checking the errors themselves; once it was on automatically, the scripts would crap out despite the "proper" error handling being in place.
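      For what it's worth, a sketch of the pattern those recovery scripts used, and the one-line patch they suddenly needed (the URL and recovery step are placeholders, not the actual scripts):

          use strict;
          use warnings;
          use WWW::Mechanize;

          # The script already did its own "proper" error handling -- but once
          # autocheck became the default, get() croaks before these checks ever
          # run, so the constructor now needs an explicit autocheck => 0.
          my $mech = WWW::Mechanize->new( autocheck => 0 );

          $mech->get('http://example.com/recovery-endpoint');
          unless ( $mech->success ) {
              warn 'fetch failed with status ', $mech->status, ", will retry\n";
              # ... application-specific recovery ...
          }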

      The middle of an emergency is not when you want to try to explain how this change that has broken things and is costing money was, really, a good idea, and how, yes, they will have to patch the script if they want to move forward, and yes, they will have to do this on the production machine, and yes, they will have to get all the relevant people out of bed, and no, I don't think that's where the author's head was.

      As I mention in the above node, this could have been done better, with examples of how so.

      1.49_01     Sat Sep 27 23:50:04 CDT 2008
      ========================================
      [THINGS THAT MAY BREAK YOUR CODE]
      The autocheck argument to the constructor is now ON by default,
      unless WWW::Mechanize is being subclassed.  There are so many new
      programmers whose ->get() calls fail unchecked that I'm now putting
      on the seat belts for them.
      
