Re^2: Testing IS Development

by JavaFan (Canon)
on Mar 10, 2009 at 22:14 UTC [id://749747]


in reply to Re: Testing IS Development
in thread Testing IS Development

The benefit of test driven development is, you *know* your code will work before you ship it. Although, that is contingent on whether you've written sufficiently thorough and proper tests.
I find that a significant portion of the "bugs" I make comes from making the wrong assumptions (which can be caused by many things: wrong or unclear documentation, bad communication between the product owner and the dev team, me not checking facts with the right people, etc.). Making the wrong assumptions usually means the tests are wrong. So while I write code that passes all tests, I still don't have code that "works".

Don't get me wrong. I do see value in tests. But I've been programming for far too long to see a silver bullet in anything. Only for trivial code does "passes all tests" mean "the code is bug-free". A test suite is just a tool, just like use strict and use warnings. They're all just modest tools for writing code.
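A minimal sketch of that point with Test::More (the sub and the numbers are invented purely for illustration): when the same wrong assumption is baked into both the code and the expected value, the test passes and proves nothing about the real world.

    use strict;
    use warnings;
    use Test::More tests => 1;

    # Hypothetical example: we assume tax is always 10%, and we bake that
    # same assumption into both the code and the test. The test passes,
    # but if the real rate differs per region, the code is still wrong.
    sub price_with_tax { my ($net) = @_; return $net * 1.10 }

    is( price_with_tax(100), 110, 'net 100 becomes 110 with tax' );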

Replies are listed 'Best First'.
Re^3: Testing IS Development
by sundialsvc4 (Abbot) on Mar 10, 2009 at 22:24 UTC

    Granted. Now, having said that, here is an assumption that we all tend to make, albeit without firm basis: the assumption that the code we've just written is “right.” We trust our own experience, and let's face it, our own gut instinct. Granted, that experience/instinct is by now very well-honed, and therefore trustworthy ... but there is still plenty of room for “a digital computer to do for us what a digital computer does best,” namely, to grind through an onerous procedure in just a few seconds.

    So, yes. It is “just a tool.” It is also, “a darn good one.” We have no dispute there (nor anywhere). It is a tool that, I now realize, I have not yet availed myself of to a sufficient degree. I guess that the cobbler's children tend to have no shoes.

Re^3: Testing IS Development
by dsheroh (Monsignor) on Mar 11, 2009 at 12:14 UTC
    I find that a significant portion of the "bugs" I make comes from making the wrong assumptions

    True. OTOH, I find that writing tests forces me to codify and explicitly state my assumptions (even if not in a form the typical end-user would understand), which, in turn, forces me to think about and identify those assumptions.

    By making me consciously aware of my assumptions, it serves as a first step towards finding and correcting those which are incorrect. It can also prove useful in implementing the correct behaviours once incorrect assumptions are identified - fixing the tests, if they are well-written, often clarifies what the code needs to do differently.
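    A rough illustration of what "codifying an assumption" can look like in a test (the names here are made up): once it is written down, the assumption that order IDs are purely numeric becomes something a reviewer can spot and challenge, instead of something I vaguely believe.

        use strict;
        use warnings;
        use Test::More tests => 2;

        # The assumption "order IDs are purely numeric" now lives where a
        # reviewer (or future me) can see it and question it.
        like( '20090311', qr/^\d+$/, 'order IDs are assumed to be numeric' );

        # Hypothetical parser that relies on that assumption.
        sub order_year { my ($id) = @_; return substr $id, 0, 4 }
        is( order_year('20090311'), '2009', 'year is the first four digits' );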

      I had a co-worker many years ago who gave me some really frustrating (and insightful) advice on troubleshooting:

      First, check all your explicit assumptions. Then, check all of your implicit assumptions.

      Obviously, the second one is almost impossible. I find that writing unit tests (and TDD) helps to change some of the implicit assumptions into explicit assumptions. To me, this seems to be a benefit. (Of course, that might just be another implicit assumption. <grin/>)

      G. Wade
      True. OTOH, I find that writing tests forces me to codify and explicitly state my assumptions (even if not in a form the typical end-user would understand), which, in turn, forces me to think about and identify those assumptions.
      That is only added value if you don't think about assumptions when you are coding.

      I generally do. I don't suddenly consider assumptions more when I code tests than when I write code. And I'm not talking about assumptions like "snow is always white". I'm talking "assume the data we're interested in is in table X in database Y on server Z", and I assume that because the company wiki says so. But then it turns out that table Z.Y.X is obsolete, and currently the data lives in tables A, B, C in database D on server E. Testing is not going to find that, because when you make your tests, you make mock data from table Z.Y.X. The tests succeed. The code would have worked fine if the report had indeed used data from table Z.Y.X. But since the assumption is wrong, the entire chain falls.
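      To make that concrete, here is a hedged sketch (table and sub names invented for the example): because the mock fakes rows "from" Z.Y.X exactly as the wiki describes, the test can only confirm that the code agrees with the wiki, never that the wiki agrees with the real database.

          use strict;
          use warnings;
          use Test::More tests => 1;

          # Hypothetical report code: it sums whatever rows the data source
          # hands it. In production that source is a query against Z.Y.X.
          sub report_total {
              my ($rows) = @_;
              my $total = 0;
              $total += $_->{amount} for @$rows;
              return $total;
          }

          # The mock reproduces the assumption: these rows pretend to come
          # from table X in database Y on server Z, as the wiki claims.
          my $mock_rows_from_ZYX = [ { amount => 10 }, { amount => 32 } ];

          is( report_total($mock_rows_from_ZYX), 42, 'report sums the mocked rows' );

          # The test passes, yet table Z.Y.X is obsolete and the real data
          # lives elsewhere. Nothing here can detect that, because the wrong
          # assumption is baked into the mock itself.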

        The tests of which you speak here really move into the realm of process data integrity, not the specific testing of any particular application. Like any “manufacturing production-line,” the shop must have the means to validate where the data is actually coming from, and that the correct parameters were specified to the applications that were run. This is an ongoing part of the daily production process.

        This presupposes, of course, that the applications themselves are “known good” (it's all essentially worthless if they're not). In other words, they do have a test-suite, it does validate the handling of the data that is flowing through each application, and it does also check that not-valid data will be detected and rejected. Each time an application is deployed to the production environment (by the personnel responsible for that ... not the developers themselves), it must clear all tests.
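        A small sketch of that last check, with invented names and assuming the application exposes some validation routine: the suite asserts both that well-formed records pass and that malformed ones are detected and rejected before they flow downstream.

            use strict;
            use warnings;
            use Test::More tests => 2;

            # Hypothetical validator for incoming records: requires a non-empty
            # account field and a numeric amount.
            sub record_is_valid {
                my ($rec) = @_;
                return 0 unless defined $rec->{account} && length $rec->{account};
                return 0 unless defined $rec->{amount}  && $rec->{amount} =~ /^\d+(\.\d+)?$/;
                return 1;
            }

            ok(  record_is_valid({ account => 'ACME', amount => '19.95' }),
                 'well-formed record is accepted' );
            ok( !record_is_valid({ account => '',     amount => 'N/A'   }),
                 'malformed record is detected and rejected' );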

        So, the two concerns are complementary to each other, not exclusive.

Re^3: Testing IS Development
by Anonymous Monk on Mar 11, 2009 at 12:26 UTC
    product owner and dev team should write some tests
