(OT) Black- vs. white-box testing

by dragonchild (Archbishop)
on Oct 25, 2005 at 14:29 UTC ( #502727=perlmeditation )

In a subthread of Neither system testing nor user acceptance testing is the repeat of unit testing (OT), a very lively discussion has been taking place about testing and when unit-tests can be black-box, what's white-box, and various other related topics. I have been getting the sense that my usages of certain terms ("white-box" and "interface", to name a couple) are different from the rest of the world's. While this isn't news to me, I was a little surprised at the ... vigor that was expressed in the conversation.

In the interests of harmony between myself and others, I'm going to elaborate on how I view certain fundamental terms. In all cases, these are my personal views and opinions. As I have done many times in the past, I am more than willing to change my views on any and all of the following items if presented with a compelling argument.

  • CodeSmell

    A situation in your project that, on its face, will cause issues at some point. CodeSmells are derived from the accumulated experience of the developer community's failed projects. Not all CodeSmells are necessarily bad, but any CodeSmell that is left in place should be documented with the reason it is acceptable in this instance. Otherwise, the maintainer (which will be you in 6 months) will wonder why this CodeSmell was not addressed.

  • Corner-case

    This is a situation where a generalized rule fails on certain specific inputs or input-ranges. For example, division is a generalized rule that has a corner-case when the divisor is 0.

    My opinion is that only reality contains corner-cases. If your code has a corner-case that isn't in the spec, you have code that isn't necessary to meet the spec. (Or, more commonly, you have a crappy spec, but that just means you need to improve the spec.)
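
    A minimal sketch of handling that corner-case in Perl (safe_div is an invented name):

        sub safe_div {
            my ($numerator, $divisor) = @_;

            # The corner-case: the generalized rule "divide" fails here,
            # so the spec must say what happens for a divisor of 0.
            die "Division by zero\n" if $divisor == 0;

            return $numerator / $divisor;
        }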

  • Specification (aka spec)

    This is where the real-world situation(s) that the code in question should handle are described. This includes the various situations that might arise, the algorithms that need to be handled, and what the code can assume about the world. It also includes any performance requirements/constraints, portability requirements/constraints, the framework that will surround this code (exceptions or not, etc.), and anything else of a technical nature. Things that are not specified should be labelled as "implementation-dependent".

    Leap-years and leap-seconds are part of the spec for a date-handling module. Performance requirements like "This must run in under 1 second" are part of the spec.
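
    Encoding that part of the spec as a test makes it concrete (a sketch; is_leap_year is an invented function):

        use strict;
        use warnings;
        use Test::More tests => 3;

        # The Gregorian leap-year rule, straight from the spec.
        sub is_leap_year {
            my ($year) = @_;
            return ( $year % 4 == 0 && $year % 100 != 0 ) || $year % 400 == 0 ? 1 : 0;
        }

        is( is_leap_year(2000), 1, '2000 is a leap year (divisible by 400)' );
        is( is_leap_year(1900), 0, '1900 is not (divisible by 100 only)'    );
        is( is_leap_year(2004), 1, '2004 is (divisible by 4)'               );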

    In legacy code, the interface may be the implementation, but that does not mean that one is necessarily white-box testing.

  • Interface

    The interface is how anything outside the piece of code in question will deal with that piece of code. This is the list of functions/methods/variables/whatever that the user of this code will have available to them. This list should state what each of these items does, including any assumptions, parameters, and outputs. In the case of an exception-based system, it should also state which exceptions are thrown, which are handled, and any other pieces of code this item uses, so that the user can look there for unhandled exceptions.

    It is very common to have a private interface as well as a public interface; your private methods are tested against the private one. While this comes very close to white-box testing, it is arguable that the public interface you are providing simply uses a second body of code that is not for public consumption.
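
    As a sketch of the convention (the Queue package and its methods are invented for illustration):

        package Queue;

        use strict;
        use warnings;

        # Public interface: what users of this code may rely on.
        sub new     { my $class = shift; return bless { items => [] }, $class }
        sub enqueue { my $self  = shift; push @{ $self->{items} }, @_; return $self }
        sub dequeue { my $self  = shift; $self->_check_invariant; return shift @{ $self->{items} } }

        # Private interface: leading underscore by convention. Tested,
        # but not for public consumption.
        sub _check_invariant {
            my $self = shift;
            die "internal state corrupted\n" unless ref $self->{items} eq 'ARRAY';
        }

        1;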

  • Unit-tests / testsuite

    The unit-tests should verify that a given implementation of the interface, as documented, satisfies the specification, as documented.

  • Implementation

    This is the actual code that does the work. It provides the given interface and satisfies the given specification. It may be optimized, or not.

  • Black-box testing

    These are tests written solely against the interface to validate the implementation against the specification.
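
    In Perl terms, that might look like this (a sketch using the standard Test::More; it assumes the invented Queue sketched under "Interface" has been saved as Queue.pm somewhere in @INC):

        use strict;
        use warnings;
        use Test::More tests => 2;

        use Queue;

        # Only the documented public interface is exercised; nothing
        # here knows or cares how Queue stores its items internally.
        my $q = Queue->new;
        $q->enqueue('a', 'b');
        is( $q->dequeue, 'a', 'FIFO order preserved'         );
        is( $q->dequeue, 'b', 'second item comes out second' );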

  • White-box testing

    These are tests written against the implementation, not the specification. I consider white-box tests to be, in general, a CodeSmell. The only white-boxing I feel to be appropriate is mocking, and if mocking is required, that needs to be justified. (The justification may be as simple as "That's the spec.")
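
    As a sketch of the mocking case (Test::MockModule is on CPAN; the Weather::Report module and its fetch/summary methods are invented):

        use strict;
        use warnings;
        use Test::More tests => 1;
        use Test::MockModule;

        use Weather::Report;

        # Mock the network-facing method so the test never touches the
        # real service -- a white-box step that needs justification
        # (here: "the spec says tests must not require network access").
        my $mock = Test::MockModule->new('Weather::Report');
        $mock->mock( fetch => sub { return { temp_c => 21 } } );

        my $report = Weather::Report->new;
        is( $report->summary, 'Mild (21C)', 'summary built from fetched data' );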

  • Test-Driven Development (aka TDD)

    The practice of writing a test, seeing it fail, writing some code, then seeing the test pass.

    Note: TDD is not necessarily white-box testing, as some have proposed. I only have to point to Pugs for a good example of this.
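
    The cycle, in Perl terms (a toy example; add() is invented):

        use strict;
        use warnings;
        use Test::More tests => 1;

        # Step 1: the test is written first and fails, because add()
        # does not exist yet.
        is( add(2, 2), 4, 'add() sums two numbers' );

        # Step 2: write just enough code to make the test pass, then
        # refactor with the test as a safety net.
        sub add {
            my ($x, $y) = @_;
            return $x + $y;
        }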

  • Coverage statistics

    The process of determining which points in the code have been exercised by your testsuite. If there is less than 100% coverage, this is a good place to determine whether any of the following are true:

    • Your testsuite is incomplete vis-a-vis the specification
    • You have code that is not required to meet the specification
    • You have code that is untestable

    In any case, this is a CodeSmell that needs to be addressed. (Again, the addressing can be very simple.)
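
    In the Perl world, Devel::Cover measures this; a typical run over a Makefile.PL-based distribution looks something like this (shell commands, not Perl):

        cover -delete
        HARNESS_PERL_SWITCHES=-MDevel::Cover make test
        cover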


My criteria for good software:
  1. Does it work?
  2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?

Re: (OT) Black- vs. white-box testing
by blazar (Canon) on Oct 25, 2005 at 14:41 UTC
    I'm not terribly familiar with any of the terms above. But of course I've heard and used them too, more or less naively, and I think that my intuition/perception/understanding, if not perfect, should be close to the accepted common sense about them...

    Now the issue about Black-box vs. White-box testing is interesting, and I must say that I hadn't even heard the latter expression before. I'm not even really sure that I understand what it should be... could you please explain it to us less experienced programmers?

    I mean: I see how one can test an implementation against a specification of an interface. What are "tests written against the implementation"?

      I mean: I see how one can test an implementation against a specification of an interface. What are "tests written against the implementation"?

      One of the usual techniques is to examine the code as written and find all the decision points, which will lead to discovering all the possible paths of execution through the code. Tests can then be written to "exercise" each of these possible paths to ensure that every line of code behaves as expected.

      It is arguable how often this is useful and, of course, refactoring is likely to render some of these tests obsolete.
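
      For example (a trivial sketch; classify() is invented): a single decision point yields two paths, hence two tests.

          use strict;
          use warnings;
          use Test::More tests => 2;

          # One decision point, two possible paths of execution.
          sub classify {
              my ($n) = @_;
              return $n < 0 ? 'negative' : 'non-negative';
          }

          is( classify(-1), 'negative',     'exercises the true branch'  );
          is( classify(0),  'non-negative', 'exercises the false branch' );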

      Tests written against an implementation rely on internals of the code which are not part of the interface. As a very simple example, say you have a module which takes input A and B, merges that data into a data structure C, then performs some calculations on it and gives you back D. If you were testing against the interface (black-box), you'd write tests which put in A and B and check whether you get the correct D out. If you were to write a test against the implementation (white-box), you'd also check whether C conforms to your expectations or whether the algorithm to turn C into D works a particular way. If later the module writer decides to change the implementation and do without the intermediate C (or change it in some way), the test breaks.
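
      In code, the distinction might look like this (a self-contained sketch; the Merger package, its calc() function, and its _build_c() internal are all invented):

          use strict;
          use warnings;
          use Test::More tests => 2;

          # An invented module, inlined for the sketch: A and B are
          # merged into an intermediate structure C, which is then
          # turned into the output D.
          package Merger;
          sub _build_c { my ($x, $y) = @_; return { a => $x, b => $y } }
          sub calc     { my $c = _build_c(@_); return ( $c->{a} + $c->{b} ) ** 2 }

          package main;

          # Black-box: feed in A and B, check only the documented output D.
          is( Merger::calc( 2, 3 ), 25, 'calc(A, B) yields the right D' );

          # White-box: peek at the intermediate C. This test breaks if
          # the author later reworks or removes C, even though calc()
          # still behaves correctly at the interface.
          is_deeply( Merger::_build_c( 2, 3 ), { a => 2, b => 3 },
              'intermediate structure C looks as expected (fragile!)' );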

        Then I agree with the OP that such practices (white-box tests) are, in general, a CodeSmell. If one considers the data structure C important enough to warrant a test, then one should refactor, or enrich the interface (spec) so as to include it too.
      Just as a simple suggestion, you might want to pick up a book called The Pragmatic Programmer. It will easily answer all those questions, and make you come up with more.

      TStanley
      --------
      The only thing necessary for the triumph of evil is for good men to do nothing -- Edmund Burke
Re: (OT) Black- vs. white-box testing
by idsfa (Vicar) on Oct 25, 2005 at 16:00 UTC

    White-box testing — tests designed against the code which actually implements the functionality — is critical for evaluating the security of the code. Black-box testing (monkeys with typewriters) pounding at potential vulnerabilities is simply too inefficient to be valuable. It is good to have some standard black-box tests (like checking for buffer overflows), but even better to know "oh, this string gets eval'd — I'd better write a test to make sure it won't do anything stupid".
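
    For instance (a deliberately naive, self-contained sketch; render() is invented): knowing from the source that template text gets eval'd suggests a targeted test. Against the naive renderer below the test fails, which is exactly the point.

        use strict;
        use warnings;
        use Test::More tests => 1;

        # An invented, deliberately naive renderer that evals whatever
        # sits between [% and %] -- the kind of thing a white-box
        # reading of the source would flag.
        sub render {
            my ($template) = @_;
            $template =~ s/\[%(.*?)%\]/eval $1/ge;
            return $template;
        }

        # Targeted white-box test: injected code must not execute.
        our $pwned = 0;
        render('[% $main::pwned = 1 %]');
        is( $pwned, 0, 'injected code was not executed' );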


    The intelligent reader will judge for himself. Without examining the facts fully and fairly, there is no way of knowing whether vox populi is really vox dei, or merely vox asinorum. — Cyrus H. Gordon
      ++ Thanks for that insight. For some reason I hadn't considered the security of code. But I guess that's why I'm not a programmer by trade. :o)
Re: (OT) Black- vs. white-box testing
by itub (Priest) on Oct 25, 2005 at 16:12 UTC
Specifications?
by Anonymous Monk on Oct 25, 2005 at 17:14 UTC
    What is this strange term? My managers know not this word, and it troubles me greatly, for the very mystery of it resounds across the officeplace with portents of doom! ;-)
Re: (OT) Black- vs. white-box testing
by pg (Canon) on Oct 25, 2005 at 17:34 UTC
    "In legacy code, the interface may be the implementation, but that does not mean that one is necessarily white-box testing."

    It is interesting that "white-box testing" suddenly showed up.

    I think there is a mix-up. The way you test something has absolutely nothing to do with how it is spec'd out, or how it is implemented. The tester has a choice.

    As for white box and black box, let's use a maze as an example to explain this. If you draw your maze on a piece of paper, when I look at that piece of paper, I see a white box. If I am going through a real hay maze, I am seeing a black box.

      As for white box and black box, let's use a maze as an example to explain this. If you draw your maze on a piece of paper, when I look at that piece of paper, I see a white box. If I am going through a real hay maze, I am seeing a black box.

      No, white box testing is the map of all the distinct entrances and exits; you verify that each entrance leads to the correct exit, and that if you follow the right path, you'll come out in the right place. You're making sure that the design of the maze looks good, by looking at an aerial map or satellite photo.

      Black box testing is like standing there with a clipboard and noting: "Bill went in exit A, and came out exit B. Check! Sally went in entrance B, and came out exit D. Check! Fred went in entrace C, we heard loud screaming, and then silence. Hmm... the system appears to ocassionally hang when entrance C is used, especially when the input person enters carrying buckets of blood or raw steaks... interesting..."

      Debugging is wandering through the actual maze, in order to hunt down the monsters inside. Overflying the maze with a helicopter and a shotgun is using "perl -d". :-)

Re: (OT) Black- vs. white-box testing
by tilly (Archbishop) on Oct 26, 2005 at 03:04 UTC
    If I had realized that you made this into a meditation, then Re^12: Corner cases are not always "code smell" would have been a response to this node rather than being buried deep in that thread.

    To clarify one point, if you create a private API to allow tests of private functions, I consider that to be white-box testing. Those are tests that cover internal aspects of behaviour that nobody else should rely on, which might change in future versions.

    I feel that people who include private functions as part of the specification either have or are in great danger of missing the entire point of modularity. Loose coupling means that you pretend that you don't know about privates.

      To clarify one point, if you create a private API to allow tests of private functions, I consider that to be white-box testing. Those are tests that cover internal aspects of behaviour that nobody else should rely on, which might change in future versions.

      I would consider this black-box testing of the private API. The API you expose to the world and the API you expose to your code are rarely going to be identical. Of course, you should test the complete API, not just the API you expose to the world. If the rest of the world considers this white-box, then, in my public API, I'll consider this white-box as well. What it maps to in my personal implementation of this spec is irrelevant. :-)


      My criteria for good software:
      1. Does it work?
      2. Can someone else come in, make a change, and be reasonably certain no bugs were introduced?
        The reason why I call that white-box testing is that the private API is often explicitly designed to shine a white light on otherwise dark corners of the implementation. For instance, in the regex example, you might have a flag available telling you which version was actually run. That allows something which should be invisible (whether you've turned on the optimization) to be tested.
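
        A self-contained sketch of that instrumentation (match() and $LAST_ENGINE are invented):

            use strict;
            use warnings;
            use Test::More tests => 2;

            # match() picks an optimized or a fallback path and records
            # which one ran in a private flag, purely so tests can see it.
            our $LAST_ENGINE;
            sub match {
                my ($string, $pattern) = @_;
                $LAST_ENGINE = length($string) < 1024 ? 'optimized' : 'fallback';
                return $string =~ $pattern;
            }

            ok( match( 'abc', qr/b/ ),     'match still works at the interface' );
            is( $LAST_ENGINE, 'optimized', 'white-box: the optimized path ran'  );
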
Re: (OT) Black- vs. white-box testing
by snowhare (Friar) on Oct 26, 2005 at 04:19 UTC

    Any software of more than moderate complexity is impossible to fully test via blackbox testing. You can verify that it works for the inputs you tried - but without digging into the implementation you have no way to prove that, although it works for Z, Y and X, it will also work for W (which may cross an internal boundary in the implementation). Exhaustive blackbox testing is usually impossible.

    I remember once upon a time in high school (now a few decades in the past) that I solved a Calculus problem and got a numeric test case answer correct to 4 decimal places. My solution was wrong - I had by pure dumb luck picked a test case where my mistakes just happened to cancel out.

    My numeric test case was blackbox testing. The whitebox testing of examining my actual solution was what told me that even though my test case passed, the problem solution was not correct.

    Both have their place.
