Tuesday, 22 September 2009

More notes on testing & checking

Read Michael's series on Testing vs Checking. I've also made observations here and here.

Since seeing the first post I've been thinking, why?

Why the need for the distinction? I've been getting along OK without it. I'm by no means a perfect example of how a tester should communicate - but one thing I do quite well in my work environment is tailor the message to the receiver (or at least I try!)

That may be one reason that I haven't been bitten by the testing/checking bug. Or has my mind been blown? Ahhhh....

Anyway, about 5 days ago I wanted someone to point out the root problem, the root cause for the distinction (or at least one of the root problems). My idea was that the distinction arose from communication problems, and so I tweeted to Michael about this:-

YorkyAbroad: @michaelbolton Nice latest post. The root cause of the need for the distinction hasn't been addressed yet. Or?
about 6 days ago

michaelbolton: A test, for you: what *might* be the cause for the need of the distinction?
about 5 days ago

michaelbolton: That is, I'm asking you: if you *see* the distinction (between testing and checking), what use could you make of it yourself?
about 5 days ago

YorkyAbroad: @michaelbolton Understand the diff bet. checking & testing as presented. A symptom of a problem is being addressed not the cause. Waiting...
about 5 days ago

michaelbolton: Sorry that you're not willing to respond to my request. You'll have your answer within a few days; please bear with me.
about 5 days ago

YorkyAbroad: @michaelbolton See checking akin to part of a test question. Little scope for checking that is not preceded or followed by sapience.
about 5 days ago

YorkyAbroad: @michaelbolton Am willing. Distinction needed in the communication (a root issue) of observations & feedback into a questioning approach.
about 5 days ago

michaelbolton: I'm finding it hard to read you in < 140. Why not send an email; I'd appreciate that.
about 5 days ago

YorkyAbroad: @michaelbolton I was "testing" if "the woods from the trees" was being distinguished. Will work on my own answer. Look forward to yours.
about 5 days ago

If anything is obvious from the exchanges it's how limiting 140 characters can be when trying to communicate more complicated questions... You might also notice the Socratic fencing/avoidance - each wanting the other to answer their own question. So, time to resort to email.

Bad checking? Communication?
The following is extracted from a mail I sent to Michael detailing some thoughts on bad/good checking and communication; I thought they'd be interesting to insert here:-

In my own area of work we have a fair amount of both testing and checking.

I can think of good checking (the methodology, not the action) and bad checking.

Bad checking is where a regression suite is used to give a "verdict" on a software build. I liked your last post questioning whether the tester is looking at the whole picture rather than just waiting for a "pass" result and moving on. But it can also be bad checking if we set off a suite of tests without asking the questions: "What is the goal here with these tests? What areas of the product am I more/less interested in? Should I pay particular attention to feature X?"

These questions have to be asked before executing a test manually or by automation - hard work maybe but if we don't ask them then we slip into the "regression coma" or your green bar pacifier. This is bad checking - or maybe just checking but "bad" testing.

So, to distinguish "good/bad checking": good checking is preceded by a bunch of questions - some more vague than others - but they are there for a purpose. They have to be there: why test something without any purpose? Learning, going on gut instinct, "I've seen a problem in this area in earlier releases..." are all valid purposes for targeting a test session. But testing without thinking is a problem, and a dangerous one - the "question/goal" mutates into "I don't want to find a problem" and becomes a self-fulfilling goal (in the long run!)

That's why I think that checking and sapience can never get very far apart. The "check" is an element of the test - where the test is preceded by the question or goal of the test (pre-defined or not). So understanding the distinction (between testing and checking) also means needing to understand whether a tester is asking any questions, and appreciating that a check is an element of a test.
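To make the relationship concrete, here's a minimal sketch (all names and the scenario are invented for illustration) of a "check" as the machine-decidable assertion, embedded inside a "test" that supplies the question and the interpretation:

```python
# Illustrative sketch: the "check" is the binary, machine-decidable
# observation; the surrounding "test" carries the purpose (the question
# being asked) and the human interpretation of the result.

def check_login_returns_token(response):
    """The check: a yes/no observation a machine can make on its own."""
    return response.get("status") == 200 and "token" in response

def test_login():
    """The test: question -> check -> interpretation."""
    # Question driving this session (the sapient part): "does login
    # still issue a token after the recent session-handling change?"
    response = {"status": 200, "token": "abc123"}  # stand-in for a real call
    passed = check_login_returns_token(response)
    # Interpretation is more than the verdict: before moving on, a
    # tester also asks "what else did I notice while this ran?"
    return passed

assert test_login() is True
```

The point of the sketch is that stripping away the question and the interpretation leaves only `check_login_returns_token` - a check without a test around it, which is the "regression coma" risk described above.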

On to one of the root problems that I alluded to: communication, specifically between tester and non-tester (although it can very easily be tester-tester too). I'm thinking of the issue where people talk past each other - a good example was in a post by Elisabeth: http://testobsessed.com/2009/05/18/from-the-mailbox-fully-automated-gui-testing/

Communication is a must in testing - whether it's understanding a requirement or a scope of work from a stakeholder, relating the observations made during a test session, making recommendations about areas for further work, etc. BUT it also means communicating what's going on at any stage of testing - the different types of test and their different purposes - maybe even the different goals/questions that go along with each.

If testers (or test leaders) can't get that information over to the receiver then there will always be the "testing trap" - the testing information black hole. If a tester now thinks they can answer a dev or stakeholder by throwing "check, test, sapience" terminology at them, they're on ground just as shaky as before (if they didn't understand the differences in test types and how to communicate them).

The check vs test distinction can help, but only if you had a problem communicating about your testing activities and understand that you have/had that problem. The distinction goes part-way, but I think it'd be even better to start with an "enabler": a question a tester can use to discover/realise that this problem exists for them.

If you read Michael's posts you'll see that he's answering questions that are coming into him left, right and centre. It really is a discussion (rather than a monologue) - exactly in the spirit of a good blog. That's good to see, and commendable!

More digging
I wanted to follow up the communication angle by starting a discussion on STC & TR about communication experiences where the testing/checking distinction might have been a factor (a help or a hindrance), but I think it's all getting a bit obscure now - either people don't follow the whole thread, don't understand (mine and others' bad communication) or they just don't have the need/motivation to be interested (a big proportion, I suspect!)

Communication Follow-Up
However, I'm fascinated by communication and information flow - so I might re-jig the question to be more general and follow up on how testers experience communication issues and get around them...

I'm very much in the Tor Norretranders school of information flow - that the sender and receiver have to share perspectives (context even) before they understand each other!!!

This syncing of perspective takes time! So if someone doesn't understand your message the first time around, it could be poor articulation - but it could also be that you're not yet sharing common ground.

I'm not quite signing out of the testing/checking discussion but I'm following other things in the pipeline, blog posts I want to finish, a couple of conference presentations plus my regular day job :)

Have you ever experienced difficulty getting your message across?


  1. Short answer: Yes

    Slightly longer answer: Saw your post on STC but didn't quite understand what it was you were after

    I am learning how to communicate better, blogs like this that are making me think are useful

  2. Thanks Phil!

    Once upon a time it was education, education, education then location, location, location...

    I'd like to think the next "obsession" is communication, communication, communication - and really to understand and improve it.

  3. Hi Simon,

    I don't see communication as the reason why we need to distinguish between the two, but perhaps I didn't fully understand your post :)

    For me the testing and checking differentiation is great. I can see the benefit in a few ways:
    1) clarification what to automate and what not to automate.
    2) in agile techniques, it can be used as a way of explaining what else, apart from automation, a tester can bring to the table
    3) can challenge us testers to review how we regression test, and challenge us to perform more testing in our regression.

    I suspect I may have missed your point though, by the time I got through and understood the twitter conversation, I had gotten a bit lost.

  4. Hi Anne-Marie,

    I think I understand your points :) Thanks for the comments!

    When I talk about communication it's at the level about being able to differentiate those types of test aspects (the so-called test or check) and even talk about them (explain the differences) to somebody else.

    If a tester can do this already then I suspect they don't need the labels of check and test. That has been the essence of my posts.

    I can understand that the distinction /can/ help those that can't make the distinction - but then I come back to "what is the root problem?" I would really like to understand the root problem...

    On your points:

    1. What to automate (or not) has more factors than just if something is machine-detectable - and I would hope that most testers would understand the machine-detectable parts without a change of terminology.

    2. I don't think the distinction has to be limited to agile and I think the "parts brought to the table" can be described with existing terminology.

    3. I've indicated in all these posts that regression testing requires the "thought process" to determine the selection, determine the questions to be answered and also to help interpret the results. This can (and I hope, should) happen irrespective of terminology.

    Of course, I'm giving /my/ answers based on my experience - I have no idea if the terminology is needed for a tester taken at random or whether something else would suit them better than new terminology. It's a case-by-case scenario.

    In the course of this reply I almost created my own terminology... "make the distinction" was "make the distinguishment" ;-)

  5. Hi Simon,

    I'm with Anne-Marie in that the distinction makes it much easier for me to communicate to non-testers where we can and cannot automate.

    It does sometimes feel like distinctions are made as a way of giving credit to our industry or as a way of making testing sound more important. But this distinction between checking and testing is one that I can associate with.

    In fact, I've been spouting the same stuff for years to my teams and management, just not using those terms and not in as clear and distinct ways.

    I like your questioning of it and I think it's good to try and understand communication in software testing.

    Not sure whether you saw any of my early posts over at PAC (http://pac-testing.blogspot.com) but these were aimed at understanding communication.

    I look forward to seeing your output from your investigations and journey in to communications.

    Where are your presentations? Anywhere near rainy England?


  6. Hi Rob,

    Where communication is concerned it's very much a personal thing - which is one reason I've been saying that I don't think /I'm/ missing the distinction - but that doesn't mean that it doesn't work for somebody else. (The flip-side of inductive logic for the philosophers in the crowd. Sorry.)

    My main reason for asking /why/ was to get behind/beneath/past the distinction. I don't think I've done that yet - I don't think anyone else has either. Again, that shouldn't be a barrier to people adopting the distinction if that's what feels right for them.

    If and when I do get behind it, I think it will get me a better picture... Any "enlightenment" will be passed on.

    A lot of my posts started as "thinking aloud" posts - I really do appreciate all comments here - as it makes me think! So if you're wondering about commenting or not, come on down - nobody bites. Or just email...

    Presentations: Sweden & UK (I haven't confirmed this one yet)