Friday 19 February 2016

The Conway Heuristic

If you have an interest in heuristic approaches to problem solving and have not read Polya's "How to Solve It", [1], then I'd recommend it.

There are many heuristic approaches to problem solving that you probably use without realising it. The book may just help you spot and discover new problem-solving heuristics in the wild, [2].

It was whilst reading a version of Polya's book, with a foreword by John Conway, that I read this passage:
It is a paradoxical truth that to teach mathematics well, one must also know how to misunderstand it at least to the extent one’s students do! If a teacher’s statement can be parsed in two or more ways, it goes without saying that some students will understand it one way and others another, with results that can vary from the hilarious to the tragic.
This has a clear application (to my mind) to testing. Think of specifications, requirements or any other type of communication. We are generally very good at imprecise language, so the risk of miscommunication, misunderstanding, hidden assumptions remaining hidden or unsynchronised priorities is real. I'm sure we all have our favourite misunderstanding war stories.

In about 2012, [3], I re-stated this paragraph specifically for communication as:-

[Image: the paragraph re-stated as a heuristic for communication]

I called it the Conway Heuristic. It's something I use from time to time - to remind me that there might be only a loose or shallow agreement or understanding. I tend to think of this when weasel words or phrases are about. For example, saying something is "tested" - which can be interpreted as fault-free, when in fact there may be several issues remaining.

This is a heuristic for a couple of reasons: (1) it's a guide, and not foolproof; (2) it's impossible to think of all the ways something could be misinterpreted. So, it works like a useful check question: "is there, or could there be, a problem here?" If it helps remove the big mistakes then it's done its job.
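
As a concrete sketch of one sentence being parsed two ways, consider a made-up requirement: "prices are rounded to the nearest whole number". It reads as precise, yet two readers can honestly implement it differently. Here's a minimal Python illustration (the requirement and numbers are hypothetical; the behaviour of round() is real):

    import math

    price = 2.5

    # Reader A parses "nearest" as "round half up", as taught at school.
    half_up = math.floor(price + 0.5)   # -> 3

    # Reader B parses it as "whatever round() does". Python 3 rounds
    # half-to-even (banker's rounding), so the same words give 2.
    built_in = round(price)             # -> 2

    print(half_up, built_in)            # 3 2: one statement, two behaviours

Both readings would pass a casual review against the same sentence; only asking about the half-way case exposes the shallow agreement.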

So, wherever there are context-free questions or context-free reporting - or really, any generalised or context-free communication - keep this heuristic in mind. There be dragons....

References
[1] How to Solve It: A New Aspect of Mathematical Method; Polya
[2] On Thinking about Heuristic Discovery
[3] Testing Lessons from The Rolling Stones

Sunday 14 February 2016

Testing. What was the question?

Details, details, details. It seems people sometimes get sucked into debates and discussions about details and forget the bigger picture - or that there is a bigger picture.

TL;DR -> Jump to the Thoughts and Observations part

Do you recognise any of these?
  • Is it right to say I am automating a test?
  • Is it ok to have lots of automated tests?
  • Is it right to have a standard for software testing?
  • Do I need to distinguish between scripts that I code or encode and my testing?
Notice any pattern? The discussion usually focusses on the detail. This might be ok... But sometimes it's useful to have a tool, aid or heuristic to help judge whether that detail is needed or not. So, here's my control question for you, if you end up in such a discussion:
  • What set of questions are you working with that you want your testing to help answer? 
Meaning, what is the relation to your team's, project's or program's development activity? So, really,
  • WHY are you testing?
  • HOW are you testing?
  • WHAT are you testing?
Yes, it starts with "why". How you view or frame, [1], your testing will help you understand whether the detail is needed, needed now, or not at all.

The WHY
This could circle around ambitions, visions or goals of a team, company, product, campaign, program etc. 
  • Who is the receiver of my test information? Are there safety-critical or contractual aspects involved? (e.g. visibility of testing, whether by artefact or verbal report? Have I made a stakeholder analysis?)
  • How long will the product live and what type of support/development will it have? (e.g. support needs, re-use of scripts as change checkers? Supporting tools and frameworks to leave behind?)
  • What are the over-arching needs of the information from testing? (e.g. see reference [2])
  • Are there other factors to work with? (e.g. product or company strategic factors)
  • Check-question: 
    • Does answering the "why" help describe whether testing is needed, and in which cases it supports the product or team? Answering this question usually helps when working on the "how" and "what" of testing.
The HOW
Once the team, program or project has worked out the factors behind WHY some testing is needed, the HOW usually gets into the detail of, and the factors affecting, the approaches to testing.
  • What type of test strategy is needed, or what factors should I think about? (e.g. see reference [4])
  • What items or types of information do my stakeholders really want? (e.g. see reference [3])
  • How does the development and/or deployment pipeline look? Staging areas? Trial areas?
  • Does the team have dedicated equipment, or does it share or use CI machinery for support? Any orchestration factors?
  • Will the test strategy need tool support?
  • Is the test target a GUI, a protocol layer, some back-end interface, a combination or something else? (see the sketch after this list)
  • How do I and my team iterate through test results, assessing where we are now, and where we need to be? Do we have the right skills? (e.g. see references [6] & [7])
  • Check-question: 
    • Does answering the "how" help describe where the testing fits into the development of the product (i.e. an end-to-end view)? If not, then you might not be done answering this question.
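
To illustrate the layering question from the list above, here's a rough sketch in Python of checking the same fact at the protocol layer rather than through the GUI (the service, URL and endpoint names are hypothetical):

    import requests

    def check_status_api():
        # Protocol-layer check: talks straight to the back-end
        # interface - fast, and no browser needed.
        response = requests.get("https://example.test/status", timeout=5)
        assert response.status_code == 200
        assert response.json().get("state") == "up"

A GUI-layer check of the same fact would drive a browser instead (e.g. via Selenium) - slower, but it also exercises the rendering and the layers in between. Which layer you target is a "how" decision that should trace back to your "why".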
The WHAT
If and when you have worked out the "why" and the "how" then the artefact-heavy part of testing might be the next set of questions to work with. 
  • What is the split between re-using existing scripts and developing new ones?
  • What heuristic inputs do I use? Are they sufficient? Do I notice new ones or refine some? (e.g. see reference [5])
  • Now you can think about test design techniques (pick your favourite book, blog or list).
  • Extra tooling to create to support the testing (e.g. test selection and filtering aids - see the sketch after this list).
  • Check-question:
    • The techniques and components chosen here should support the "how" and the "why". Do they? What might be missing?
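
As an example of a test selection and filtering aid, here's a minimal sketch in Python (the test names and tags are made up; a real suite might use pytest markers for the same effect):

    # A toy inventory of tests, each with a set of tags.
    TESTS = {
        "test_login_happy_path":   {"smoke", "gui"},
        "test_login_bad_password": {"gui"},
        "test_status_endpoint":    {"smoke", "api"},
        "test_bulk_import":        {"api", "slow"},
    }

    def select(required_tags):
        """Return the tests whose tags include every required tag."""
        wanted = set(required_tags)
        return sorted(name for name, tags in TESTS.items() if wanted <= tags)

    print(select({"smoke"}))        # a quick pipeline stage
    print(select({"api", "slow"}))  # a nightly back-end run

The idea scales up: tag tests by purpose, then let each stage of the pipeline pick the subset that answers its question - connecting the "what" back to the "why".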
Thoughts and Observations
Notice that the "details" usually fall into the "what" or sometimes the "how" of testing? But - if you're not connecting them to YOUR "why" of testing then you might be getting yourself into a rabbit-hole. You might be - from a team, product or project perspective - locally optimising the effort.
  • So, details are fine as long as you're explaining the why, the how and the what together - and for what (product, company, team). This is the context.
Another way to look at it - if you're getting caught in details or not connecting the "why", "what" and "how" of testing in the context of your product, program, company or team then you might just be trying to solve the wrong problem. For help on trying to work out the right problem I'd recommend starting with Gause & Weinberg [8].

References