Monday, 22 August 2011

Framing: Some Decision and Analysis Frames in Testing


#softwaretesting #testing

What is a Frame?
The following is from Tversky and Kahneman's description of a decision frame, ref [1]:
We use the term "decision frame" to refer to the decision-maker's conception of the acts, outcomes, and contingencies associated with a particular choice.
The frame that a decision-maker adopts is controlled partly by the formulation of the problem and partly by the norms, habits, and personal characteristics of the decision-maker.
Using a decision frame to analyze a problem and come to a decision is what they call framing. So, I'll refer to a frame as relating to a decision (or analysis) frame.

Factors at play
As mentioned, many different factors affect how we analyze problems, including:

Temperament/Emotions
  • Anger
  • Happiness
  • Optimism
  • Pessimism
  • Tiredness
  • Fear (e.g. of failure)
Experience
  • Lessons from past situations - own experience and feedback
  • What has been learnt recently
  • Complacency due to familiarity
Strategy
  • Your own vs someone else's
  • Aggressive
  • Military campaign - lots of detailed planning
  • Reactive
The factors and the weight given to them might be different for:
  • Stakeholder view ("Upgrade needs to be reliable", "Of the new feature set only x is live in the quarter")
  • Tester view ("Which risks are most important to look at now?")
  • Developer view ("I did a fix, can you test that first?")

The stakeholder, developer, tester and any other role in the project each have a set view on the project's priorities and aims - agendas, maybe - and one challenge is in trying to tie these together, or at least to understand the differences and how they impact our communication. They may all share similar product goals, but their interpretations of their own work may differ - their influences and micro-decisions will be different, meaning that transparency in communication is important.

But, there's a catch - the way we present information can affect its interpretation - depending upon the frame that a stakeholder is adopting.

Think of a frame as a filter through which someone looks at a problem - they're taking in lots of data but only the data that gets through the filter gets attention (the rest may end up in the subconscious for later or isn't absorbed), "I have my show-stopper filter on today so I don't notice the progress the team has made…" 

So it's important to be aware of the different frames that each project member might adopt, as well as some traps associated with frame formulation.

Stakeholder Frames
Might include:
  • Emphasizing minimum time for product delivery
  • Emphasizing short iteration times and delivering quickly
  • Trying to minimize cost of product development (cost of testing?)
  • Emphasizing future re-use of the development environment (tendency to worship automation?)
  • Aiming for a reduced future maintenance cost
Tester Frames
Might include:
  • Emphasizing the favourite test approach
  • Emphasizing areas of greatest risk (to?)
  • Emphasizing the last successful heuristic that found a show-stopper bug
  • Emphasizing focus on the data configuration that found the most bugs the last time
  • Emphasizing conformance to a standard over a different aspect of the product
  • Emphasizing the backlog item that seems the most interesting
  • Emphasizing widespread regression as "fear of failure / breaking legacy" affects analysis
  • Emphasizing feature richness over stability
Note, in isolation, some of these frames may be good, but they might not necessarily be good enough.

Framing Problems in Testing

Functional Blindness or Selective Perception

Russo and Schoemaker called it Functional Blindness, ref [2]. This is the tendency to frame problems from your own area of work or study. Dearborn and Simon called this Selective Perception, ref [3], noting that managers often focus their attention on areas they are familiar with - sales executives treating sales as the top priority and production executives focussing on production.

In testing this may translate into:
  • Testers with mainly performance test experience focussing on those areas
  • Recent customer support experience leading to a preference to operational and configuration aspects
  • A generalist spreading effort evenly in many areas
Sunk-Cost Fallacy

This is the tendency to factor previous investments into the framing of the problem. A good example is James Bach's Golden Elephant Syndrome, ref [4].

In testing this may translate into:
  • The latest favourite tool or framework of the execs must be used as there has been so much investment in it.
Over-Confidence

As we've seen above, there can be many different ways of framing a problem, and it's important to be aware of this. There is a trap where testers think they've done everything they need to - that their model or models were the most adequate for the situation.

Here the warning is against complacency - re-evaluate periodically and tell the story against that assessment. It may be that an issue you find during testing affects some of your initial assumptions - the approach might be good, but maybe it could be better. 
(It might be that you can't change course/approach even if you wanted to, but that's good information for reporting to the stakeholder - areas for further investigation.)
Whatever your model, it's one model. Is it good enough for now? What does it lack - what product blind spots does it have?

Measurements and Numbers

Decision frames and framing sometimes involve a way of measuring whether the frame is good or useful - or whether alternatives are equivalent. There is a danger here when numbers and measurements get involved.

In business and everyday life there are occasions when figures and measurements are presented as absolutes, and other times when they're presented as relative figures. They can be misleading in both cases, especially when not used consistently.

Project stakeholders are probably very familiar with looking at project cost and overrun in absolute and relative terms - depending on how they want the information to shine.

So it's very easy for testers to be drawn into the numbers game - and even play it in terms of absolute or relative figures.
  • "This week we have covered 50% of the product"
  • "Our bug reports have increased 400% compared to the previous project"
  • "The number of tests to run is about 60% of the last project"
  • "5 bug reports have been implemented in this drop"
  • "All pre-defined tests have been run"
As you can (hopefully) see, this is just data - not necessarily information that can be interpreted. So, beware of number usage traps in problem analysis and formulation - both in the figures given to you and in those you send out.
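To make the absolute-versus-relative trap concrete, here is a small illustrative sketch (the bug counts are invented for the example):

```python
def relative_increase(before, after):
    """Percentage increase from one count to another."""
    return (after - before) / before * 100

# Two hypothetical projects reporting the same relative jump:
small = relative_increase(1, 5)      # 1 bug report  -> 5 bug reports
large = relative_increase(100, 500)  # 100 bug reports -> 500 bug reports

print(small, large)  # both 400.0 - identical relative figures
# Yet the absolute differences (4 extra reports vs 400 extra reports)
# tell completely different stories about the product and the test effort.
```

The same "400% increase" headline hides whether we're talking about a handful of reports or hundreds - which is exactly why a relative figure on its own is data, not information.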

Another problem with numbers and decision framing can be thought of as the certainty effect, ref [1]. This can affect how we frame problems - and even how we should communicate.
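A quick sketch of the certainty effect, using the classic sure-gain-versus-gamble arithmetic from Tversky and Kahneman (the payoff values here are illustrative):

```python
# Certainty effect: a sure outcome tends to be over-weighted relative to
# a gamble, even when the gamble has the higher expected value.
sure_thing = 30           # guaranteed payoff of 30
gamble_ev = 0.80 * 45     # 80% chance of 45, otherwise nothing

print(gamble_ev, gamble_ev > sure_thing)
# The gamble is worth more on average, yet many people still pick the
# certain 30. In testing terms, a certain-sounding "all pre-defined
# tests passed" can outweigh a richer but probabilistic risk story.
```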

Frames and Framing Need Maintenance

Analyze and periodically check that your assumptions are still correct. Sometimes the emphasis of the project changes - the problem to solve changes. Is the frame still right? Are the parameters of the problem still the same? Are the reference points and the ways in which you measure or judge the frame unchanged? If not, it's time to re-evaluate.

Working with Frames
  • What frames do you and your project / organization start with? (Subconscious default)
  • Are there alternative frames to consider? How many were investigated?
  • Look at what each frame includes and excludes
  • What is the best frame fit for the situation / project? (Do all involved agree on the 'good enough' frame?)
References
[1] The Framing of Decisions and the Psychology of Choice (Tversky & Kahneman, Science Vol 211, No. 4481)

[2] Decision Traps: The Ten Barriers to Brilliant Decision-Making and How to Overcome Them (Russo, Schoemaker, Fireside, 1990)

[3] Selective Perception: A Note on the Departmental Identifications of Executives (Dearborn, Simon, Sociometry Vol 21, No 2, June 1958)

[4] James Bach's "Golden Elephant" Syndrome (Weinberg, Perfect Software: And Other Illusions About Testing, Dorset House, 2008, p. 101)

[5] Calculated Risks: How to Know When Numbers Deceive You (Gigerenzer, Simon & Schuster, 2002)
