“I was looking at it from the wrong side! I could have saved myself so much time if I looked from the other side”
In our retro today we talked about getting blindsided by familiarity and assumptions. One of the team shared how he had been looking at a problem from the wrong angle. He thought he was chasing a bug, because we were returning partial information in an automated call (ironic). The real problem turned out to be a piece of older code, written for one purpose, being appropriated for a new purpose it was never designed to serve. Pretty obvious, right? Unfortunately, fixating on what’s in front of us at the time often stops us from taking in the broader picture. Instead of stepping back, suspending judgement and engaging with the problem through fresh eyes, we got blindsided. Our narrow outlook kept us from looking beyond the obvious.
In my nephew’s weekly school bulletin, I learned the story of Abraham Wald.
During World War II, the US military studied planes that returned from their missions, counting the bullet holes on various parts of each aircraft. The returning planes showed similar concentrations of damage, and the commanders wanted to place extra armour in the areas with the highest density of bullet holes.
Abraham Wald, a statistician, pointed out a critical flaw in the analysis: the command had only looked at the airplanes that returned. His statistical reasoning was counterintuitive. If a plane made it back safely with, say, bullet holes in the fuselage, those hits couldn’t have been the most damaging. Armour was needed on the sections that, on average, had few bullet holes, such as the cockpit or the engines, because planes hit in those parts never made it back.
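Wald’s logic is easy to see in a quick simulation. The sketch below is a toy Python model with entirely made-up numbers: hits land uniformly across four sections of the plane, but hits to the engine or cockpit are far more likely to bring it down. Counting bullet holes only on the survivors, as the command did, then makes the engine and cockpit look like the safest sections.

```python
import random

# Toy model, hypothetical numbers: hits land uniformly across four
# sections, but a hit to the engine or cockpit is far more likely
# to bring the plane down.
SECTIONS = ["fuselage", "wings", "engine", "cockpit"]
LETHALITY = {"fuselage": 0.05, "wings": 0.05, "engine": 0.6, "cockpit": 0.6}

def fly_mission(num_hits=6):
    """Return (hits_per_section, survived) for one sortie."""
    hits = {s: 0 for s in SECTIONS}
    survived = True
    for _ in range(num_hits):
        section = random.choice(SECTIONS)
        hits[section] += 1
        if random.random() < LETHALITY[section]:
            survived = False
    return hits, survived

random.seed(42)
missions = [fly_mission() for _ in range(10_000)]
survivors = [hits for hits, ok in missions if ok]

# Count bullet holes only on the planes that came back, as the command did.
for section in SECTIONS:
    avg = sum(h[section] for h in survivors) / len(survivors)
    print(f"{section:>8}: {avg:.2f} hits per surviving plane")
```

The survivors show plenty of fuselage and wing damage and very little on the engine or cockpit, not because those sections are hit less often, but because planes hit there rarely return.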
This oversight happens when we look at one thing with bias, without questioning whether we are looking at it the right way, or whether we even have the full picture at hand. The image of Wald’s bullet-riddled plane drives us to think about what lies beyond the obvious and to resist the temptation to jump to conclusions. Opportunities to strengthen our position come from reviewing historical data points, observing the patterns and asking what’s not there. We must recognise the temptation and trap of nearsightedness, and move beyond it to see a fuller picture, no matter how blurry it is or how vulnerable we may feel.
In any crisis, one of our challenges is to think about what we are looking at and where we should act: on the obvious, or by reasoning, questioning and learning. In our successes we learn what we do well; in our losses we see our true vulnerabilities.
Another take on this is the mathematician Carl Jacobi’s maxim: “Invert, always, invert”.
“The algorithm for inversion is very simple:
- Define the problem – what is it that you’re trying to achieve?
- Invert it – what would guarantee the failure to achieve this outcome?
- Finally, consider solutions to avoid this failure”
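The steps read almost like pseudocode, so here is a playful Python sketch of the same checklist, applied to the retro story above. The goal, failure modes and countermeasures are all invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class Inversion:
    """Work the problem backwards: list what guarantees failure, then avoid it."""
    goal: str                                                      # 1. define the problem
    failure_modes: list[str] = field(default_factory=list)         # 2. invert it
    countermeasures: dict[str, str] = field(default_factory=dict)  # 3. avoid the failure

    def report(self) -> str:
        lines = [f"Goal: {self.goal}", "To guarantee failure we would:"]
        for mode in self.failure_modes:
            fix = self.countermeasures.get(mode, "(no countermeasure yet)")
            lines.append(f"  - {mode} -> avoid by: {fix}")
        return "\n".join(lines)

# Hypothetical example: inverting this sprint's retro finding.
retro = Inversion(
    goal="Diagnose the partial-data bug quickly",
    failure_modes=[
        "commit to the first diagnosis",
        "reuse old code without checking its original purpose",
    ],
    countermeasures={
        "commit to the first diagnosis":
            "step back and test a second hypothesis",
        "reuse old code without checking its original purpose":
            "review legacy code against the job it now serves",
    },
)
print(retro.report())
```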
Survivorship bias is the logical error of concentrating on the people or things that made it past some selection process while overlooking those that did not, typically because of their lack of visibility. It can lead to false conclusions in several different ways. Avoid the tunnel vision that comes from looking at only one metric; view a situation from multiple angles, building a picture enriched with insights from several metrics.
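To make the multi-metric point concrete, here is a hypothetical sprint read two ways. Every number below is invented: velocity alone looks like a win, while the wider view of the same sprint tells a different story.

```python
# Hypothetical sprint data: a single metric hides what several reveal.
sprint_metrics = {
    "velocity (points)": 42,        # up on last sprint: looks healthy in isolation
    "carry-over stories": 5,        # work silently slipping to the next sprint
    "escaped defects": 7,           # quality paying for the speed
    "review turnaround (hrs)": 30,  # collaboration slowing down
}

for metric, value in sprint_metrics.items():
    print(f"{metric:>24}: {value}")
```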
This year has exposed our teams to our own vulnerabilities, but it has also taught us about our strength and resilience. When we look beyond the obvious, we seek the fuller picture and gain far more insight into what drives success.
Umano is on a mission to help self-managed agile delivery teams perform at their best by providing analytics and automated assistance to guide their continuous improvement.
Sign up here to access your complimentary Umano account and see how your team’s agile sprint practices are tracking.