The Nature of Common Errors in Decision-Making
How do you save yourself from yourself? - Tom Selleck in ‘Blue Bloods’
There is a broad consensus, based largely on the work of Nobel Laureate Daniel Kahneman and his associates, that the human brain operates with two systems. System I makes decisions automatically, quickly, and effortlessly, while System II requires effort, thinks deliberately, and decides what to think about and what to do.
In the field, administrators build up a large, diverse repertoire of experiences through first-hand contact with reality. Because these experiences are new, administrators rely extensively on System II. However, using System II is effortful and consumes a lot of energy and attentional resources.
Thus, the brain starts to learn patterns and builds up a sort of “muscle memory”. Regular practice at making decisions in near-similar situations enables administrators to decide without having to think much about it.
This energy-saving autopilot heralds the switch from System II to System I.
There is, however, a cost: the autopilot develops several systematic biases. Because these biases are embedded in automatic thinking, we are unaware of them during the decision-making process. Below are a few of the most common biases (researchers have catalogued more than 90).
Confirmation Bias
Often, administrators believe that their long years of experience have given them infallible knowledge of how to fix problems. They then pick information that fits their existing experience and pay less attention to information that challenges it. As a result, they ignore better alternatives, leading to sub-optimal results.
Sunk-cost Bias
Achieving results requires effort, time, and resources. This investment blinds administrators to changing reality: even after overwhelming evidence shows that a programme is going astray, they continue with the existing course of action. They focus on the past, on the time and effort already spent, rather than on what the programme will deliver in the future. The (flawed) justification is that the resources already invested will be wasted if the course is changed mid-way.
Anchoring Bias
Here, administrators take an initial position, rest their decisions on that stand, and fail to move sufficiently away from their first point of view. All administrators know the stickiness of the first noting on a file; as Paul Appleby observed in his 1953 report on public administration, this influence is more unconscious than conscious: the “view of the man at the bottom of the hierarchy who writes the first note on a file is all important in most instances”.
One simple way to reduce the effect of these biases is distanced self-talk. In distanced self-talk, you refer to yourself by your own name and with the second-person pronoun “you”, rather than “I”. For example, an administrator named Asha would ask herself, “Asha, what should you do?” instead of “What should I do?” Most of us find it much easier to give advice to other people than to take that advice ourselves. Addressing yourself by name and as “you” gets you to think about the decision as though another person were making it, which makes it easier to notice and address the bias. Distancing ourselves from the problem in this way gives us the objectivity required to (at least somewhat) overcome our cognitive biases.