One of the learners on an organization theory program I am teaching said that he was interested in the course because "specifically I am trying to understand why some of the most grievous mistakes that I have seen made are being made by those who know better, but are overconfident and continue on in the face of issues that they normally would have taken as signals to stop the job."
The other day I mentioned Gary Klein's book Sources of Power: How People Make Decisions (see the Toyota piece). Now he has come up on my radar again: he has written a couple of other books more specifically on intuition and decision making, the most recent being Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making, and he also appears in the current issue of McKinsey Quarterly (which I was reading just before I saw the piece from the learner), where he and Daniel Kahneman 'debate the power and perils of intuition for senior executives'.
The connection between intuition and over-confidence is made by Klein in this discussion with Kahneman, when he is asked: "Under what conditions are the intuitions of professionals worthy of trust?"
Klein replied: It depends on what you mean by "trust." If you mean, "My gut feeling is telling me this; therefore I can act on it and I don't have to worry," we say you should never trust your gut. You need to take your gut feeling as an important data point, but then you have to consciously and deliberately evaluate it, to see if it makes sense in this context. You need strategies that help rule things out. That's the opposite of saying, "This is what my gut is telling me; let me gather information to confirm it."
Kahneman follows up saying: There are some conditions where you have to trust your intuition. When you are under time pressure for a decision, you need to follow intuition. My general view, though, would be that you should not take your intuitions at face value. Overconfidence is a powerful source of illusions, primarily determined by the quality and coherence of the story that you can construct, not by its validity. If people can construct a simple and coherent story, they will feel confident regardless of how well grounded it is in reality.
Turning to the RSA Journal (also in the pile of things I am reading), I came across an article by John Kay, drawn from his book Obliquity. Kay tells the story of the hedgehog who knows one big thing and the fox who knows many little things. He reports that Philip Tetlock, a political economist, used this distinction to study political judgments, noting that:
But Tetlock's most striking discovery is that, although the foxes perform better in terms of the quality of their judgments, the hedgehogs perform better in terms of public acclaim. Hedgehogs are people who know the answers. Foxes know the limitations of their knowledge. The confident certainties of hedgehogs attract the attention of politicians and business leaders. Give me a one-handed economist, goes the saying, but careful judgment is often a matter of 'on the one hand, and on the other'. Yet hedgehogs, who claim to predict the future, will always attract a larger audience than foxes, who acknowledge they can't, even if the larger audience learns nothing useful from the predictions.
This brings me back (almost) full circle to yesterday's blog on Dunbar's number and the concept of 'opinion leaders'. Mashing the theories together (Klein's, Kahneman's, Kay's, and Dunbar's) yields a hypothesis: there are opinion leaders, i.e. hedgehogs, who (over-confidently) rely on their intuition to make decisions, and these decisions are not good ones. Indeed Kay makes the point that "It is hard to overstate the damage that has recently been done by people who thought they knew more about the world than they really did", and he goes on to give some examples.
Assuming there is something to the hypothesis (the learner could test it), what would stop leaders from leaping into an over-confident decision, failing to review its outcomes, and not stopping things if they were going wrong? The articles offer three suggestions:
- Select leaders differently. Kahneman makes the point that "One of the real dangers of leader selection in many organizations: leaders are selected for overconfidence. We associate leadership with decisiveness. That perception of leadership pushes people to make decisions fairly quickly, lest they be seen as dithering and indecisive."
- Teach people to continuously challenge their assumptions.
- Do the 'premortem exercise'. Before acting on a decision, people should say, "We're looking in a crystal ball, and this decision was wrong; it failed to deliver; it was a fiasco. Now, everybody, take two minutes and write down all the reasons why you think the decision failed."