Strategic decisions: When can you trust your gut?
Nobel laureate Daniel Kahneman and psychologist Gary Klein debate the power and perils of intuition for senior executives.
For two scholars representing opposing schools of thought, Daniel
Kahneman and Gary Klein find a surprising amount of common ground. Kahneman, a
psychologist, won the Nobel Prize in economics in 2002 for prospect theory,
which helps explain the sometimes counterintuitive choices people make under
uncertainty. Klein, a senior scientist at MacroCognition, has focused on the
power of intuition to support good decision making in high-pressure
environments, such as firefighting and intensive-care units.
In a September 2009 American Psychologist article titled “Conditions for intuitive expertise: A
failure to disagree,” Kahneman and Klein debated the circumstances in which
intuition would yield good decision making. In this interview with Olivier
Sibony, a director in McKinsey’s Brussels office, and Dan Lovallo, a professor
at the University of Sydney and an adviser to McKinsey, Kahneman and Klein
explore the power and perils of intuition for senior executives.
The Quarterly: In your recent American Psychologist article,
you asked a question that should be interesting to just about all executives:
“Under what conditions are the intuitions of professionals worthy of trust?”
What’s your answer? When can executives trust their guts?
Gary Klein: It depends on what you mean by “trust.” If you
mean, “My gut feeling is telling me this; therefore I can act on it and I don’t
have to worry,” we say you should never trust your gut. You need to take your
gut feeling as an important data point, but then you have to consciously and
deliberately evaluate it, to see if it makes sense in this context. You need
strategies that help rule things out. That’s the opposite of saying, “This is
what my gut is telling me; let me gather information to confirm it.”
Daniel Kahneman: There are some conditions where you have to trust
your intuition. When you are under time pressure for a decision, you need to
follow intuition. My general view, though, would be that you should not take
your intuitions at face value. Overconfidence is a powerful source of
illusions, primarily determined by the quality and coherence of the story that
you can construct, not by its validity. If people can construct a simple and
coherent story, they will feel confident regardless of how well grounded it is
in reality.
The Quarterly: Is intuition more reliable under certain
conditions?
Gary Klein: We identified two. First, there needs to be a
certain structure to a situation, a certain predictability that allows you to
have a basis for the intuition. If a situation is very, very turbulent, we say
it has low validity, and there’s no basis for intuition. For example, you
shouldn’t trust the judgments of stockbrokers picking individual stocks. The
second factor is whether decision makers have a chance to get feedback on their
judgments, so that they can strengthen them and gain expertise. If those
criteria aren’t met, then intuitions aren’t going to be trustworthy.
Most corporate decisions aren’t going to
meet the test of high validity. But they’re going to be way above the
low-validity situations that we worry about. Many business intuitions and
expertise are going to be valuable; they are telling you something useful, and
you want to take advantage of them.
Daniel Kahneman: This is an area of difference between Gary and me.
I would be wary of experts’ intuition, except when they deal with something
that they have dealt with a lot in the past. Surgeons, for example, do many
operations of a given kind, and they learn what problems they’re going to
encounter. But when problems are unique, or fairly unique, then I would be less
trusting of intuition than Gary is. One of the problems with expertise is that
people have it in some domains and not in others. So experts don’t know exactly
where the boundaries of their expertise are.
The Quarterly: Many executives would argue that major strategic
decisions, such as market entry, M&A, or R&D investments, take place in
environments where their experience counts—what you might call high-validity
environments. Are they right?
Gary Klein: None of those really involve high-validity
environments, but there’s enough structure for executives to listen to their
intuitions. I’d like to see a mental simulation that involves looking at ways
each of the options could play out or imagining ways that they could go sour,
as well as discovering why people are excited about them.
Daniel Kahneman: In strategic decisions, I’d be really concerned
about overconfidence. There are often entire aspects of the problem that you
can’t see—for example, am I ignoring what competitors might do? An executive
might have a very strong intuition that a given product has promise, without
considering the probability that a rival is already ahead in developing the
same product. I’d add that the amount of success it takes for leaders to become
overconfident isn’t terribly large. Some achieve a reputation for great
successes when in fact all they have done is take chances that reasonable
people wouldn’t take.
Gary Klein: Danny and I are in agreement that by the time
executives get to high levels, they are good at making others feel confident in
their judgment, even if there’s no strong basis for the judgment.
The Quarterly: So you would argue that selection processes for
leaders tend to favor lucky risk takers rather than the wise?
Daniel Kahneman: No question—if there’s a bias, it’s in that
direction. Beyond that, lucky risk takers use hindsight to reinforce their
feeling that their gut is very wise. Hindsight also reinforces others’ trust in
that individual’s gut. That’s one of the real dangers of leader selection in
many organizations: leaders are selected for overconfidence. We associate
leadership with decisiveness. That perception of leadership pushes people to
make decisions fairly quickly, lest they be seen as dithering and indecisive.
Gary Klein: I agree. Society’s epitome of credibility is John
Wayne, who sizes up a situation and says, “Here’s what I’m going to do”—and you
follow him. We both worry about leaders in complex situations who don’t have
enough experience, who are just going with their intuition and not monitoring
it, not thinking about it.
Daniel Kahneman: There’s a cost to not being John
Wayne, since there really is a strong expectation that leaders will be decisive
and act quickly. We deeply want to be led by people who know what they’re doing
and who don’t have to think about it too much.
The Quarterly: Who would be your poster child for the “non–John
Wayne” type of leader?
Gary Klein: I met a lieutenant general in Iraq who told me a
marvelous story about his first year there. He kept learning things he didn’t
know. He did that by continuously challenging his assumptions when he realized
he was wrong. At the end of the year, he had a completely different view of how
to do things, and he didn’t lose credibility. Another example I would offer is
Lou Gerstner when he went to IBM. He entered an industry that he didn’t
understand. He didn’t pretend to understand the nuances, but he was seen as
intelligent and open minded, and he gained trust very quickly.
The Quarterly: A moment ago, Gary, you talked about imagining
ways a decision could go sour. That sounds reminiscent of your “premortem”
technique. Could you please say a little more about that?
Gary Klein: The premortem technique is a sneaky way to get
people to do contrarian, devil’s advocate thinking without encountering
resistance. If a project goes poorly, there will be a lessons-learned session
that looks at what went wrong and why the project failed—like a medical
postmortem. Why don’t we do that up front? Before a project starts, we should
say, “We’re looking in a crystal ball, and this project has failed; it’s a
fiasco. Now, everybody, take two minutes and write down all the reasons why you
think the project failed.”
The logic is that instead of showing
people that you are smart because you can come up with a good plan, you show
you’re smart by thinking of insightful reasons why this project might go south.
If you make it part of your corporate culture, then you create an interesting
competition: “I want to come up with some possible problem that other people
haven’t even thought of.” The whole dynamic changes from trying to avoid
anything that might disrupt harmony to trying to surface potential problems.
Daniel Kahneman: The premortem is a great idea. I mentioned it at
Davos—giving full credit to Gary—and the chairman of a large corporation said
it was worth coming to Davos for. The beauty of the premortem is that it is
very easy to do. My guess is that, in general, doing a premortem on a plan that
is about to be adopted won’t cause it to be abandoned. But it will probably be
tweaked in ways that everybody will recognize as beneficial. So the premortem
is a low-cost, high-payoff kind of thing.
The Quarterly: It sounds like you agree on the benefits of the
premortem and in your thinking about leadership. Where don’t you see eye to
eye?
Daniel Kahneman: I like checklists as a solution; Gary doesn’t.
Gary Klein: I’m not an opponent of checklists for
high-validity environments with repetitive tasks. I don’t want my pilot
forgetting to fill out the pretakeoff checklist! But I’m less enthusiastic
about checklists when you move into environments that are more complex and
ambiguous, because that’s where you need expertise. Checklists are about
if/then statements. The checklist tells you the “then,” but you need expertise
to determine the “if”—has the condition been satisfied? In a dynamic, ambiguous
environment, this requires judgment, and it’s hard to put that into checklists.
Daniel Kahneman: I disagree. In situations where you don’t have
high validity, that’s where you need checklists the most. The checklist doesn’t
guarantee that you won’t make errors when the situation is uncertain. But it
may prevent you from being overconfident. I view that as a good thing.
The problem is that people don’t really
like checklists; there’s resistance to them. So you have to turn them into a
standard operating procedure—for example, at the stage of due diligence, when
board members go through a checklist before they approve a decision. A
checklist like that would be about process, not content. I don’t think you can
have checklists and quality control all over the place, but in a few strategic
environments, I think they are worth trying.
The Quarterly: What should be on a checklist when an executive is
making an important strategic decision?
Daniel Kahneman: I would ask about the quality and independence of
information. Is it coming from multiple sources or just one source that’s being
regurgitated in different ways? Is there a possibility of groupthink? Does the
leader have an opinion that seems to be influencing others? I would ask where
every number comes from and would try to postpone the achievement of group
consensus. Fragmenting problems and keeping judgments independent helps
decorrelate errors of judgment.
The Quarterly: Could you explain what you mean by “correlated
errors”?
Daniel Kahneman: Sure. There’s a classic experiment where you ask
people to estimate how many coins there are in a transparent jar. When people
do that independently, the accuracy of the averaged judgment rises with the number of estimates. But if people hear each other make
estimates, the first one influences the second, which influences the third, and
so on. That’s what I call a correlated error.
Frankly, I’m surprised that when you have
a reasonably well-informed group—say, they have read all the background
materials—it isn’t more common to begin by having everyone write their
conclusions on a slip of paper. If you don’t do that, the discussion will
create an enormous amount of conformity that reduces the quality of the
judgment.
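To make the effect of correlated errors concrete, here is a minimal simulation sketch of the coin-jar example (in Python; the jar count, noise level, group size, and anchoring strength are illustrative assumptions, not figures from the interview). It compares the error of the group's averaged estimate when guesses are made privately with the error when each guess is pulled toward the previous one.

```python
# A minimal sketch of the coin-jar example. All numbers below are
# illustrative assumptions, not data from the interview.
import random
import statistics

TRUE_COUNT = 850       # assumed true number of coins in the jar
NOISE_SD = 200         # assumed spread of an individual's private guess
ANCHOR_WEIGHT = 0.9    # assumed pull of each guess toward the previous one
GROUP_SIZE = 20
TRIALS = 5000

def independent_guesses():
    """Everyone writes down a private estimate; errors are uncorrelated."""
    return [random.gauss(TRUE_COUNT, NOISE_SD) for _ in range(GROUP_SIZE)]

def anchored_guesses():
    """Each person hears the previous estimate and is pulled toward it,
    so errors become correlated."""
    guesses = [random.gauss(TRUE_COUNT, NOISE_SD)]
    for _ in range(GROUP_SIZE - 1):
        private = random.gauss(TRUE_COUNT, NOISE_SD)
        guesses.append(ANCHOR_WEIGHT * guesses[-1] + (1 - ANCHOR_WEIGHT) * private)
    return guesses

def mean_error_of_average(make_guesses):
    """Average absolute error of the group's mean estimate over many trials."""
    errors = [abs(statistics.mean(make_guesses()) - TRUE_COUNT)
              for _ in range(TRIALS)]
    return statistics.mean(errors)

print(f"Independent estimates: average error of the group mean = "
      f"{mean_error_of_average(independent_guesses):.0f} coins")
print(f"Anchored (correlated) estimates: average error of the group mean = "
      f"{mean_error_of_average(anchored_guesses):.0f} coins")
```

Run repeatedly, the independent condition typically yields a markedly smaller error for the group average, which is the practical sense in which having people write their conclusions down privately "decorrelates" their errors.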
The Quarterly: Beyond checklists, do you disagree in other
important ways?
Gary Klein: Danny and I aren’t lined up on whether there’s
more to be gained by listening to intuitions or by stifling them until you have
a chance to get all the information. Performance depends on having important
insights as well as avoiding errors. But sometimes, I believe, the techniques
you use to reduce the chance of error can get in the way of gaining insights.
Daniel Kahneman: My advice would be to try to postpone intuition as
much as possible. Take the example of an acquisition. Ultimately, you are going
to end up with a number—what the target company will cost you. If you get to
specific numbers too early, you will anchor on those numbers, and they’ll get
much more weight than they actually deserve. You do as much homework as
possible beforehand so that the intuition is as informed as it can be.
The Quarterly: What is the best point in the decision process for
an intervention that aims to eliminate bias?
Daniel Kahneman: It’s when you decide what information needs to be
collected. That’s an absolutely critical step. If you’re starting with a
hypothesis and planning to collect information, make sure that the process is
systematic and the information high quality. This should take place fairly
early.
Gary Klein: I don’t think executives are saying, “I have my
hypothesis and I’m looking only for data that will support it.” I think the
process is rather that people make quick judgments about what’s happening,
which allows them to determine what information is relevant. Otherwise, they
get into an information overload mode. Rather than seeking confirmation,
they’re using the frames that come from their experience to guide their search.
Of course, it’s easy for people to lose track of how much they’ve explained
away. So one possibility is to try to surface this for them—to show them the
list of things that they’ve explained away.
Daniel Kahneman: I’d add that hypothesis testing can be completely
contaminated if the organization knows the answer that the leader wants to get.
You want to create the possibility that people can discover that an idea is a
lousy one early in the game, before the whole machinery is committed to it.
The Quarterly: How optimistic are you that individuals can debias
themselves?
Daniel Kahneman: I’m really not optimistic. Most decision makers
will trust their own intuitions because they think they see the situation
clearly. It’s a special exercise to question your own intuitions. I think that
almost the only way to learn how to debias yourself is to learn to critique
other people. I call that “educating gossip.” If we could elevate the gossip
about decision making by introducing terms such as “anchoring,” from the study
of errors, into the language of organizations, people could talk about other
people’s mistakes in a more refined way.
The Quarterly: Do you think corporate leaders want to generate
that type of gossip? How do they typically react to your ideas?
Daniel Kahneman: The reaction is always the same—they are very
interested, but unless they invited you specifically because they wanted to do
something, they don’t want to apply anything. Except for the premortem. People
just love the premortem.
The Quarterly: Why do you think leaders are hesitant to act on
your ideas?
Daniel Kahneman: That’s easy. Leaders know that any procedure they
put in place is going to cause their judgment to be questioned. And whether
they’re fully aware of it or not, they’re really not in the market to have
their decisions and choices questioned.
The Quarterly: Yet senior executives want to make good decisions.
Do you have any final words of wisdom for them in that quest?
Daniel Kahneman: My single piece of advice would be to improve the
quality of meetings—that seems a pretty strategic way to improve the quality of
decision making. People spend a lot of time in meetings. You want meetings to
be short. People should have a lot of information, and you want to decorrelate
errors.
Gary Klein: What
concerns me is the tendency to marginalize people who disagree with you at
meetings. There’s too much intolerance for challenge. As a leader, you can say
the right things—for instance, everybody should share their opinions. But
people are too smart to do that, because it’s risky. So when people raise an
idea that doesn’t make sense to you as a leader, rather than ask what’s wrong
with them, you should be curious about why they’re taking the position. Curiosity
is a counterforce for contempt when people are making unpopular statements.
http://www.mckinsey.com/business-functions/strategy-and-corporate-finance/our-insights/strategic-decisions-when-can-you-trust-your-gut?cid=other-eml-cls-mkq-mck-oth-1609