What Ostriches Can Teach Us About Risk
A lack of awareness and
preparation is a big part of why hurricanes, floods and other calamities leave
such chaos behind. But what if we could change the way we think about risk?
That’s the subject of The Ostrich Paradox: Why We
Underprepare for Disasters, a new book by Wharton professors Howard
Kunreuther and Robert Meyer, who co-direct the Wharton Risk Management
and Decision Processes Center.
Kunreuther, a professor of operations, information and decisions, and Meyer, a
marketing professor, stopped by Knowledge@Wharton to talk about the book and
their solution for helping policymakers, governments and individuals to create
more prepared communities.
An edited transcript of the
conversation follows.
Knowledge@Wharton: I wanted to ask you about the title of the book. I
assumed it meant that we should try to avoid ostrich-like behavior, that we
shouldn’t stick our heads in the sand when it comes to disasters. But it turns
out, that’s not quite the case. Could you tell us more?
Robert Meyer: We were very much interested in writing about why
it is that we can’t seem to get a hold on disasters. Even though scientific
abilities to forecast hurricanes and so forth have grown considerably over the
years, that hasn’t really been matched with any noticeable decrease in the
costs, both in terms of lives and monetary losses, from natural disasters or
man-made disasters. Often, when that happens, the reason given is, "Well, people are putting their heads in the sand. They're ostriches." That's sort of the old cartoon notion of what an ostrich is. When a threat is coming, say a lion, they dig their heads in the sand and pretend it doesn't exist.
It turns out that ostriches have been given a
bum rap for that over the years, because ostriches are incredibly good at
dealing with risk. They have enormous limitations. They can’t fly. But nature
has allowed them to overcome that by having enormous ground speed and all sorts
of risk-avoidance strategies.
It then occurred to us that therein might be a key to getting people to be better at preparing for risk. Rather
than being less like ostriches, they actually need to become more like
ostriches. To first start off and say, “What are the limitations that we have?
What are the psychological limitations that prevent us from being better at
preparing for disasters?” And once we understand what those are, think about
ways in which we can better adapt to those limitations so we’re more like
ostriches rather than less. Hence the paradox.
Howard Kunreuther: We do bury our heads in the sand often. In some
sense, we felt that by using the ostrich as an analogy, we might be in a
position for people to pay attention. I think the real challenge is for people
to pay attention before something happens, rather than afterwards. Ostriches
are very good at doing that, and we hope human beings can do a better job than
they have up until now.
Knowledge@Wharton: The book starts with a really interesting example
from Galveston, Texas. It was two hurricanes, which occurred about 100 years
apart. But there was the same reaction both times, even though technology had
advanced significantly when the second hurricane hit; there was still this same
under-response to the disaster. The book is full of stories like this. Was
there one in particular that inspired it?
Meyer: For me, the thing that inspired it is a story that
is not in the book. I think it was 2008. I was visiting New Orleans and decided
to take a drive along the Mississippi Gulf Coast. I was driving along near the
town of Pass Christian, Mississippi, which is an area that had been devastated
by Hurricane Katrina, and I was noticing how quickly the place had recovered.
There were a lot of nice grass fields. I noticed that in one of the fields was
an ATM machine that was just sitting by itself. I looked at that and pulled
over. I said, “What’s an ATM machine doing out in the middle of the field?” I
kind of walked up to it and began looking around. Then I discovered afterwards
that there used to be a shopping center right on the coast that had been
completely destroyed by Hurricane Katrina. The only thing that was left was the
ATM machine.
It turned out that this shopping center was
built on the exact same spot where a condominium complex used to be in 1969.
Twenty-three people died when the condominium complex was blown away by a
hurricane. I contacted the person who owned the land of the shopping center and
asked, “Did you know that this site had this hurricane history?” He said,
“Yeah, but it had been so many years since there’d been a storm, we didn’t
think it was under much risk.” Then I said, “What do you plan to do with the
property?” And he said, “Well, we’re thinking about selling it and hope to build a condominium complex here again.”
I was sitting there and thinking, “There’s
got to be a story there.” Why is it that we don’t learn from experience? Why is
it that we’re not better at thinking ahead in terms of what the consequences
are of threats like this? What are all the different types of psychological
factors that cause us to be so poor at preparing for disasters?
Kunreuther: To follow up, the story in the book that actually did start it off is the story of Glenda Moore, who was in a position where she had to figure out what decision she was going to make after Hurricane Sandy. I
think all of us were very cognizant of the challenges we face after a storm
occurs, in the sense that we know there are things we should do and things that
we might not do. But we don’t really know them as well as we should. Glenda’s
first concern was to be with her husband, who had left [Staten Island] and was
in Brooklyn, and she made a set of decisions without carefully thinking them
out.
Our concern is that, to a large extent, we
react to situations at the moment without necessarily doing an analysis of all
of the possible dangers. The communication mechanisms are sometimes good, but
we sometimes don’t hear them. [Moore] made some decisions to leave that
resulted in her losing her children. There was a whole set of media attention
given to that. Our feeling was, it was a way to start the book, to point out
the fact that we would like everyone to reflect on what could happen after a
disaster and take some steps beforehand to prepare. Also, to listen to the kinds
of things that one is hearing and try to get the right kind of information so
one doesn’t do the things that could result in tragedy.
Knowledge@Wharton: You outline six biases that may lead to some of
this under-preparation for disasters, and you’ve said that perhaps the most
dangerous one of these may be optimism. I thought that was interesting because
we usually think of that as a good thing.
Meyer: Absolutely. I think it is a good thing to have in
life. We’ve been fortunately endowed with an optimistic outlook. The reason we
have these things is they serve a good, functional purpose. Most of the time,
in most of the decisions we face on a day-to-day basis, they basically serve us
well. However, it’s also the case that when we suddenly are in a situation of
facing very rare events — decisions we don’t have to make on a day-to-day basis
— that’s when you start applying these same heuristics, and all of a sudden all
sorts of stuff goes wrong. Optimism is a great case of that. It is good to look
on the bright side of life. However, when you’re talking about something that
might happen to you that might mean the end of your life, then being optimistic
is not necessarily the best thing to do. One of the things we talk about is the
ways in which people become erroneously optimistic in a way that is harmful.
Kunreuther: One of the things with optimism is that we try to
find ways to defend our optimism. We don’t really want to think about a flood
or a hurricane. We’re optimistic because we are living in an area that we say
is a great place for us to live. We want to keep that image alive, so we say,
“The chances of anything happening are so low that we’re not going to really
think about it. It won’t happen to us. It may happen to others. It’s below our
threshold level of concern. We’re not going to worry.”
We keep that optimistic bias in the sense of
trying to avoid having to think about things that we probably should be
thinking about beforehand. Our whole idea in bringing that up, along with the
five other biases, is to say that they’re all connected in some sense. We have
to try and figure out ways to let people know that these are things we all do.
Then we can think about ways to improve the decisions by trying to avoid them
in the future.
Knowledge@Wharton: Your method of improving this is called the behavioral risk audit. This involves looking at each of these biases, turning them on
their heads and figuring out how to create policies that address these biases
and maybe get around them. Can you talk about how people could do this?
Kunreuther: Let me illustrate with one, and Bob may pick
another one. Myopia is one of the biases that we have. We all have short-term
horizons. We want to get immediate returns. If there are things that we can do
for the long term, we often find they are prohibitively expensive. Let’s
take an example of having to make our house safer against a flood or hurricane.
There’s a lot of cost to doing that. You could elevate your house, but that’s
very costly. You could maybe flood-proof it. People will say, “What are the
benefits that I’m going to get from that in the next period?” They’ll be very
reluctant to put in the money because they only look at the short-run benefits. And they’re right that those are small: if the only benefit you consider is a short-run one, like a reduction in your insurance premium, you’ll say, “Well, I’m not going to get enough to pay for that expense.”
We would recommend two things one can do here, rather than just asking people to think about the long term. One is, you might give a person a loan to help them out and spread the cost over time. The other is to deal with the “it won’t happen to me” reaction.
Instead of saying, “It’s going to be a one in 100 chance of a flood occurring
next year,” stretch the time and say, “Think about the fact that there might be
a hurricane in the next 30 years, and that likelihood is greater than one in
four, or one in five.” Then, people will think about the long term and maybe
decide that they can take some action.
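To see why that 30-year framing works out arithmetically, here is a minimal sketch in Python. It assumes the commonly cited 1-in-100 annual flood chance and independence across years, figures the speakers reference but do not compute explicitly:

```python
# Chance of at least one flood over a 30-year horizon,
# assuming a 1-in-100 chance each year and independent years
# (an illustrative assumption, not a figure from the book).
annual_chance = 0.01
years = 30

p_at_least_one = 1 - (1 - annual_chance) ** years
print(f"P(at least one flood in {years} years) = {p_at_least_one:.2f}")
# Prints roughly 0.26, i.e., better than one in four.
```

Framing the same underlying probability over a 30-year horizon turns a number people dismiss as negligible into one they are more likely to act on.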
Meyer: Another one of the biases we talk about is simplification.
The idea there is that most people don’t want to consider lots of different
factors when making decisions. One feature of this is a thing called the
single-action bias. When you’re faced with a problem, as soon as you take one
action to try and solve it, there’s a tendency for your brain to go, “Good.
There was the problem. I took some action. Problem solved.”
In a lot of walks of life, that’s kind of an
OK thing to do. But think about it in the context of preparing for natural
disasters when, in fact, there are large numbers of things that you need to do.
You’re trying to build a safer house. In order to make the house structurally
sound, there are lots of different things that you have to do. Historically,
the way in which agencies like the National Oceanic and Atmospheric
Administration and the Federal Emergency Management Agency have encouraged
people to prepare for disasters is to give them checklists. The checklists will
have 60 items on them. They will say, “Make sure you do this with your dog.”
The single-action bias will say that people are going to ignore most of those
things. But they are going to see the list. They are going to be aware that
there’s a threat. The problem is that they’re going to go down this list
randomly and take care of the dog or whatever. And once they’ve done that,
that’s the end of the list and they feel sort of prepared for the disaster when
they’re really not.
One of the remedies for that is to say, now
that we know people simplify and tend to focus on one action, don’t give them a
massive checklist. Say, “If you’re going to do one thing, here’s the one thing
to do. And once you’ve done that, here’s the second thing to do.” You have to
walk people through that. That’s going to be a much more effective way of getting
people to prepare.
Knowledge@Wharton: Do you know of examples of agencies or governments
or companies that are doing this? Or do they all need to get a copy of the
book?
Kunreuther: I’ll give one example and relate it to what I said
a few moments ago. FEMA is very concerned with communicating risk. They have
actually changed the way they’re presenting information on the flood hazard.
They used to talk about a 100-year flood. They’re now telling people, “Think
about the next 30 years and what could happen to you. You should consider
buying insurance for next year, even though you are thinking about the fact
that there’s a low probability for a disaster to occur.” So, they are trying to
take that seriously. But they have to communicate this in such a way that people will at least read it, which is a challenge in and of itself.
Knowledge@Wharton: Could a family or an individual use this behavioral risk audit to prepare for disasters, or is it more for policymakers or
companies?
Meyer: I think it absolutely can be used for individual
households. Often, companies are fairly good at thinking through all the
details and having well-developed risk avoidance plans. For example, a utility
company tends to be very good at this sort of thing. This book is a little more
targeted at organizations that aren’t necessarily experts in terms of risk
preparation. For any given family, you need to sit down and think through, “How
are we as a family thinking about risk?” These biases apply there.
Howard was giving the example that started out the book about the Glenda Moore family, where a woman hastily put her children into a van and lost them in the evacuation. We would like
to think that this is a book that can help households like that make sure that
those events never occur. What it does is say, “Look, when you’re faced with
disasters, these are the kinds of mistakes that you’re likely to make.” It’s
not because there’s something wrong with you. You’re just human. These are ingrained, hard-wired biases that we have. Once we understand what those
are, you can anticipate the kind of mistakes you’re going to make. That’s the
first step in trying to avoid them.
Kunreuther: There are a series of stories that are graphic.
People react to graphic stories in a way that they don’t react to just the
facts. Our hope is that people will read these stories, recognize that it can
happen to them and pay attention to the very last part of the book on ways to
improve.
Then, we would like to harness one of our biases, the herd effect, in the hope that this book becomes more than just one person reading it. We hope that people will see this and say, “You know, this is
something I’d like to talk to a few other people about. It isn’t just our
situation; it may be others.” I hope it becomes a topic of conversation that we
have these kinds of biases. We can all think of our own sets of activities that
might reflect that. There’s a tendency, with the kinds of events we’re talking
about, for people not to want to pay attention. Our hope is that the stories
that we tell here will resonate enough so that people will want to actually
think about them.
Knowledge@Wharton: It seems like we hear about hurricanes when they’re
a day or two away, or maybe in the immediate aftermath. Then we don’t hear
about them again until the next one comes along. You also talk about more
long-range events, specifically about climate change. How does the behavioral risk audit apply to something like that, which is more of a long-term, broader risk?
Kunreuther: This is a real challenge. I think we recognize that
it’s a challenge we’re facing right now because climate change is not
necessarily being discussed in a way that I think we would like to see it
discussed. It is a critical problem, and people may not be paying attention
right now.
First of all, construct some scenarios so that you begin to see what might happen, and think about how you can take steps now to avoid those scenarios. Think about the fact that it isn’t just yourselves, but that there are future generations at stake. What is this going to do to my
children and my great-grandchildren? Try to recognize that there’s a tendency
to say, “This is not going to happen.” It won’t happen necessarily tomorrow,
but there has to be planning. And it isn’t just for the individuals. It’s for
communities.
Meyer: As Howard said, I think long-run risk is very
difficult. Particularly if, let’s say, you’re living in a coastal area and worried about sea-level rise. You say, “Well, scientists are forecasting that in 60 to 70 years, this area is going to be underwater. I’m not going to be living there in 60, 70 years, so why should I particularly care about that?”
This is particularly the case for communities
that are thinking about doing things like raising taxes and bond issues to put up public works projects to protect against sea-level rise. The people who are voting
for them and who are going to have to pay these taxes aren’t going to be the
ones who are going to be benefitting from it. People might understand, “Well,
maybe we need to take care of it. But maybe we should just do this next year.”
It’s easy to put off because nothing’s going to change that much. Of course,
what happens is that you keep putting it off until next year, and it never gets
done.
Some of the things we were thinking about within the book are ways in which you can get people to take safe actions, or have a culture of safety, in a situation where the question is about taking the risk rather than about not taking it. For example, you have your city budget and one of the line items is, “Should we put millions of dollars into a pumping system?” That’s an add-on. Then you have to think through, “Is this a good use of the money now?”
What if it were the case that every year in the budget there’s money set aside for infrastructure improvements, but that money can be removed? Now the discussion is, “Do we want to take that safe action out of the budget this year?” That might be enough to keep some of those items in the budget
that might have otherwise been removed. The hope is that through those little
sorts of steps, we might get closer to dealing with some of these extremely
difficult long-run problems.
Kunreuther: You could couple what Bob is saying with the notion
of, “Let’s make sure we don’t have the cost immediately on top of us.” Having a
long-term loan so that you spread it over a number of years will make it more
attractive. If you can show that there are benefits that are going to occur, even in the short run, maybe property values will go up. Maybe people will want to move into the area, and you can sell your home a little more easily because people recognize that the community is taking steps now to avoid this problem that might be really serious 20 or 30 years from now. I think those are
problems facing coastal cities that really have to be concerned about what is
going to happen to the property values when people say, “We’re not really
taking the steps.”
I think we can combine these things with economic incentives and with the default option that Bob mentioned, which is one of our notions: you have to actively say, “I’m not going to do this,” rather than, “I’m going to do this.” We also have to recognize one of the challenges of the political process. We have a little acronym that we occasionally use: NIMTOFF, Not in My Term of Office. If we can avoid that and recognize that there are
benefits of doing this in my term of office, then I think we have a chance of
at least getting people to pay attention.
Knowledge@Wharton: What about different kinds of risks, like political
risks? Some of the things that I read in the book reminded me of what you heard
people in the United Kingdom say around the time of Brexit. “Oh, this won’t
happen. It won’t pass.” Or even around the time of the presidential election in
the United States. I feel like there was maybe some myopia or optimism at play
there. Do you think that the principles in the book can apply to different
types of risk other than natural disasters or environmental risk?
Kunreuther: We haven’t thought explicitly about Brexit, to use
that as an example. But I think there is a notion here of a feeling that people
may not necessarily appreciate all the elements of a particular problem. We
could say about Brexit and the election that there was a degree of optimism by
people who felt that they really didn’t have to be concerned about voting in a
particular way on the basis of feeling that this was a foregone conclusion. In
that sense, we come back to the optimism bias.
Meyer: Certainly, you can go through those biases, and
they can provide an explanation for why maybe a lot of people were very
surprised by the last election. Or alternatively, half the country was not at
all surprised, and the other half was extremely surprised. I think the part
that was extremely surprised was influenced by things like herd-think. They
went around, and the people that they talked to were people that had the same
values that they did, and they came to believe that this is the way the
world is. It’s simplification. There’s a tendency to not look broadly or look
for evidence that maybe there are people out there who don’t think like you do.
When you take all these things together, all of a sudden there’s this
groupthink, which in many cases may have led, for example, Hillary Clinton supporters to feel they didn’t need to vote because it was such a foregone conclusion.
http://knowledge.wharton.upenn.edu/article/ostriches-can-teach-us-risk/