What the Unusual History of Anesthetics Can Teach Us About Innovation
In the late 18th century, something strange happened: a revolution in medicine that should have taken place didn't.
It started in 1772, when Joseph Priestley, an English theologian, philosopher, and chemist, synthesized nitrous oxide. A few years later, Humphry Davy, another lauded chemist, noted that nitrous oxide "appears capable of destroying physical pain." He recommended that "it may probably be used with advantage during surgical operations."
The British upper class had a different idea. The laugh-inducing compound became a popular recreational drug--laughing-gas parties were commonplace--but for nearly half a century, the medical community ignored Davy's suggestion. Not one doctor, or any of those giggling British aristocrats, thought to use nitrous oxide to dull the pain of surgery.
It took two dentists from the United States to turn the tide. Horace Wells, who managed a practice in Hartford, had experimented with nitrous oxide, but it was William Morton of Boston who is credited with demonstrating that ether could work as an anesthetic. On October 16, 1846, Morton administered ether to a middle-aged piano teacher from Boston, which allowed a large tumor to be removed from his neck without pain. "Gentlemen, this is no humbug," declared John C. Warren, the prominent Massachusetts General Hospital surgeon who performed the operation. Modern anesthetics had finally emerged.
And yet, when anesthesia was first employed in London in 1846, it was labeled a "Yankee dodge." "Most of the characteristics the surgeons had developed--the indifference, the strength, the pride, the sheer speed--were suddenly irrelevant," David Wootton writes in Bad Medicine. Using anesthesia felt like cheating, so few doctors did it--at least at first.
The history of innovation rarely unfolds in straight lines. If anything, it zigzags and occasionally backtracks. The story of anesthesia is almost comically circuitous. Compared with other innovations, however, its implementation was swift. The benefits were visible and immediate--even a minor surgery in those days caused patients to scream and flail violently--so it spread like a contagion. Usually, the invention-implementation gap is wider.
For instance, as a teenager Blaise Pascal constructed the first mechanical calculator, ideal for a 17th-century accountant yet largely ignored for 250 years. Sanctorius Sanctorius invented the thermometer in the 17th century, but clinicians didn't think to use it until the 19th century. As Peter Bernstein notes in Against the Gods, the ancient Greeks were wily mathematicians who played gambling games with dice. The conditions were ripe for the discovery of probability and statistics. These, too, emerged much later.
Yet when we think about innovation, we think in terms of narratives. We want a story, so we weave the facts into a familiar plot and cast the innovator in the main role. There's the lone inventor (Tesla), the censored genius (Galileo), the lackluster student with a vibrant imagination (Einstein), the scientist on a quest (Darwin), and the mercurial visionary (Jobs). The innovator is, invariably, a protagonist contending with authority--an unruly parent, the academy, or maybe insecurities and self-doubt. Pick your antagonist and don't forget the eureka moment. What good is a story without redemption?
The problem is that good storytelling can seduce us away from clear thinking, and it ignores accidental discovery. It glosses over the details and clouds our ability to understand how innovation unfolds in the present. For instance, in 1985 reporters at TIME magazine criticized Steve Jobs as "the brash, brilliant and sometimes bumptious brat of Silicon Valley." In 2011, a writer for Forbes praised Jobs by citing the same traits: "I'm not a jerk like Jobs was. Which is ... why I'm just a moderately successful business guy, and not a super billionaire."
We wonder whether the writers were more concerned with telling the story they wanted to tell than with getting the facts right. When times were bad, Jobs's obsessive and brutal leadership style was criticized. When times were good, it was celebrated as a necessary evil. Can both be true? Robert Sutton got it right when he wrote: "He was so hyped, so complex, and apparently inconsistent that the 'lessons' [admirers learned] from him were really more about who they were and hoped to be than about Jobs himself."
We also think in terms of narratives when we talk about companies. If a company encourages employees to, say, work from home, commentators will endorse the policy if the company does well ("Their innovative and unrestrictive culture led to great new products and a boom in sales") and lambaste it if the company does poorly ("Their laissez-faire culture led to a string of lackluster products and a drop in sales").
This is not to say that Jobs's irreverent approach and working from home are meaningless. They're not. But in the search for "timeless, universal answers that can be applied by any organization," as Jim Collins puts it in his mega-bestseller Good to Great, we settle on a story that sounds good instead of scrutinizing the details. Does working from home affect performance? Did Jobs's "reality distortion field" help? It's hard to know. In The Halo Effect, Phil Rosenzweig writes that "unless the data were gathered in a way that was truly independent of performance ... we really don't have an explanation of performance at all."
Business books have done very little for innovation.
Yet the tendency to mimic the lessons advertised in business books and expect a subsequent boost to creative output is hard to resist. A good writer provides memorable anecdotes, and there's a good anecdote for just about anything. Readers don't realize it, but they tend to mistake the sound bite for objective fact.
We might be in a better position if we avoided the business aisle. Commentators sometimes selectively report the facts and talk about the history of innovation as if innovators knew where they were going. Except in a few exceptional cases, such as the Manhattan Project and the Apollo program, they were flying blind. We should also be skeptical of how innovators report the past. They, too, are prone to creating their own histories. Instead of searching for a magic formula, businesses should do what innovators did: embrace trial and error.
Ironically, it's hard to convincingly advertise this approach without telling a good story or pointing to a successful company--we're looking at you, Google and Apple. Better, then, to offer three book recommendations so readers can make up their own minds.
The first is The Idea Factory: Bell Labs and the Great Age of American Innovation by Jon Gertner. Perhaps no other company demonstrated the value of trial and error more than AT&T did during the 20th century. Bell Labs, which AT&T funded, was humming with new ideas for decades. Interestingly, Bell Labs put off patenting the laser in the 1960s because it didn't see the commercial potential. Even eminent innovators, perhaps operating under the delusion that an invention's value should be obvious, struggle to recognize good ideas.
In The Hard Thing About Hard Things, Ben Horowitz explains how leaders can adopt a trial-and-error approach. And finally, check out Nassim Taleb's Antifragile: Things That Gain From Disorder. Taleb writes that innovation is not about applying theory to search for something that everyone is looking for. Rather, it's about putting the existing pieces in the right place, as was done for anesthetics.
By 250 Words: http://www.inc.com/250-words/blindsight-what-the-unusual-history-of-anesthetics-can-teach-us-about-innovation.html?cid=em01020week41e