Best Business Books 2015: Disruption
A version of this article appeared in the Winter 2015 issue of strategy+business.
Martin Ford
Rise of the Robots: Technology and the Threat of a Jobless Future(Basic Books, 2015)
Walter Isaacson
The Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital Revolution (Simon & Schuster, 2014)
Paul Vigna and Michael J. Casey
The Age of Cryptocurrency: How Bitcoin and Digital Money Are Challenging the Global Economic Order (St. Martin’s Press, 2015)
Just as at the beginning of the Industrial Revolution, angst over the
imminent disappearance of jobs is rampant in our age. When
intelligent humanoid robots strip labor away from the vast majority of the
working-age population, the apprehension goes, society as we know it may not
make it. The Atlantic issued this prophecy in a cover story titled “The End
of Work”; Foreign Affairs exclaimed simply, “Hi, Robot.” It is
demoralizing, to say the least. In the summer science-fiction chiller Humans, a teenager
wonders why one should aspire to a career in medicine when future robots will
do the job better.
A minority of our thinkers are pushing back: Martin Wolf, the ordinarily
decorous chief economics columnist at the Financial Times,
can barely contain his scorn on this
subject; certainly, no super robot race is imminent, Wolf grumbles. At Quartz, my
colleague Tim Fernholz thinks that the panic is far overdone — not only are
robots not frightening, but we need as many as we can get
to help us become more productive.
Doing the Robot
No one can predict with certainty the outcome of the torrent of fresh
automation washing over us. But this is no faddish debate. As Martin Ford makes
clear in his impressively researched and, yes, frightening Rise of the
Robots: Technology and the Threat of a Jobless Future, the evidence is
ample that artificial intelligence is already occupying jobs previously thought
doable only by humans. That Ford writes in a terse, understated style and
himself comes from an engineering background — he was chief technology officer
of a Silicon Valley software company — makes his message all the more worrying.
His book is my pick for the best business book of the year on technological
disruption.
The degree of robots’ eventual societal penetration and disruption is a
matter of conjecture. By one estimate, almost half of all
U.S. jobs are at risk, and Ford agrees that the scale will be profound. His
favorite verb is vaporize. And he wields it, along with its synonyms, like a taunting
wizard, to describe the fate of whole segments of white-collar employment.
Robots are a death warrant for any job whose core requirement is experience or
judgment. Lawyers, journalists, Wall Street analysts? Vaporized. Evaporated.
Disappeared. The same goes for pharmacists, radiologists, even computer
programmers — Ford quotes a 2013 study that asserts that the number of U.S.
engineering and computer science graduates that year exceeded available jobs by
50 percent.
In embracing the grim side of the debate, Ford tilts against the
economic orthodoxy. Most analysts believe that technological breakthroughs,
although they destroy some jobs, end up creating far more by spurring new
industries and platforms. The theory of creative destruction explains why
cars displaced buggy-whip manufacturers — and why far more positions were
created in auto plants, gas stations, auto rental firms, and body shops.
Ford flatly calls the theory outdated. This time really is different, he
says, because of the nature of today’s technology. Although many people worry that artificial
intelligence may surpass the human mind, the more telling development is that
computers are becoming much better at performing predictable tasks. Although we
don’t think of white-collar work this way (especially those of us who are
white-collar workers), many professions are reducible to small, repeatable
components. And that makes them vulnerable to obsolescence. A paradox of the
information age, Ford writes, is that “as work becomes ever more specialized,
it may, in many cases, also become more susceptible to automation.”
Ford mischievously reprints a few paragraphs of sparkling prose about a
Dodgers–Angels game (the Dodgers won 7–6), complete with a colorful quote from
right fielder Vladimir Guerrero. But we learn this piece was “written” by
StatsMonkey, a software program created by students at Northwestern University,
who went on to start a Chicago company called Narrative Science. The company’s
cofounder forecasts that by 2026, 90 percent of news articles will be written
by machines.
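StatsMonkey’s own pipeline isn’t public in detail, but the general approach behind such systems — pick a narrative angle from the box score, then fill in a template — is easy to sketch. A toy version, with invented data, fields, and templates:

```python
# Toy illustration of template-driven sports-recap generation, the
# general approach behind systems like StatsMonkey. All names, fields,
# and templates here are invented for illustration.

game = {
    "winner": "Dodgers", "loser": "Angels",
    "winner_runs": 7, "loser_runs": 6,
    "star_player": "Vladimir Guerrero", "star_stat": "three hits",
}

def pick_angle(g):
    """Choose a narrative angle from the box score."""
    margin = g["winner_runs"] - g["loser_runs"]
    return "nail-biter" if margin <= 2 else "rout"

TEMPLATES = {
    "nail-biter": ("The {winner} edged the {loser} {winner_runs}-{loser_runs} "
                   "in a game that came down to the final innings. "
                   "{star_player} paced the losing effort with {star_stat}."),
    "rout": ("The {winner} cruised past the {loser} "
             "{winner_runs}-{loser_runs}. {star_player} had {star_stat}."),
}

print(TEMPLATES[pick_angle(game)].format(**game))
# The Dodgers edged the Angels 7-6 in a game that came down to ...
```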
It gets worse. It’s a myth, Ford writes, that computers can perform only
as they are programmed. A technique called genetic programming, reflecting
evolution and mutation, can create music, write programs, and even “think”
outside the box. One huge player to keep an eye on: IBM’s Watson, which
famously defeated Jeopardy! champion Ken Jennings in 2011.
Since that triumph, IBM has doubled Watson’s capabilities. Next, Ford argues,
robots equipped with virtual reality technology will start vaporizing
face-to-face jobs (for example, those of university professors and
administrators, and even white-collar managers).
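Genetic programming is easier to grasp in miniature. The sketch below is not from Ford’s book; it evolves random arithmetic expressions toward a target function by selection and mutation, the bare bones of the technique (real systems use expression trees, crossover, and far larger populations):

```python
import random

# Toy genetic programming: evolve an arithmetic expression of x
# toward the target f(x) = x*x + x.

OPS = ["+", "-", "*"]
TERMS = ["x", "1", "2", "3"]

def random_expr(depth=2):
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMS)
    return "(%s %s %s)" % (random_expr(depth - 1),
                           random.choice(OPS),
                           random_expr(depth - 1))

def fitness(expr):
    """Sum of squared errors against the target on a few sample points."""
    try:
        return sum((eval(expr, {"x": x}) - (x * x + x)) ** 2 for x in range(-3, 4))
    except Exception:
        return float("inf")

def mutate(expr):
    """Crudest possible mutation: replace with a fresh random subtree —
    enough to show selection at work, nothing more."""
    return random_expr(depth=3)

population = [random_expr() for _ in range(200)]
for generation in range(50):
    population.sort(key=fitness)           # selection: keep the fittest
    population = (population[:50]
                  + [mutate(e) for e in population[:50]]
                  + [random_expr(3) for _ in range(100)])

best = min(population, key=fitness)
print(best, fitness(best))  # with luck, converges to (x * (x + 1)) or an equivalent
```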
In The Innovators (about which more below),
Walter Isaacson cheerfully argues that all will end well because we humans will collaborate with the
machines threatening our jobs. But Ford ridicules a prime bit of evidence of
that claim — that joint human-and-computer chess teams are beating solo
machines, and if they can work together, anyone can. First, Ford thinks that
such chess team superiority will be short-lived, as computers are eventually
bound to trounce traitorous machines collaborating with humans. Second, he
argues, such shows of human–machine chess competition are theater — most
companies are interested in much more prosaic uses for computers, such as
navigating millions of legal records for big cases, a thankless (but very
expensive) task traditionally handled by brand-new law school graduates.
Healthcare may in part be an exception to the coming professional
bloodbath. Smart machines will be able to rapidly assess hundreds of thousands
of medical cases and histories in order to diagnose a case, but it will require
a technician to operate those machines. And considering the growing population
of retired baby boomers, doctors will be in higher demand than ever before. But
we could soon be seeing robots donning white coats. Earlier this year, after
Ford’s book was published, IBM said it had “been
giving Watson eyes,” making it able to examine CT scans, X-rays, and
mammograms, and cross-reference the results with patient records to emerge with
a solid diagnosis.
What is society to do in a jobless future? Like many others, Ford
advocates a guaranteed national income for every adult. To navigate the
politics, this move could be labeled a “dividend,” the same term that Alaska
uses for the annual oil profits sent to every resident of the state. It would
be designed not to discourage work; some people would be laggards, but
only those who would be laggards under any system. Those who were naturally
more productive would continue trying to find a place for themselves.
A Group Effort
Ford describes a future in which technology dominates humanity. In The
Innovators: How a Group of Hackers, Geniuses, and Geeks Created the Digital
Revolution, Isaacson — author of a best-selling 2011 biography of Steve
Jobs — looks to the past and describes how humanity reached this juncture. His
narrative of the information revolution starts with Charles Babbage, the
19th-century inventor of the Difference Engine (the first whack at a computer),
picks up with the creation of the transistor at Bell Labs in 1947, and winds up
at Google.
Along the way, the author says his intention is to dispel the belief
that big invention is mostly the province of sole inventors. He wanted to show
that technological disruption is actually a team sport. “Only in storybooks do
inventions come like a thunderbolt, or a lightbulb popping out of the head of a
lone individual in a basement or garret or garage,” he writes.
Isaacson, president of the Aspen Institute and a biographer of lone
geniuses such as Benjamin Franklin and Albert Einstein, only partly succeeds. This
is because he ends up arguing against himself. Some of the biggest leaps of the
information age may not have been made by a single person, but many were
made by pairs or very small groups that were effectively “lone” —
the great breakthroughs were built on the work of others who came before, yet
in the end did not involve casts of thousands.
Notwithstanding his unnecessary diversion into the lone inventor theory,
the book is fast-paced and compulsive reading. Isaacson is a remarkably fluent
writer. We’ve heard it elsewhere, but the story of the megalomaniac Bell Labs
physicist William Shockley remains one of the most breathtaking incidents of
personal vanity in U.S. biography. At Bell, John Bardeen and Walter Brattain
collaboratively built the first transistor, which sent Shockley (their
supervisor) nearly out of his mind with envy. For months, Shockley worked
feverishly — yes, all by himself — to produce a better approach. He became so
unmanageable that, finally, just to mollify him, Bell agreed that any photo of
Bardeen and Brattain would include Shockley. For the most famous portrait of
the three, Shockley elbowed his way into Brattain’s office chair, and sat there
like a mandarin, with the others looking on. The tragedy comes later in Silicon
Valley, where Shockley lured a group of young researchers to make semiconductors.
Shockley’s pathologies, including intense paranoia and a drive to take all the
credit, drove them away. The outcome was that his protégés founded Fairchild
Semiconductor — the inventor, along with Texas Instruments, of the microchip —
and then Intel. Shockley himself vanished into relative obscurity.
Among the most bracing facts in the account of the microchip are these:
The first prototype of the microchip cost US$1,000 in 1959; by 1968, the cost
was $2. The same went for devices containing the microchips. The first
blockbuster TI desk calculator was $150 in 1967. In 1975, it cost $25; by 2014,
Walmart was selling one for $3.67. (This data merits a Post-it Note on the
keyboards of those who loudly criticize today’s expensive battery and electric
car technology.)
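The implied rate of decline is worth making explicit. A quick back-of-the-envelope calculation from the figures above:

```python
# Back-of-the-envelope: annual price decline implied by the figures above.
# $1,000 (1959) -> $2 (1968) for the microchip prototype;
# $150 (1967) -> $25 (1975) for the TI calculator.

def annual_multiplier(start_price, end_price, years):
    """Average yearly price multiplier r, where end = start * r**years."""
    return (end_price / start_price) ** (1 / years)

chip = annual_multiplier(1000, 2, 1968 - 1959)   # ~0.55: cost nearly halved every year
calc = annual_multiplier(150, 25, 1975 - 1967)   # ~0.80: about 20 percent cheaper per year
print(f"microchip: price multiplied by {chip:.2f} per year")
print(f"calculator: price multiplied by {calc:.2f} per year")
```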
Isaacson argues, as have others, that the alienation of the 1960s was a
primary cultural factor leading parades of young people to electronics. Amid
these stories, we get the shining core of the book, a long chapter on software
in which Isaacson builds on his brilliant prior telling of the creation stories
of Bill Gates and Paul Allen at Microsoft, and Jobs and Steve Wozniak at Apple.
This chapter alone is worth the price of the book.
Currency Events
Isaacson’s narrative does not get to bitcoin, but a libertarian streak
also lies at the heart of the computer-generated money that’s the latest
technology mania. In The Age of Cryptocurrency: How Bitcoin and Digital
Money Are Challenging the Global Economic Order, Paul Vigna and Michael J.
Casey provide a much-needed account that finally explains something that, to me
at least, has been a mystery: What precisely is bitcoin, and who on earth is
Satoshi Nakamoto?
Vigna and Casey, both veteran reporters at the Wall Street
Journal, take us into the world of young, tech-minded crypto-anarchists
“repulsed by the excesses and abuses of the financial system.” Their people are
angry about “intermediaries” who get rich by allowing people to spend their own
money — an activity that ought to be free.
Who are the culprits behind these excesses and intermediations? Credit
card companies and Wall Street investment banks, which charge transaction fees
that may seem small but that add up to hundreds of billions of dollars in
profits, and 0.5 to 1.5 percent of the GDP of many countries.
So it is that these angry folks on the margins glom onto Satoshi
Nakamoto, an anonymous figure who one day in 2008 posts an announcement of a
fail-safe cryptocurrency that can’t be hacked or abused. Nakamoto vanishes as
mysteriously as he surfaces, but his invention — bitcoin — survives, leaving an
irresistible creation myth for his growing followers.
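The core trick behind that fail-safe claim is worth seeing in miniature: each block of transactions embeds the hash of the block before it, so rewriting history anywhere breaks every later link. A toy sketch, with invented transactions and none of bitcoin’s mining, signatures, or peer-to-peer networking:

```python
import hashlib, json

# Toy hash chain illustrating the tamper-evidence at the heart of
# Nakamoto's design. Data and helper names are invented for illustration.

def block_hash(block):
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain, transactions):
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev_hash": prev, "transactions": transactions})

def verify(chain):
    """Every block must reference the true hash of its predecessor."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain = []
add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])

print(verify(chain))                          # True
chain[0]["transactions"][0]["amount"] = 500   # rewrite history...
print(verify(chain))                          # False: the tampering is visible
```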
Simply put, bitcoin is a way for strangers to buy stuff outside the
usual economy. After reading this book, I am convinced of the sincerity of
bitcoin fans, among whom the authors clearly count themselves — Vigna and Casey
are not dispassionate observers. (Last summer, Casey left his position at the Journal to
become a senior advisor at MIT’s Digital Currency Initiative.) But I’m not
fully persuaded of the need for, or the imminent triumph of, bitcoin.
The authors bring no less an intellectual figure than Larry Summers —
former Harvard president, former Treasury secretary, personification of the
establishment — to their defense. Summers says that those who fail to grasp the
torrent on the horizon “are on the wrong side of history.” The bitcoin concept
may naturally sound “as outlandish to the modern mind as the idea of
self-governance must have been to many in 1776,” Summers says. But the world
has changed. Get used to it.
Yet Summers’s take seems exaggerated and, in some spots, the authors
themselves can sound a bit unhinged. They ridicule “the seductive idea that
every dollar printed is an interest-free loan flowing from the people to the
state,” and lament that “controlling the nation’s money has allowed governments
to control the apparatus of power.” Come again?
Disruptive technologies don’t have to vanquish incumbents in order to be
significant. Often, they provide the greatest service by pushing existing firms
to adapt and improve. On August 17, I received an email from a company called
TransferWise that offered to shift money abroad for a 0.5 percent fee. Among
TransferWise’s investors, it said, are Virgin America founder Richard Branson
and venture capitalist Peter Thiel. In Kenya, millions of users of M-Pesa — the
country’s cheap and wildly popular mobile payments network — can already send
money electronically on their cell phones. A revolution in which bitcoin
replaces the dollar, euro, and yen as a unit of exchange seems improbable. It
is likely, however, that we’ll see credit card companies forced to lower their
3 percent transaction fees to a rate closer to 1 percent.
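The dollars at stake in that compression are easy to estimate. A rough illustration — the annual card volume is an assumed figure, not one from the book:

```python
# Rough illustration of what fee compression means in dollars.
# The volume figure is an assumption for illustration only.

annual_card_volume = 5_000_000_000_000  # assume $5 trillion processed per year

for fee in (0.03, 0.01, 0.005):  # 3% today, 1% compressed, 0.5% TransferWise-style
    print(f"{fee:.1%} fee -> ${annual_card_volume * fee / 1e9:,.0f} billion/yr")
# 3.0% fee -> $150 billion/yr
# 1.0% fee -> $50 billion/yr
# 0.5% fee -> $25 billion/yr
```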
At the end of this engaging and vigorously reasoned book, the authors
argue for a middle ground, in which bitcoin is part of the mix but neither the
anarchists nor the traditional system wins. Ten years from now, when we go to
get a checkup from our robot doctors, we may be able to pay in either dollars
or digital currency.
http://www.strategy-business.com/article/00376