Five Overlooked Principles
Shaping the Destiny of Your Business
The most important changes are often the least obvious. That’s
especially true in business, where changes are taking place on a greater scale
than ever before. The advent of digital technology has brought a number of
these dynamics to the forefront. They can be thought of as principles. Like
Moore’s law or Murphy’s law, they explain the way the world works.
If you’re in business, an understanding of these five principles
is crucial, because digital disruption is quickly becoming the new normal. Many
growth strategies that may have worked well in the past no longer pack a punch.
The principles help explain not just what will happen to your company next, but
why.
Turing’s theory of
computability: Machines can calculate any of the ever-growing number of problems
that are possible to calculate.
In the 1930s and 1940s, the English mathematician Alan Turing (whose life was dramatized in the 2014 thriller The Imitation Game) made some pathbreaking observations
about computability and its ramifications. He identified what he called
“computable” activities: any task that a theoretical machine (in this case, a
mathematical model with a process similar to a computer) can address. Having
determined that computability can be identified
mathematically, Turing then postulated that machines have the capacity to
perform computable tasks as well as human beings can. In his e-book Introduction to Computing:
Explorations in Language, Logic, and Machines, University of Virginia professor David Evans devotes a
chapter to computability. “A problem is computable if it can be solved by some
algorithm,” he explains. “A problem that is noncomputable cannot be
solved by any algorithm.”
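The canonical example of a noncomputable problem is the halting problem: no algorithm can decide, for every program and input, whether the program eventually stops. A minimal Python sketch of Turing's diagonal argument follows (a thought experiment, not production code; the function names are illustrative):

```python
# Thought experiment, not running code: suppose an oracle halts(f, x)
# existed that could decide whether any function f halts on input x.

def halts(f, x):
    ...  # assumed to exist; Turing proved no such algorithm can

def paradox(f):
    # Do the opposite of whatever the oracle predicts about f run on itself.
    if halts(f, f):
        while True:       # loop forever if f(f) was predicted to halt
            pass
    return "done"         # halt if f(f) was predicted to loop

# Does paradox(paradox) halt? Either answer contradicts the oracle,
# so no general halting oracle can exist: the problem is noncomputable.
```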
In 1950, Turing proposed a demonstration of computability that is still remembered today as the “Turing test.” In his thought experiment, a questioner exchanges typed messages with an entity in another room. If the questioner cannot reliably tell whether a person or a computer is generating the responses, the program has produced a convincing representation of human intelligence, good enough to function as well as a person on that task. Or, as Turing put it in his paper, the program could “play the imitation game satisfactorily” for any computable task. He predicted that within 50 years, after five minutes of questioning, a machine would be able to fool a human questioner 30 percent of the time. By some accounts, his prediction came true in 2014.
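As a rough illustration, the protocol of the test can be expressed in a few lines of Python (a hypothetical sketch; every name here is illustrative, and the judge and respondents stand in for people or programs):

```python
import random

def imitation_game(questions, judge_guess, human, machine):
    """One round of the imitation game: the judge sees only typed replies
    and must guess whether a person or a program produced them."""
    respondent_is_machine = random.random() < 0.5   # hide who is answering
    respondent = machine if respondent_is_machine else human
    transcript = [(q, respondent(q)) for q in questions]
    guess = judge_guess(transcript)                 # "human" or "machine"
    return respondent_is_machine and guess == "human"  # True if fooled

# Turing's prediction, restated: after five minutes of questioning, this
# function would return True roughly 30 percent of the time.
```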
At the time Turing published his original paper, only a small
number of activities were computable. But he foresaw that as digital computing
continued to evolve, the number of computable activities would grow — and,
indeed, they continue to grow in number and influence today. Thus, for an
ever-growing body of human endeavor, machines are as competent as people, or even
more so. Moreover, no one can say for sure which activities will become computable next, nor which will remain immune.
It would be hard to
exaggerate the significance of Turing’s work on computability and artificial
intelligence. It launched the computer age, establishing the fact that
everything digital, from the first computer to the cell phones that are
ubiquitous today, has the potential to overtake work currently done by people.
When highly valued digital technologies emerge, such as search engines,
ride-hailing apps, automated teller machines, travel-booking websites, and many
others, they inevitably displace human effort — and in the process, they
fundamentally change their industries.
The impact of increasing computability explains why established
companies are so strongly affected by digitization. In a world where anything
businesses do might soon be done by a computer, they have to continually
redefine themselves along digital lines — whether it’s Walmart acquiring
Internet retailer Jet, GE making its industrial products “smart,” or John Deere
developing robotic lawn mowers. Among the tasks expected to become computable
before long are basic due diligence in M&A transactions, real-time
language-to-language translation (such as English to Chinese), some forms of
programming, and most of the world’s motor vehicle driving.
At the same time, worries (or hopes) that computers will replace
human beings entirely are probably premature. Though the wave of increasing computability will carve away many aspects of human activity, much of
what people do is beyond the capacity of machines to replicate. Turing himself
argued that some tasks would never be computable. Driving and performing due
diligence may be the province of machines before long, but only humans can
decide where to travel and what companies to buy.
Coase's theorem of transaction costs: Absent transaction costs, the most efficient outcome will prevail in any market.
The critical phrase in this theorem is absent transaction costs. The theorem means that the
investment of money, time, and attention devoted to the exchange of goods and
services will determine how well your company competes. Or, put another way,
the only companies with growth potential are those that keep their internal
transaction costs (their own expenses) lower than their external transaction
costs (the expense of doing business with others).
Starting in the late 1930s, economist Ronald Coase considered at
length the question of why firms come into being and why they decline. (He
received the Nobel Prize in Economics in 1991, in part for this work.) His
ultimate conclusion has become known as the Coase theorem: a company becomes
viable when it’s able to perform its activities in-house more cheaply than it
could by outsourcing them to the market. If that weren’t the case, there
wouldn’t be companies at all; commerce would operate solely through the
markets, with people continually forming and reforming themselves into
project-based enterprises.
Companies exist, in short, because when you try to do business
alone, without being part of a larger enterprise, you incur many different
kinds of external transaction costs. The expenses associated with gathering and
sharing information, maintaining infrastructure, bargaining with others,
managing activity, and enforcing compliance are all reduced when you don’t have
to negotiate a new transaction every time you start something new. Larger firms
have an advantage because they can handle more of these activities internally than
smaller firms can.
But companies also have internal costs that the market doesn’t
carry. Operating at a large scale requires layers of management and internal
oversight performed by employees who would not be needed in a smaller firm.
There are also costs involved with internal politics; for example, when a
project fails because others disapprove of it, the funding spent to develop it
is forfeited. This pattern disproportionately affects large companies, where
there tend to be more internal bureaucratic constraints. Moreover,
internal costs associated with staff (such as healthcare and training) can soar
in a large company, but would be externalized if everyone were a contractor.
The Coase theorem explains the curbs on growth that confound many
companies, large and small. A company can grow only as long as its internal
costs, including all overhead, are lower than its external costs. Once internal
costs equal or surpass external costs and the company has reached the point of
diminishing returns, it will stop expanding. Because internal transaction costs
are often difficult to spot, the leaders of the company may not realize why
it’s struggling.
New digital technologies have raised the pressures associated with
the Coase theorem because they reduce external costs. Some observers believe
that these technologies will sound the death knell for large companies. Search
engines, for instance, have made it much easier and cheaper to obtain
information today than it was in the past, and thus have eroded an advantage
formerly held by large companies with great resources.
But digitization also reduces internal costs — at least for
companies that leverage the technology effectively. This has changed the
economics of internal organization. Take, for example, cloud-based services,
which coordinate capacity and functionality more easily and rapidly, allowing
for higher levels of productivity with the same staff and for temporary
expansion when needed without large capital expenditures. The Web services
units of Amazon, Microsoft, and Google have capitalized on this trend, and such
services will likely proliferate. Some companies are already building
technological platforms linking all parts of their enterprise and value chain
for relatively low cost compared with the IT projects of yesteryear. They are
thus becoming much more competitive than they were before.
As digitization continues, transaction costs will continue to
decline. This will affect decisions about which activities to keep inside an
organization and which to acquire from the outside. Some things that used to be
cheaper on the inside will now become more expensive — for example, the
maintenance of R&D staffs. Hence the value of open innovation. Meanwhile,
things that used to benefit from outsourcing, such as HR and training, may now
become less expensive internally, because the de-layering of hierarchies may
allow more informal (and therefore less expensive) talent management and
recruiting. The only thing that won’t change is the basic equation: The lower
your internal costs compared with your external costs, the more likely your
company is to grow.
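That basic equation lends itself to a simple decision rule. A minimal sketch in Python (the cost figures are invented purely for illustration):

```python
def keep_in_house(internal_cost, external_cost):
    """Coase's rule of thumb: perform an activity internally only while
    doing so costs less than transacting for it on the open market."""
    return internal_cost < external_cost

# Invented figures: digitization cuts the external cost of a service
# from 120 to 60, flipping the make-vs-buy decision.
print(keep_in_house(internal_cost=100, external_cost=120))  # True: keep it
print(keep_in_house(internal_cost=100, external_cost=60))   # False: outsource
```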
Bell’s law on the birth and
death of computer classes: Roughly every decade, a new class of lower-priced
computing devices emerges — and changes everything.
Gordon Bell is among the
most celebrated engineers in the computer industry. He designed most of the
influential PDP series of minicomputers at Digital Equipment Corporation; he
conceived and cofounded the first museum of computer history, in Boston; and
more recently, his experiments with “life-logging” — automatically putting the
images, sounds, and documents of everyday experience into digital archival
storage — have deepened our understanding of what computers can do.
Bell is also a great
thinker. In his seminal 1972 article, he observed that every decade or so,
advances in semiconductors, storage, interfaces, and networks lead to the
development of a new, lower-priced computer “class,” and a new industry and
marketplace along with it. This tends to (at least partially) supplant the old
class; smartphones displaced many personal computers, which displaced
minicomputers, which displaced mainframes. Of course, mainframes still survive
as supercomputers and cloud servers, but the new classes of computers continue
to redefine possibilities. Bell’s law suggests that technology evolves through
punctuated equilibrium, repeatedly thrusting the world into a new normal.
Every time a shift in computer classes takes place, the impact
goes far beyond technology. New platforms, new forms of programming, and new
types of network interfaces appear. In business, a distinct new industry
emerges, often with completely different companies in the lead. The old
companies either adapt (as Apple did with the smartphone and Microsoft did with
the cloud) or decline (like Digital Equipment Corporation, Gateway, Kaypro,
Osborne, and many others).
The newest class is exemplified by the Internet of Things (IoT).
It involves miniature computers, small enough to embed throughout industrial
society and powerful enough to track, analyze, and communicate. This
class, which has just appeared, is already fueling enormous growth. Banking,
automobiles, defense, healthcare, and security are a few of the industries
using the newest devices to innovate products and services. The opportunities
that embedded computing can offer will become clearer as the IoT evolves, but
there’s no doubt that this new class will be as transformative as its
predecessors.
Baldwin and Clark’s concept
of modularity: Breaking a technology or process into functionally relevant
components facilitates innovation.
In the late 1990s, Harvard
Business School professors Carliss Baldwin and Kim
Clark noted
that modularity, also known as modular design, is a key
driver of innovation speed. Modularity is a technique used in software
development, automobile design, and other aspects of engineering. It breaks a
complex technological project into a number of functionally relevant components
— standardized where standardization is called for, and individually designed
where differentiation is needed.
In a system with high modularity, the standardized parts, or
modules, can easily be swapped out, upgraded, and adapted in different ways for
different systems. Modularity makes it easier (and less expensive) to manage
the complexity of a design. For example, some parts of a modular computer
system can be opened up to outside innovators (who might design apps and other
accessories for it) while other parts are kept proprietary and free of outside
interference. A truly modular system can be tailored to individual customers
without the entire design needing to be reinvented.
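In software terms, this usually means programming to a shared interface, so that one module can be swapped for another without touching the code that uses it. A minimal Python sketch (the class and function names are illustrative):

```python
from abc import ABC, abstractmethod

class Storage(ABC):
    """The standardized interface: any module implementing it can be
    swapped in without changing the code that depends on it."""
    @abstractmethod
    def save(self, key: str, value: str) -> None: ...

class LocalDisk(Storage):
    def save(self, key: str, value: str) -> None:
        print(f"writing {key} to local disk")

class CloudBucket(Storage):
    def save(self, key: str, value: str) -> None:
        print(f"uploading {key} to a cloud bucket")

def archive(records: dict, store: Storage) -> None:
    # The caller depends only on the interface, never on a concrete module.
    for key, value in records.items():
        store.save(key, value)

archive({"q1": "report"}, LocalDisk())    # swap modules freely;
archive({"q1": "report"}, CloudBucket())  # archive() never changes
```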
Baldwin and Clark’s study of industrial design found that the more
modular a product is, the faster it can be improved. Computers, for example, are built from standardized components: processors, mainboards, and hard drives. They
also have USB ports that make it easy to upgrade peripherals such as
headphones, storage, and flash drives without having to upgrade the computer
itself. These interoperable components have not slowed down innovation; on the
contrary, because computer makers still compete intensively, component
development has helped focus R&D on features where differentiation matters.
The concept of modularity applies to companies as well. Relatively
modular organizations can innovate more quickly than others, because the pace
of their research and development is not slowed down by the slowest unit or
product. Consider Amazon, which has developed an architecture of services that
allows it to deliver a complex array of offerings at lower cost than its competitors. This is a key reason the Internet retailer has proved so adept at entering new industries and developing new offerings — including its own modular infrastructure, Amazon Web Services.
Modularity in an organization does not require setting up separate
R&D labs in complete isolation from each other. It does, however, require
paying attention to the design of the R&D process, sharing some common
processes and practices (such as, perhaps, procurement of materials and the use
of cloud-based software platforms) while keeping other practices fully
individual (such as the unique hardware and software features that competitors
should not reverse engineer). Today, as companies make the move from analog to
digital, modularity is crucial because speed and agility are crucial.
Nakamoto’s law of the
distributed ledger: Transactions improve when trust is managed by the system,
not by mediators.
No one
has yet called it “Nakamoto’s law,” but the principle underlying blockchain technology is already fundamental to the
verification and certification practices of the future. In a technical paper published in 2008, Satoshi Nakamoto (a pseudonym for a person or group whose real identity remains undisclosed) observed that the
Internet commerce of that time relied on financial institutions as a trusted
third party to process electronic payments. This was unnecessarily complex and
was vulnerable to deceit or failure. He posited that if two parties could
electronically transact with one another without the need for an external
overseer, online transactions would be easier and cheaper.
A few months later, Nakamoto released the first version of
bitcoin: a digital currency with a distributed ledger, a peer-to-peer data
technology that ensures verification through a software process called
blockchain. Blockchain uses a digital distributed ledger to record
transactions electronically, linking each new entry by code to the entry that
came before. Verification takes place through network coordination. Computers throughout the network contribute to bitcoin's computation, storing copies of the ledger and performing the cryptographic work that validates transactions and creates new bitcoins. This distributed approach ensures that all the posted transactions are legitimate. No single member or group can compromise the integrity of a ledger distributed among so many participants.
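The chaining mechanism itself is straightforward to sketch. In the toy ledger below (a simplified Python illustration; real blockchains add proof-of-work mining, digital signatures, and peer-to-peer consensus), each block carries a hash of the block before it, so rewriting any past entry breaks every link that follows:

```python
import hashlib, json

def block_hash(block):
    """Hash a block's contents together with the link to its predecessor."""
    body = {"transactions": block["transactions"],
            "prev_hash": block["prev_hash"]}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(transactions, prev_hash):
    block = {"transactions": transactions, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

# Build a three-entry ledger, each block chained to the one before it.
chain = [make_block(["genesis"], "0" * 64)]
chain.append(make_block(["alice pays bob 5"], chain[-1]["hash"]))
chain.append(make_block(["bob pays carol 2"], chain[-1]["hash"]))

def verify(chain):
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False  # a block's contents were altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # the link to the previous block is broken
    return True

print(verify(chain))                               # True
chain[1]["transactions"] = ["alice pays bob 500"]  # try to rewrite history
print(verify(chain))                               # False: tampering detected
```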
Bitcoin and the distributed
ledger have the potential to completely change the economics of trading,
banking, voting, oversight, and other validated transactions. The number and
size of transactions that can be cleared automatically,
without institutional approval, are likely to rise dramatically. These
transactions will be the basis for “smart
contracts”: automated, legally binding, and self-enforcing
transactional agreements. As trust-related risks (including reputational risks) diminish, companies will become more dynamic. They will be
free to take new kinds of risk. Imagine buying real estate as easily as buying
groceries, knowing that the transaction is safe because computers around the
world, all linked together, provide the bona fides of all parties involved.
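A smart contract can be thought of as a small state machine whose transitions execute automatically once agreed conditions are met. A toy escrow sketch in Python (purely illustrative; production smart contracts run on blockchain platforms and are written in languages such as Solidity):

```python
class EscrowContract:
    """Toy escrow: funds release automatically when delivery is confirmed,
    with no human mediator deciding the outcome."""

    def __init__(self, buyer: str, seller: str, amount: int):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.state = "funded"

    def confirm_delivery(self) -> str:
        # The agreed condition triggers the transfer; no one can intervene.
        if self.state != "funded":
            raise RuntimeError("contract already settled")
        self.state = "released"
        return f"{self.amount} released to {self.seller}"

    def refund_after_deadline(self) -> str:
        if self.state != "funded":
            raise RuntimeError("contract already settled")
        self.state = "refunded"
        return f"{self.amount} returned to {self.buyer}"

deal = EscrowContract("buyer", "seller", 100)
print(deal.confirm_delivery())  # the contract enforces itself
```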
Nakamoto’s law is a natural extension of the other four principles
described here. As Turing would have predicted, it digitizes verification,
which had always been assumed to be a human domain. It represents a Coase-like
reduction in transaction costs, which may reduce many of the costs associated
with financial services. It is possible because of the rise of a new computer
class, the interlinked servers and devices of today’s cloud-based software (and
encryption). And it demonstrates the effectiveness of modularity. Although the cryptographic core of the technology is hidden from users, the blocks of the ledger fit together as interoperable modules, each one derived from the code of the one before.
The Power of the Five Principles
Taken together, these five principles have enormous economic
ramifications. Although it’s impossible to predict the full extent of their
impact, it’s clear they will change the structure and source of profit in a
variety of industries around the world. Already, the shifts in computability,
computer classes, transaction-cost dynamics, and modularity have enabled
companies such as Amazon and Google to reach untold numbers of vendors and
customers without racking up crushing operating expenses.
These principles also help us better understand the ways in which
winning companies compete in today’s rapidly digitizing market. At the end of
the day, great strategy depends on understanding the fundamentals of
innovation, economics, and marketing, whose relationship to value creation is changing. Power will flow to enterprises that embrace automation, reduce
internal costs, make better use of advanced devices, design modularity into
their products and services, and participate in blockchain-style verification
systems. To be sure, the optimal approach will vary from one industry to
another: A human third party may be enough to guarantee the security of
transactions in some markets, whereas other markets will require virtual
third-party validation through a blockchain system.
As industries continue to embrace full digitization, we will see a
shift in the way experts talk about competition and growth. As with Moore’s
law, the basic trends are readily apparent. But the way they play out, and the
details of industry evolution that follow, will be full of surprises.
https://www.strategy-business.com/article/Five-Overlooked-Principles-Shaping-the-Destiny-of-Your-Business?gko=c5d7a