Artificial intelligence: The time to act is now PART 2
3. Companies must have end-to-end solutions to win in AI
To win in AI, companies must offer, or
orchestrate, end-to-end solutions across all nine layers of the technology
stack because many enterprise customers struggle to implement piecemeal
solutions. A hospital, for instance, would prefer to purchase a system that
includes both an MRI machine and AI software that assists with diagnosis,
rather than buying these components separately and then trying to make them
work together. In addition to increasing sales, suppliers with end-to-end solutions
can capture a strategic foothold with customers and accelerate adoption.
Nvidia, for instance, offers its Drive PX platform as a module, not just a
chip, to provide an end-to-end solution for autonomous driving. The platform
combines processors, software, cameras, sensors, and other components to
provide real-time images of the environment surrounding a car. It can also
identify the vehicle’s location on a map and plan a safe path forward.
Large hardware and software players often
expand their AI portfolio across the stack by acquiring other companies. While
deal making is common across industries, it’s more prevalent within AI because
of the need for end-to-end solutions. There have been over 250 acquisitions
involving private companies with AI expertise since 2012, with 37 of these
occurring in the first quarter of 2017. To compete with these giants, many
start-ups are undertaking partnerships to position themselves as system
integrators for AI solutions.
4. In the AI technology stack, most value will come from
solutions or hardware
Within the AI technology stack, our
analysis of future trends suggests that each layer will directly generate a
different amount of profit, or value. Most value will be concentrated in two
areas. First—and somewhat surprisingly, given industry trends—many of the best
opportunities will come from hardware (head nodes, inference accelerators, and
training accelerators). Together, we estimate that these components will
account for 40 to 50 percent of total value to AI vendors.
While hardware has become commoditized in
many other sectors, this trend won’t reach AI any time soon because hardware
optimized to solve each microvertical’s problems
will provide higher performance, when total cost of ownership is considered,
than commodity hardware, such as general-purpose central processing units
(CPUs). For instance, accelerators optimized for convolutional neural networks
are best for image recognition and thus would be chosen by medical-device
manufacturers. But accelerators optimized for long short-term memory networks
are better suited to speech recognition and language translation and thus would
appeal to makers of sophisticated virtual home assistants. With every use case
having slightly different requirements, each one will need partially customized
hardware.
In another pattern that departs from the
norm, software (defined as the platform and interface layers) is unlikely to be
the sole long-term differentiator in AI. As seen with the advent of deep-learning (DL)
accelerators, hardware alone or in combination with software will likely enable
significant performance improvements, such as decreased latency or power
consumption. In this environment, players will need to be selective about
hardware choices.
Another 40 to 50 percent of the value from
AI solutions will come from services, which include solutions and use cases.
System integrators, who often have direct access to customers, will capture
most of these gains by bringing solutions together across all layers of the
stack.
For the immediate future, other areas of
the AI stack won’t generate much profit, even though they may generate indirect
value that will drive growth in the DL ecosystem. For instance, data and
methods, both elements of training, now deliver only up to 10 percent of a
typical AI supplier’s value. This pattern occurs because most data comes from
end users of AI solutions, rather than third-party providers. A market for data
may eventually emerge in the consumer and enterprise world, however, making
this layer of the stack relatively more attractive in the future.
5. Specific hardware architectures will be critical
differentiators for both cloud and edge computing
With the growth of AI, hardware is
fashionable again, after years in which software drew the most corporate and
investor interest. Our discussions with end users suggest that interest will be
strong for both cloud and edge solutions, depending on the use case. Cloud will
continue to be the favored option for many applications, given its scale
advantage. Within cloud hardware, customers and suppliers vary in their
preference for application-specific integrated circuit (ASIC) technology over
graphics processing units (GPUs), and the market is likely to
remain fragmented.
That said, we also see an important and
growing role for inference at the edge, where low latency or privacy concerns
are critical, or when connectivity is problematic. At the edge, ASICs will win
in the consumer space because they provide a more optimized user experience,
including lower power consumption and higher processing performance, for many applications.
Enterprise edge will see healthy competition among field programmable gate
arrays, GPUs, and ASIC technology. However, ASICs may have an advantage because
of their superior performance per watt, which is critical on the edge. We
believe that they could dominate specific enterprise applications when demand
levels are strong enough to justify their high development costs.
6. The market is taking off already—companies need to act now
and reevaluate their existing strategies
Although technology companies may not know
exactly how AI demand is evolving, they recognize the enormous opportunity
within DL and want to capture it. With the technology still evolving, and with
multiple players implementing wildly different strategies, the recipe for
success is still uncertain.
The big players are already making their
moves, with leading businesses going in directions that defy current wisdom. To
consider just one example, Nvidia has increased its R&D expenditures for AI
by about 8 percent annually from 2012 to 2016, when they reached $1.3 billion.
Those costs represent about 27 percent of Nvidia’s total revenue—much higher
than the peer group average of 15 percent—and they show that Nvidia is willing
to take a different path than many semiconductor companies, which are
aggressively cutting R&D expenditures. Nvidia has also taken massive steps
to create an end-to-end product ecosystem focused on its GPUs. The company is
aggressively training developers on the skills needed to make use of GPUs for
DL, funding start-ups that proliferate the use of its GPUs for DL, forming
partnerships to create end-to-end solutions that incorporate its products, and
increasing the number of GPU-driven applications. Other companies that follow
such unconventional strategies could also be rewarded with exceptional returns.
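As a rough sanity check, the R&D figures above can be back-calculated. This is a minimal sketch assuming steady 8 percent annual growth and applying the 27 percent ratio to the 2016 spend; the two percentages and the $1.3 billion figure come from the text, while the assumption of perfectly steady growth is ours:

```python
# Back-of-the-envelope check of the Nvidia R&D figures cited above.
# Assumption (not from the article): growth was a constant 8% per year.

rd_2016 = 1.3          # 2016 R&D spend, $ billions (from the article)
annual_growth = 0.08   # ~8 percent per year (from the article)
years = 4              # 2012 -> 2016
rd_share = 0.27        # R&D as ~27% of total revenue (from the article)

# Implied 2012 R&D spend if growth was a steady 8% per year
rd_2012 = rd_2016 / (1 + annual_growth) ** years

# Implied 2016 revenue if $1.3B is ~27% of revenue
implied_revenue = rd_2016 / rd_share

print(f"Implied 2012 R&D spend: ${rd_2012:.2f}B")   # ~$0.96B
print(f"Implied 2016 revenue:   ${implied_revenue:.1f}B")  # ~$4.8B
```

The implied revenue of roughly $4.8 billion simply restates the 27 percent ratio; the point is that the cited numbers are mutually consistent.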
Nvidia’s success shows that tech companies
won’t win in AI by maintaining the status quo. They need to revise their
strategy now and make the big bets needed to develop solid AI offerings. With
so much at stake, companies cannot afford to have a nebulous or tentative plan
for capturing value. So what are their main considerations as they forge ahead?
Our investigation suggests the following emerging ideas on the classic
questions of business strategy:
· Where to compete. When deciding where to compete, companies have to
look at both industries and microverticals. They should select the use cases
that suit their capabilities, give them a competitive advantage, and address an
industry’s most pressing needs, such as fraud detection for credit-card
transactions.
· How to compete. Companies should be searching now for partners or
acquisitions to build ecosystems around their products. Hardware providers
should go up the stack, while software players should move down to build
turnkey solutions. It’s also time to take a new look at monetization models.
Customers expect AI providers to assume some of the risk during a purchase, and
that could result in some creative pricing options. For instance, a company
might charge the usual price for an MRI machine that also has AI capabilities
and only require additional payment for any images processed using DL.
· When to compete. High-tech companies are rewarded for
sophisticated, leading-edge solutions, but a focus on perfection may be
detrimental in AI. Early entrants can improve and rapidly gain scale to become
the standard. Companies should focus on strong solutions that allow them to
establish a presence now, rather than striving for perfection. With an early
success under their belt, they can then expand to more speculative
opportunities.
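The usage-based pricing idea in the “How to compete” bullet, where the customer pays the usual machine price and an additional fee only for DL-processed images, can be sketched as follows. All prices and the function name are illustrative assumptions, not figures from the article:

```python
# Hypothetical usage-based pricing model for an AI-capable MRI machine.
# The base price covers the hardware; DL image processing is billed per use.

def total_cost(base_price: float, dl_images: int, fee_per_image: float) -> float:
    """Usual machine price plus a fee only for images processed with DL."""
    return base_price + dl_images * fee_per_image

# Example: a $1M machine with 1,000 DL-processed images billed on top
print(total_cost(base_price=1_000_000, dl_images=1_000, fee_per_image=5.0))
```

Under this model, the supplier shares adoption risk: a hospital that never uses the DL features pays only the conventional machine price.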
If companies wait two to three years to
establish an AI strategy and place their bets, we believe they are unlikely to
catch up in this rapidly evolving market. Most businesses know the
value at stake and are willing to forge ahead, but they lack a strong strategy.
The six core beliefs that we’ve outlined here can point them in the right
direction and get them off to a solid start. The key question is which players
will take this direction before the window of opportunity closes.
By Gaurav Batra, Andrea Queirolo, and Nick Santhanam, January 2018
FOR THE FULL ARTICLE WITH EXHIBITS
https://www.mckinsey.com/industries/advanced-electronics/our-insights/artificial-intelligence-the-time-to-act-is-now?cid=other-eml-alt-mip-mck-oth-1801&hlkid=2a9d89c8c3ca4764a09bb204647e16b4&hctky=1627601&hdpid=094bf139-fd76-400b-8896-e6caf6e19d81