Qualcomm’s Boom Highlights AI Shift To The Edge
Contributor
R. Scott Raynovich is the Founder and Chief Analyst at Futuriom.com
May 29, 2024, 03:10pm EDT
A year ago, Qualcomm was not a well-embraced tech stock. In fact, as recently as last October, the company’s shares were flirting with a 52-week low. The long-time maker of mobile technology and holder of valuable intellectual property was mired in a slump, weighed down by slow growth in China as well as weak PC and smartphone markets.
Fast-forward to now. There’s this magic called AI. In just six months, Qualcomm shares have gone from a 52-week low to an all-time high on the market’s realization that it has the components and technology to play in the AI market, as devices such as smartphones and PCs become key to delivering AI inferencing—where the output of AI models is delivered to customers on devices. This is what many in the technology industry refer to as the “edge”—devices connected to infrastructure.
The launch of Qualcomm’s new Snapdragon X series of chips, which targets AI inferencing, has coincided nicely with turnarounds in the PC and device markets to give Qualcomm this boost. Qualcomm has also made a series of announcements with key partners such as Microsoft, which are adopting its technology for AI processing on consumer devices.
Enthusiasm for the AI Edge
The Qualcomm example shows how the business media and Wall Street have started picking up on the idea that AI requirements are broader than just delivering large language models (LLMs) and chatbots from the cloud. There’s edge AI, private enterprise AI, and vertical AI as well.
The thirst for computing to fuel AI extends to the billions of devices around the world, ranging from cars to cameras, often referred to as the Internet of Things (IoT). Anything connected to infrastructure or a network will need more processing power and connectivity to run AI models.
What does this mean for AI infrastructure at large? Our recent research and discussions with technology builders suggest the AI infrastructure discussion is about to morph. I think that over the next few years we’ll be talking less about LLMs and chatbots and more about vertically focused AI apps and infrastructure—and private AI for enterprise.
Chatbots are an appealing mass market, but they only address one segment—consumer information. The closest analog is the search market, where Google holds an 80%–90% share, raking in about $80 billion in quarterly revenue. The current market size for search is estimated at about $400 billion. The enterprise and industrial technology infrastructure markets represent hundreds of billions more.
The AI market will extend well beyond consumer information and chatbots. It also has diverse applications in data analytics, robotics, healthcare, and finance, to name only a few. Many of these more specific vertical markets may not need LLMs at all, but rather more targeted AI technologies that could include small language models (SLMs) or other custom-designed AI processing software. They’ll have to deliver the results—AI inferencing—across myriad hardware platforms ranging from cars to medical devices.
“We have only scratched the surface of AI as it moves out into verticals, private AI, edge, and distributed cloud. There’s more to AI than LLMs and SLMs, and vertical/domain-specific models will dominate the new deployments outside of the large cloud players,” Mike Dvorkin, a cofounder and CTO of cloud networking company Hedgehog, told me in a recent interview. “The opportunity is immense, and it will require new thinking about infrastructure and how it’s consumed.”
AI To Drive Private AI and Hybrid Infrastructure
If Dvorkin, a former distinguished engineer at Cisco, is right, the AI edge infrastructure market will be gigantic.
This theme has surfaced in more discussions I’ve witnessed recently, where some technologists have estimated the AI market could flip from 80% modeling and 20% inferencing to the reverse. In addition, CIOs I’ve listened to recently have pointed out that the private AI model will be much more useful in specific industries such as healthcare and finance, where enterprise customers may want to own as much of their own data and models as possible.
For this reason, the AI wave will drive more diverse hybrid and multicloud architectures—including private clouds—as the needs for data, analytics and connectivity spread across multiple infrastructures.
“We have a hybrid cloud model,” said George Maddalino, the CTO of Mastercard, at a recent tech event hosted by the Economist in New York. “We have workloads on prem, workloads on hyperscalers. You can see us traversing from a bank’s datacenter across a hyperscaler cloud to a retailer in the cloud. By default, we end up in an environment that’s multicloud.”
Nizar Trigui, CTO of GXO Logistics, also pointed out that AI applications’ connectivity to data will be pervasive, regardless of location.
“Most of us are going through some kind of digital transformation,” said Trigui. “How do we create more value for the customers? We are creating value out of data in 1,000 warehouses around the world, digitally connected.”
The biggest takeaway from Qualcomm’s recent rise is the enthusiasm for AI everywhere—processing and inferencing data wherever it lives. This endeavor will not be limited to infrastructure or models owned exclusively by the hyperscalers; it will spread far and wide across enterprise, edge, and IoT.