This Forbes article (see below) was only published online 13 hours ago, and it should be read bearing in mind what our very own Tony Lewis had to say on LinkedIn just two weeks ago.
Yet another reason for Arm to seriously consider incorporating our technology into their new chips...
Qualcomm Could Benefit Most From DeepSeek’s New, Smaller AI
Karl Freund
Contributor
Founder and Principal Analyst, Cambrian-AI Research LLC
Feb 14, 2025, 10:10am EST
Qualcomm CEO Cristiano Amon
While the DeepSeek moment crashed most semiconductor stocks as investors feared lower demand for data center AI chips, these new, smaller AI models are just the ticket for on-device AI. “DeepSeek R1 and other similar models recently demonstrated that AI models are developing faster, becoming smaller, more capable and efficient, and now able to run directly on device,” said Qualcomm CEO Cristiano Amon at the company’s recent earnings call. Within less than a week, DeepSeek R1-distilled models were running on Qualcomm Snapdragon-powered PCs and smartphones. (Qualcomm is a client of Cambrian-AI Research.)
While both Apple and Qualcomm will benefit from these new models, Qualcomm can quickly apply them beyond smartphones; the company has strong positions in other markets such as automotive, robotics, and VR headsets, as well as its emerging PC business. All of these markets will benefit from the new smaller models and the applications built on them.
Apple is famous for its beautiful, fully integrated designs, but Qualcomm partners with others to design and build the final product, speeding time to market and enabling broader adoption. For example, Qualcomm Snapdragon chips power both Meta Quest headsets and Ray-Ban Meta smart glasses, which enjoy over 70% market share.
Major Trends Accelerating On-device AI
Qualcomm and Apple have both been working hard to reduce model size through lower-precision math and model optimization techniques such as pruning and sparsity. Now, with distillation, we are seeing a step-function improvement in the quality, performance, and efficiency of AI models that can run on device. And these smaller models do not require users to compromise.
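To make the distillation idea concrete, here is a minimal sketch of the standard knowledge-distillation loss, in which a small “student” model is trained to match a large “teacher” model’s softened outputs as well as the ground-truth labels. It is written in PyTorch purely for illustration; the function name, temperature, and weighting are assumptions, not DeepSeek’s or Qualcomm’s actual recipe.

    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
        # Soft targets: push the student's softened distribution toward the teacher's (KL divergence).
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=-1),
            F.softmax(teacher_logits / T, dim=-1),
            reduction="batchmean",
        ) * (T * T)  # rescale so gradients stay comparable across temperatures
        # Hard targets: ordinary cross-entropy against the ground-truth labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1 - alpha) * hard

    # Toy usage with random logits standing in for real model outputs.
    teacher_logits = torch.randn(8, 100)
    student_logits = torch.randn(8, 100, requires_grad=True)
    labels = torch.randint(0, 100, (8,))
    distillation_loss(student_logits, teacher_logits, labels).backward()

The point of the technique is that the student keeps most of the teacher’s behavior at a fraction of the parameter count, which is what lets these models fit on phones and PCs.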
These new state-of-the-art smaller AI models deliver superior performance thanks to techniques like model distillation and novel network architectures, which simplify the development process without sacrificing quality. These smaller models can outperform larger ones that can realistically run only in the cloud.
In addition, the size of models continues to decrease rapidly. State-of-the-art quantization and pruning techniques allow developers to reduce the size of models with no material drop in accuracy.
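As a rough illustration of why quantization shrinks models so dramatically, the sketch below converts a float32 weight matrix to int8 using a single scale factor, a simple symmetric post-training scheme. The NumPy implementation and the layer size are my own assumptions for illustration, not the specific methods Qualcomm or DeepSeek use.

    import numpy as np

    def quantize_int8(weights):
        # Symmetric per-tensor quantization: one scale maps float32 weights onto [-127, 127].
        scale = np.abs(weights).max() / 127.0
        q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
        return q, scale

    def dequantize(q, scale):
        return q.astype(np.float32) * scale

    w = np.random.randn(4096, 4096).astype(np.float32)  # a hypothetical transformer layer
    q, scale = quantize_int8(w)
    err = np.abs(w - dequantize(q, scale)).mean()
    print(f"{w.nbytes / 1e6:.0f} MB -> {q.nbytes / 1e6:.0f} MB, mean abs error {err:.5f}")

Storing int8 instead of float32 is a straight 4x reduction in memory; production schemes (4-bit weights, per-channel scales, pruning of near-zero values) go further while keeping the accuracy loss immaterial, which is the point the article is making.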
The table below shows that the distilled versions of both the DeepSeek Qwen and Meta Llama models perform as well as or better than the larger and more expensive state-of-the-art models from OpenAI and Mistral. The GPQA Diamond benchmark is particularly interesting, as it involves deep, multi-step reasoning to solve complex queries, which many models find challenging.
Chart (Qualcomm): The new DeepSeek-R1 shows significantly better results (accuracy) across all math and coding benchmarks.
So, Do You Really Need On-device AI?
The market skepticism around on-device AI is fading fast. Here is an example use case that Qualcomm has provided. Imagine you are driving along and one of your passengers mentions coffee. An LLM agent hears this and suggests a place along the route where you can stop and grab a cup. Because the driving LLM and the ADAS systems it draws on run locally, a cloud-based AI cannot perform this task. This is but one example of how agents will transform AI, and of why they are especially useful on-device.
Here is a great use case for LLM agents in a car. Coffee, anyone? (Image: Qualcomm)
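For readers who want to picture how such an agent could be wired up, here is a deliberately simplified sketch: a keyword check stands in for the on-device intent model, and a small in-memory list stands in for the points of interest the local navigation stack already holds. All names and data are hypothetical; this is not Qualcomm code.

    def suggest_stop(utterance, route_pois, max_detour_min=5):
        # In a real system a small on-device LLM would extract intent from local speech
        # recognition; a keyword check keeps this sketch self-contained.
        if "coffee" not in utterance.lower():
            return None
        # Filter places the local navigation/ADAS stack already knows along the route.
        cafes = [p for p in route_pois
                 if p["category"] == "cafe" and p["detour_min"] <= max_detour_min]
        if not cafes:
            return None
        best = min(cafes, key=lambda p: p["detour_min"])
        return (f"{best['name']} is about {best['detour_min']} minutes off your route. "
                "Want to stop?")

    route_pois = [
        {"name": "Beanhouse Espresso", "category": "cafe", "detour_min": 3},
        {"name": "Gas & Go", "category": "fuel", "detour_min": 1},
    ]
    print(suggest_stop("I could really use a coffee", route_pois))

Every step runs against data already on the vehicle, with no round trip to the cloud, which is exactly why the use case favors on-device AI.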
So, the AI World Isn’t Crashing?
Not in the least. In fact, we would say that these new models are a tipping point for ubiquitous AI. Smaller, more efficient, and accurate AI models are key to making AI pervasive and affordable. Consequently, the techniques DeepSeek demonstrated are already being applied by mainstream AI companies to stay competitive while avoiding the censorship and security pitfalls that DeepSeek itself presents.
And Qualcomm is perhaps the biggest winner in this evolution of models towards affordable AI that fits and runs well on the devices that already number in the billions.