Whilst I don't believe we are with Qualcomm as yet, my personal layman opinion is that if we are to be in Snapdragon at any point, it would be in the Sensing Hub.
This from their site for a start:
www.qualcomm.com
AI comes in different shapes and forms.
For example, there’s AI that runs in the cloud — using large, deep neural networks — that lets you categorize your entire photo album in seconds.
Then there’s AI that runs only on your smartphone, keeping all the calculation and personal data on the phone while making your social media posts and videos look great.
In this post, we are talking about another kind of AI. It is small and quiet, securely running in the background 24/7 to help you with everyday tasks. Last year at our annual Snapdragon Tech Summit we introduced our
2nd Generation Qualcomm Sensing Hub with a dedicated always-on, low-power AI processor that is uniquely designed to run some tiny, yet powerful AI use cases.
Thanks to this new, dedicated AI processor – now powerful enough to run small AI neural networks on your smartphone – we are seeing a mind-blowing 5X AI performance improvement over our previous-generation Sensing Hub. Because of this performance gain,
we took the neural networks that usually run on the Qualcomm Hexagon Processor and ported them to the Sensing Hub. Now we can offload up to 80 percent of processing from our Hexagon Processor* — all at less than 1 milliamp (mA) of power consumption.
The Sensing Hub's scope seems right up Akida's functional alley, & I presume it's where Prophesee will possibly link in too, given the comments on camera function below.
www.qualcomm.com
Qualcomm Sensing Hub
The latest generation of the Sensing Hub has more horsepower, with an additional
dedicated AI processor to support audio, sensing and camera-based experiences, becoming the most powerful ultra-low-power AI architecture in the industry.
I also read some earlier papers about Zeroth, which we all know about.
Back in 2016 there was the following article which, as we always know in hindsight, gives a good outline of where they were heading.
Neuromorphic was also the catchword for them back then, and you gotta wonder: if they can get their hands on something pretty much COTS and more advanced, that can readily slip into their product offering, then why wouldn't you look for a POC at least? Also coupled with the Prophesee relationship, as we know.
SNN better than CNN, RNN etc., as we all believe, & if AKD1500 has the CNN2SNN flow as indicated, then the transition may be easier: slip into the Sensing Hub and take load off the Hexagon processor?
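For anyone curious what CNN-to-SNN conversion actually involves: at heart it maps a trained layer's ReLU activations onto spike rates. This is only an illustrative sketch in plain NumPy, not BrainChip's actual CNN2SNN tool, and every name and number in it is made up for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# A trained ANN layer (toy): small random weights plus a ReLU activation.
W = 0.05 * rng.normal(size=(4, 8))   # 8 inputs -> 4 units
x = rng.random(8)                    # one input sample
analog = np.maximum(W @ x, 0.0)      # the ReLU output we want to reproduce

# Rate-coded SNN equivalent: integrate-and-fire neurons driven by the same
# weighted input as a constant current; the spike rate over T steps then
# approximates the ReLU (negative drive never reaches threshold, which
# gives the rectification for free).
T = 1000
threshold = 1.0
drive = W @ x                        # constant per-step input current
v = np.zeros(4)                      # membrane potentials
spike_count = np.zeros(4)
for _ in range(T):
    v += drive
    fired = v >= threshold
    spike_count += fired
    v[fired] -= threshold            # soft reset keeps the residual charge
rate = spike_count * threshold / T   # approximates `analog` to within 1/T
```

The soft reset (subtracting the threshold rather than zeroing the membrane) is what keeps the spike count tracking the analog value so closely; a hard reset throws away residual charge and biases the rate low.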
rethinkresearch.biz
Machine learning is no longer a technology confined to supercomputers and massive AI engines – it is coming to the edge of the network and even to the handset, to power new user experiences and make complex decisions close to the user.
Google’s work with Movidius, and IBM’s TrueNorth ‘brain chips’ and device-based deployments of Watson, are examples, but perhaps the most ambitious effort comes from Qualcomm with its Zeroth machine learning platform, which can be embedded in the Snapdragon system-on-chip (SoC).
Initially, Qualcomm’s Zeroth/Snapdragon work focused on bringing AI algorithms and vision processing to cars, but many other applications are in its sights, as highlighted by its new deep learning software developers’ kit (SDK) for the Snapdragon 820 SoC.
Called the Snapdragon Neural Processing Engine, and unveiled at the recent Embedded Vision Summit, the SDK is an attempt by Qualcomm to expand beyond its core smartphone markets –
by enticing automotive, industrial, and IoT developers to its Snapdragon platform. It also sees the chip supplier looking for broader market reach, and accelerated uptake, by enabling a great deal of its brain-like functionality in software.
Back in May 2014, when Qualcomm first introduced the Zeroth project, the goal was to build a neural processing unit (NPU) that would be a hardware module for its advanced SoCs. But dedicated hardware not only adds to cost and power consumption for mobile devices, but also raises barriers to developer support.
So the new SDK for the neuromorphic Zeroth Machine Intelligence Platform – Qualcomm’s equivalent of IBM’s TrueNorth – aims to bring the hardware-specific functions of these brain chips to a more general purpose Snapdragon platform. This will help mobile devices to leverage the potential of silicon that acts in a manner somewhat akin to the human brain.
While IBM is planning on shipping its brain chips to end-customers, who will put them to use in servers,
the Qualcomm approach relies on software to bring those capabilities to a far more adaptable chipset – one that can be used in mobile devices, industrial IoT endpoints and vehicles.
Machine learning is central to the evolution of the IoT. There simply aren’t enough engineers to monitor and manage the billions of devices that are expected to be deployed.
There aren’t enough developer resources to teach image recognition to camera systems, or collision avoidance to autonomous cars, and so the machines have to be able to evolve from a base-level intelligence installed in the factory, and adapt to the dynamic world around them.
If a connected traffic camera had to poll a real human being every time it spotted a car licence plate it didn’t recognize, entire systems would crawl to a halt. From the edge devices to the centralized data center deployments, machine learning enables vast libraries of data to be independently analyzed at speeds far in excess of what a human could manage – in theory, creating value from data that is just lying around, waiting to be used.
Qualcomm says that the SDK will allow OEMs to run their neural network models on the Snapdragon 820, which will allow devices like smartphones, drones, and automobiles to carry out image recognition, object tracking, gesture and facial recognition, and natural language processing.
The new SDK’s flagship features include an
accelerated runtime for the on-device execution of convolutional and recurrent neural networks, configured in software to get the most out of the 820’s cores – the 64-bit Kryo quad-core CPU, Adreno 530 GPU, and the Hexagon 680 DSP.
A core part of the offering is the Snapdragon’s ability to move its workloads between the different cores, in order to achieve an optimal load or efficiency. This heterogeneous compute power should allow for more capable end devices, and support for Caffe and CudaConvNet is also included, which will be appreciated by developers who have used those familiar environments.
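The runtime-selection idea in that paragraph is easy to picture: per workload, pick the most efficient core that can actually run it, falling back toward the CPU. A toy sketch of that dispatch logic (this is not the SNPE API; every class, op name and preference order here is hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Runtime:
    name: str
    available: bool      # is this core present / usable on the device?
    supported_ops: set   # which layer types it can execute

def pick_runtime(op, preference):
    """Return the first preferred runtime that is present and supports op."""
    for rt in preference:
        if rt.available and op in rt.supported_ops:
            return rt
    raise RuntimeError(f"no runtime supports {op!r}")

dsp = Runtime("Hexagon DSP", True, {"conv2d", "relu"})
gpu = Runtime("Adreno GPU", True, {"conv2d", "relu", "lstm"})
cpu = Runtime("Kryo CPU", True, {"conv2d", "relu", "lstm", "argmax"})

# Prefer the low-power DSP, then the GPU, with the CPU as universal fallback.
order = [dsp, gpu, cpu]
assignment = {op: pick_runtime(op, order).name
              for op in ("conv2d", "lstm", "argmax")}
# conv2d lands on the DSP; lstm falls back to the GPU; argmax to the CPU.
```

Real schedulers also weigh current load and numeric precision (a DSP typically runs quantized fixed-point), but the fallback chain is the core idea behind that heterogeneous compute claim.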
“The demand for untethered, mobile learning-driven user experiences is increasing rapidly and Qualcomm Technologies’ customers have been clamoring for tools to help them realize their product visions,” said Gary Brotman, director of product management at Qualcomm Technologies. “With the introduction of the new Snapdragon Neural Processing Engine SDK, we are making it possible for myriad sectors, including mobile, IoT and automotive, to harness the power of Qualcomm Snapdragon 820 and make high-performance, power-efficient on-device deep learning a reality.”
The Zeroth software package is currently used in Qualcomm’s Snapdragon Scene Detect, for spotting objects or faces in a camera view and adjusting the settings accordingly, and its Smart Protect malware detection, which looks for anomalous behavior in the phone that might be a sign of a security compromise.
Qualcomm has referred to this offering as one of the first examples of the emerging age of “conscious computing”. Last fall, Raj Talluri, SVP of product management in mobile computing, said in an interview: “We’ve done a lot of work getting cameras and computer vision optimized in the phone space. Now we’re bringing that technology into this space where the application is a little different, but the technology we built applies perfectly.” The result, he said, is “a camera conscious of what’s happening in the scene”.
That consciousness is achieved with software to perform on-camera video analytics such as object detection, facial detection and recognition and multi-object tracking, harnessing Zeroth to classify different objects.
By bringing the intelligence to the local site, Qualcomm believes the cameras will be able to go a step further than current cloud-based systems – for instance, not just distinguishing residents of the home from strangers, but being able to ignore ‘safe’ incidents, like a car driving past outside, and not waste time and processing power by uploading images of them.
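That "ignore safe incidents" behaviour boils down to a local gate deciding what is worth uploading at all. A toy sketch, with the labels, event shape and confidence threshold all invented for illustration:

```python
# Events the camera is confident are harmless get dropped locally; anything
# else is escalated to the cloud. All values here are hypothetical.
SAFE_LABELS = {"known_resident", "passing_car", "pet"}
CONFIDENCE_FLOOR = 0.6

def should_upload(event):
    """Upload only if the event is not confidently classified as safe."""
    return (event["label"] not in SAFE_LABELS
            or event["confidence"] < CONFIDENCE_FLOOR)
```

Anything the camera is unsure about still goes to the cloud; only confident, known-safe events are dropped at the edge, which is exactly the bandwidth and processing saving the article describes.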
This reveals the Qualcomm agenda – to shift the edge of the network into the device, and to make the mobile gadgets, in which its chips are so strong, the seat of artificial intelligence, rather than the centralized machine learning platforms where Intel is the dominant silicon provider and the smarts are controlled by IBM or Google.
Zeroth is key to this agenda, and the Snapdragon camera points in the same direction.
More prosaically, it also opens up new addressable markets for Qualcomm processors, amid intense pressures in the smartphone market. Talluri said: “We’re tapping into our immense portfolio to
build custom platforms for key segments within the IoT, and connected cameras are one of these areas that can benefit.”