Found facts:
BrainChip is the Aardvark of the ARM partnership programme.
https://www.arm.com/partners/catalog/results#f:Industry=[Artificial Intelligence]
I know this has been remarked upon before, but I think it bears repeating. When any single category is selected, BrainChip appears at the top of the list in the majority of cases, and this is not an alphabetical artefact. It means that anyone browsing the ARM Partners catalogue under most of those categories will see BrainChip listed first.
In fact, you have to get down to Edge Gateway to shake BrainChip off.
View attachment 13491
But you want more - scroll down to:
Arm and Google: tinyML Pioneers
tinyML is a fast-growing field of machine learning (ML) technologies and applications implemented by Arm and Google to enable ultra-low power ML at the edge. See how tinyML can help solve computer vision, audio recognition, speech recognition, and natural language processing challenges.
https://www.arm.com/partners/artificial-intelligence##
Trillions of ML devices!
So using the standard 1% baseline, that's tens of billions of Google/Akida devices.
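The back-of-the-envelope arithmetic can be sketched as below. Both figures are assumptions carried over from the post (the "trillions" claim and the 1% baseline), not data:

```python
# Illustrative arithmetic only: both inputs are assumptions, not data.
TRILLION = 10**12

total_edge_ml_devices = 1 * TRILLION  # "trillions" -> lower bound of one trillion
baseline_share = 0.01                 # the standard 1% baseline used above

akida_devices = total_edge_ml_devices * baseline_share
print(f"{akida_devices:,.0f} devices")  # 10,000,000,000 devices
```

One trillion devices at a 1% share is already ten billion units, so "trillions" puts the figure in the tens of billions.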
But are Google contemplating outside partnerships?
View attachment 13497
https://services.google.com/fh/files/misc/ai_adoption_framework_whitepaper.pdf
So if Google were to incorporate Akida IP in their transformational partnerships, we could be in more than 20% of trillions of Google AI/ML devices.
Nice to see ARM and Google are on speaking terms! Wonder if anyone at Google has thought to search ARM's partnership lists for available AI/ML tech?
It really impresses on me the size of the ecosystem developing around tinyML, and now neuromorphic computing.
As an example, the tinyML organisation, which started in 2018/2019, has attracted many heavyweights over the last few years, with Microsoft being the latest. BrainChip remains a key sponsor, with other sponsors including TensorFlow, Synaptics, Sony, Samsung, Renesas, Qualcomm, Prophesee, Plumerai, Intel, Infineon, Edge Impulse and ARM.
tinyML and neuromorphic computing are becoming closely entwined. In fact, tinyML is running an event on 29 September for all members (I can guarantee many from the above organisations will be attending) called:
tinyML Neuromorphic Engineering Forum
tinyML is a fast-growing initiative around low-power machine-learning technologies for edge devices. The scope of tinyML naturally aligns with the field of neuromorphic engineering, whose purpose is to replicate and exploit the way biological systems sense and process information within constrained resources.
In order to build on these synergies, we are excited to announce the first tinyML Forum on Neuromorphic Engineering. During this event, key experts from academia and industry will introduce the main trends in neuromorphic hardware, algorithms, sensors, systems, and applications.
Among the speakers are our own Anil Mankar, Christoph Posch, CTO of Prophesee, and Yulia Sandamirskaya from Intel Labs.
Combining Neuromorphic Design Principles with Modern Machine Learning Algorithms
Anil MANKAR, Chief Development Officer, BrainChip
Abstract (English)
Neuromorphic computing takes inspiration from the structure and function of neural systems and seeks to replicate the energy efficiency, tolerance to noise, representational power, and learning plasticity these systems possess. Current machine learning (ML) algorithms, such as convolutional neural networks (CNNs), are capable of state-of-the-art performance in many computer vision applications such as object classification, detection, and segmentation. In this talk, we discuss how our neuromorphic design architecture, Akida, brings these ML algorithms into the neuromorphic computing domain by executing them as spiking neural networks (SNNs). We highlight how hardware design choices such as the event-based computing paradigm, low-bit width precision computation, the co-location of processing and memory, distributed computation, and support for efficient, on-chip learning algorithms enable low-power, high-performance ML execution at the edge. Finally, we discuss how this architecture supports next generation SNN algorithms such as binarized CNNs and algorithms that efficiently utilize temporal information to increase accuracy.
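The CNN-to-SNN idea in the abstract can be illustrated with a toy rate-coding sketch. This is my own simplification of the general technique, not BrainChip's Akida implementation: a ReLU activation is approximated by the firing rate of a simple integrate-and-fire unit.

```python
# Toy rate-coding sketch (a generic simplification, NOT Akida's design):
# a ReLU activation is approximated by the firing rate of a simple
# integrate-and-fire neuron, the basic trick behind executing CNN
# layers as spiking neural networks.

def relu(x):
    return max(x, 0.0)

def spike_rate(x, threshold=1.0, timesteps=100):
    """Integrate the input each timestep; fire a spike and subtract the
    threshold whenever the membrane potential crosses it. The spike
    count divided by time approximates the analog ReLU activation."""
    v, spikes = 0.0, 0
    for _ in range(timesteps):
        v += x
        if v >= threshold:
            spikes += 1
            v -= threshold
    return spikes / timesteps

print(relu(0.37), spike_rate(0.37))  # the firing rate tracks the activation
```

Because negative inputs never drive the potential over the threshold, the neuron's rate is zero exactly where ReLU is zero, and computation only happens when spikes actually occur, which is where the event-based power savings come from.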
Sensors
Neuromorphic Event-based Vision
Christoph POSCH, CTO, PROPHESEE
Abstract (English)
Neuromorphic Event-based (EB) vision is an emerging paradigm of acquisition and processing of visual information that takes inspiration from the functioning of the human vision system, trying to recreate its visual information acquisition and processing operations on VLSI silicon chips. In contrast to conventional image sensors, EB sensors do not use one common sampling rate (=frame rate) for all pixels, but each pixel defines the timing of its own sampling points in response to its visual input by reacting to changes of the amount of incident light. The highly efficient way of acquiring sparse data, the high temporal resolution and the robustness to uncontrolled lighting conditions are characteristics of the event sensing process that make EB vision attractive for numerous applications in industrial, surveillance, IoT, AR/VR, automotive. This short presentation will give an introduction to EB sensing technology and highlight a few exemplary use cases.
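The per-pixel sampling Posch describes can be sketched with a toy model. This is a generic log-intensity change detector as described in the event-camera literature, not Prophesee's actual circuit:

```python
import math

# Toy event-based pixel (generic model, NOT Prophesee's sensor design):
# instead of sampling at a frame rate, the pixel emits an ON/OFF event
# each time its log intensity moves by more than a threshold.

def pixel_events(intensities, threshold=0.2):
    """Return (time, polarity) events for one pixel's intensity trace.
    polarity is +1 for brightening, -1 for darkening."""
    events = []
    ref = math.log(intensities[0])  # reference level at the last event
    for t, i in enumerate(intensities[1:], start=1):
        diff = math.log(i) - ref
        while abs(diff) >= threshold:
            polarity = 1 if diff > 0 else -1
            events.append((t, polarity))
            ref += polarity * threshold  # update reference per event
            diff = math.log(i) - ref
    return events

# A brightening-then-static scene: events occur only while it changes.
print(pixel_events([1.0, 1.5, 2.25, 2.25, 2.25]))
```

Note that the static frames at the end produce no events at all; that absence of redundant data is the sparsity and power advantage the abstract highlights.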
So over the last few years I believe we have transformed, in the eyes of the tech-savvy investing community, from "will neuromorphic computing actually even work from a commercial perspective, and even if it does, surely there are limited applications" to a recognition that tinyML and neuromorphic compute will become "ubiquitous" in our lives, with endless applications.
For BrainChip investors, the addressable market is forming beyond our wildest dreams; now it's a matter of our market share.
From what I can see, BrainChip is front and centre, and it's going to acquire a fair piece of the action.
It really is very good to be a shareholder !