Apologies if this has already been posted, just came across it and had to share
What’s So Exciting About Neuromorphic Computing
By Aaryaa Padhyegurjar
February 10, 2022
The human brain is the most efficient and powerful computer that exists. Even after decades of technological advancement, no computer has managed to match it in efficiency, power consumption, and several other respects.
Will neuromorphic computers be able to do it?
The AKD1000-powered Mini PCIe board (Source: BrainChip Inc.)
The exact sequence of events that takes place when we perform an activity on a computer, or on any other device, depends entirely on its underlying architecture, that is, on how components such as the processor and memory are organised in the solid state.
Almost all modern computers are based on the Von Neumann architecture, a design first described in the 1940s. In it, the processor is responsible for executing instructions and programs, while a separate memory stores those instructions and the data they operate on.
When you think of your body as an embedded device, your brain is the processor as well as the memory. The architecture of our brain is such that there is no distinction between the two.
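To make the contrast concrete, here is a tiny Python sketch of the Von Neumann idea: a single memory holds both the program and its data, and the processor fetches and executes one instruction at a time, shuttling every operand across the processor-memory boundary. The instruction set, addresses, and values below are purely illustrative.

```python
# A toy fetch-decode-execute loop in the Von Neumann style: one memory
# holds both instructions and data, and the processor works through it
# one instruction at a time. Everything here is illustrative.

memory = {
    0: ("LOAD", 10),    # load the value at address 10 into the accumulator
    1: ("ADD", 11),     # add the value at address 11
    2: ("STORE", 12),   # write the result back to address 12
    3: ("HALT", None),
    10: 2, 11: 3, 12: 0,  # data lives in the same memory as the program
}

accumulator = 0
program_counter = 0

while True:
    op, addr = memory[program_counter]   # fetch the next instruction
    program_counter += 1
    if op == "LOAD":
        accumulator = memory[addr]       # every operand crosses the
    elif op == "ADD":                    # processor-memory boundary
        accumulator += memory[addr]
    elif op == "STORE":
        memory[addr] = accumulator
    elif op == "HALT":
        break

print(memory[12])  # prints 5
```

The brain has no such split: the same web of neurons and synapses that stores what we know also does the computing.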
If the human brain outperforms every computer we have built, doesn’t it make sense to modify computer architecture so that it functions more like the brain? That is what many scientists began to realise in the 1980s, starting with Carver Mead, an American scientist and engineer.
Fast forward to today
Today, many major technology companies have dedicated teams working on neuromorphic computing, and groundbreaking research is under way at research organisations and universities around the world. It is safe to say that neuromorphic computing is gaining momentum and will continue to do so as the field advances.
Interestingly, although this is a specialised field with prerequisites spanning solid-state physics, VLSI, neural networks, and computational neurobiology, undergraduate engineering students are extremely curious about it.
At IIT Kanpur, Dr Shubham Sahay, Assistant Professor at the Department of Electrical Engineering, introduced a course on neuromorphic computing last year. Although it is a postgraduate-level course, it saw strong participation from undergraduates as well.
“Throughout the course, they were very interactive. The huge B.Tech participation in my course bears testimony to the fact that undergrads are really interested in this topic. I believe that this (neuromorphic computing) could be introduced as one of the core courses in the UG curriculum in the future,” he says.
Going commercial
Until recently, neuromorphic computing was a term used widely in research but not in the commercial arena. However, as of January 18, 2022, BrainChip, a leading provider of ultra-low-power, high-performance AI technology, has commercialised its AKD1000 AIoT chip. Developers, system integrators, and engineers can now buy AKD1000-powered Mini PCIe boards and use them in their applications, especially those that require edge computing, low power consumption, and high-performance AI.
“It’s meant as our entry-level product. We want to proliferate this into as many hands as we can and get people designing in the Akida environment,” says Rob Telson, Vice President of Worldwide Sales at BrainChip. Anil Mankar, Co-founder and Chief Development Officer of BrainChip, explains, “We are enabling system integrators to easily use neuromorphic AI in their applications. In India, if some system integrators want to manufacture the board locally, they can take the bill of materials from us (BrainChip) and manufacture it locally.”
The 5 sensor modalities (Source: BrainChip Inc.)
What’s fascinating about Akida is that it enables sensor nodes to compute without depending on the cloud. Further, BrainChip’s AI technology is not limited to audio- and video-based learning; it also targets other sensor modalities such as taste, vibration, and smell. It can even be used to build a sensor that performs wine tasting, something BrainChip has shown in a wine-tasting demonstration.
Another major event this year was Mercedes-Benz’s use of BrainChip’s Akida technology in its Vision EQXX electric vehicle. This is a big deal, as it shows Akida being tried and tested for a smart automotive experience. The features Akida provides there, including facial recognition and keyword spotting, consume extremely little power.
“This is where we get excited. You’ll see a lot of these functionalities in vehicles—recognition of voices, faces, and individuals in the vehicle. This allows the vehicles to have customisation and device personalisation according to the drivers or the passengers as well,” says Telson. These really are exciting times.
Neuromorphic hardware, neural networks, and AI
The way neurons work is remarkably similar to an electrical process. Neurons communicate with each other via synapses. When a neuron receives enough input, it produces an electrical signal called a spike (also known as an action potential), an event referred to as neuron spiking. When a neuron spikes, chemicals called neurotransmitters are released across hundreds of its synapses, activating the downstream neurons. That is why this process is so fast.
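To see how a spike arises, here is a minimal leaky integrate-and-fire (LIF) neuron in Python. This is a standard simplified neuron model rather than anything specific to Akida, and the threshold, leak, and input values are illustrative.

```python
import numpy as np

# A minimal leaky integrate-and-fire (LIF) neuron: the membrane potential
# integrates incoming current and leaks over time; when it crosses a
# threshold, the neuron emits a spike and resets. All values illustrative.

def simulate_lif(input_current, threshold=1.0, leak=0.9, reset=0.0):
    potential = 0.0
    spike_train = []
    for current in input_current:
        potential = leak * potential + current   # leaky integration
        if potential >= threshold:               # threshold crossed
            spike_train.append(1)                # emit a spike (action potential)
            potential = reset                    # reset after spiking
        else:
            spike_train.append(0)
    return spike_train

rng = np.random.default_rng(0)
current = rng.uniform(0.0, 0.3, size=50)         # noisy input current
print(simulate_lif(current))                     # a 0/1 spike train over time
```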
Akida MetaTF ML Framework (Source: MetaTF)
Artificial neural networks mimic the logic of the human brain, but they run on conventional computers. The trouble is that conventional computers use the Von Neumann architecture, which is radically different from the architecture of our brain and far more power-hungry. We may not be able to keep scaling CMOS logic on the Von Neumann architecture for much longer; there is a limit to how far silicon can be pushed. With Moore’s Law nearing its end, a better computing mechanism is needed. Neuromorphic computing is the solution because neuromorphic hardware realises the structure of the brain in the solid state.
As neuromorphic hardware matures, we will be able to deploy neural networks directly on it. A spiking neural network (SNN) is a type of artificial neural network that incorporates time into its model: information is transmitted only when a neuron is triggered, that is, when it spikes. SNNs running on neuromorphic chips could transform the way we compute, which is why they are so important for AI.
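This event-driven character can be sketched in a few lines of Python: the tiny spiking layer below does work only at time steps where an input spike actually arrives. The weights, threshold, and spike statistics are illustrative and not tied to any particular chip.

```python
import numpy as np

# A tiny event-driven spiking layer: post-synaptic neurons accumulate input
# only at time steps where a pre-synaptic spike occurs, and stay idle
# otherwise. Weights, threshold, and spike statistics are illustrative.

rng = np.random.default_rng(1)
n_in, n_out, n_steps = 4, 2, 20
weights = rng.uniform(0.2, 0.6, size=(n_in, n_out))
input_spikes = (rng.random((n_steps, n_in)) < 0.2).astype(int)  # sparse spike trains

potential = np.zeros(n_out)
threshold = 1.0
output_spikes = np.zeros((n_steps, n_out), dtype=int)

for t in range(n_steps):
    active = np.flatnonzero(input_spikes[t])   # which inputs spiked at time t
    if active.size == 0:
        continue                               # no event, so no computation
    potential += weights[active].sum(axis=0)   # accumulate only active synapses
    fired = potential >= threshold
    output_spikes[t, fired] = 1
    potential[fired] = 0.0                     # reset the neurons that spiked

print(output_spikes.sum(axis=0))               # spike count per output neuron
```

Sparse, event-driven activity like this is a big part of why spiking hardware can run at such low power: when nothing spikes, nothing is computed.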
How to get started with SNNs
Since the architecture of neuromorphic AI chips is entirely different, it is natural to expect the corresponding software framework to be different too, and to assume that developers must be retrained to work with SNNs. MetaTF, a free software development framework that BrainChip launched in April 2021, aims to remove that barrier.
“We want to make our environment extremely simple to use. Having people learn a new development environment is not an efficient way to move forward,” says Telson. “We had over 4600 unique users start looking at and playing with MetaTF in 2021. There’s a community out there that wants to learn.”
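The snippet below is not BrainChip’s API; it is a generic NumPy illustration of the idea that conversion tools of this kind build on, under the common rate-coding assumption: a layer trained in the usual way can be mapped to integrate-and-fire neurons whose firing rates approximate its ReLU activations.

```python
import numpy as np

# Generic rate-coding illustration (not BrainChip's API): a dense layer with
# ReLU activations is approximated by integrate-and-fire neurons driven by
# spike trains whose firing probabilities encode the input values.

weights = np.array([[0.9, -0.4],
                    [0.3,  0.8],
                    [-0.5, 0.2]])              # a "trained" layer, values made up
x = np.array([0.8, 0.2, 0.5])                  # input activations in [0, 1]

relu_out = np.maximum(0, x @ weights)          # what the ordinary ANN computes

rng = np.random.default_rng(0)
n_steps, threshold = 2000, 1.0
potential = np.zeros(2)
spike_counts = np.zeros(2)

for _ in range(n_steps):
    in_spikes = (rng.random(3) < x).astype(float)  # rate-coded input spikes
    potential += in_spikes @ weights               # integrate weighted spikes
    fired = potential >= threshold
    spike_counts += fired
    potential[fired] -= threshold                  # "soft" reset by subtraction

print("ANN output:      ", relu_out)               # approx [0.53, 0.0]
print("SNN firing rates:", spike_counts / n_steps) # converges to the same values
```

Frameworks like MetaTF aim to automate this kind of mapping, so developers can keep a familiar training workflow instead of designing spiking models from scratch.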
India and the future of neuromorphic computing
When asked about the scope of neuromorphic computing in India, Dr Sahay says, “As of now, the knowledge, dissemination, and expertise in this area is limited to the eminent institutes such as the IITs and IISc, but with government initiatives such as the India Semiconductor Mission (ISM) and NITI Aayog’s national strategy for artificial intelligence (#AIforAll), this field would get a major boost. Also, with respect to opportunities in the industry, several MNCs have memory divisions in India—Micron, SanDisk (Western Digital), etc—that develop the memory elements which will be used for neuromorphic computing.” There is a long way to go, but there is no lack of potential, and more companies will eventually set up neuromorphic teams in India.
BrainChip Inc. is also building a university strategy to make sure students are educated in this arena. Slowly, neuromorphic computing research is making its way from the lab into the commercial world and academia. Someday, we might use it to improve self-driving cars and to create artificial skin and prosthetic limbs that can learn about their surroundings. Consider your smart devices: today, almost all of them depend on the internet and the cloud, but equipped with a neuromorphic chip, they could compute on their own. This is just the start of the neuromorphic revolution.
The author, Aaryaa Padhyegurjar, is an Industry 4.0 enthusiast with a keen interest in innovation and research.