Here's an extract from an article published in Forbes on 8 October 2021, written by tech analyst Bob O'Donnell.
Firstly, the answer to the title of this article is "yes", and secondly, can someone call Bob and let him know about AKIDA?
Is Neuromorphic Computing The Answer For Autonomous Driving And Personal Robotics?
(Extract)
What’s needed is a type of computing that can really think and learn on its own and then adapt its learning to those unexpected scenarios. As crazy and potentially controversial as that may sound, that’s essentially what researchers in the field of neuromorphic computing are attempting to do. The basic idea is to replicate the structure and function of the most adaptable computing/thinking device we know of—the human brain—in digital form. Following the principles of basic biology, neuromorphic chips attempt to re-create a series of connected neurons using digital synapses that send electrical pulses between them, much as biological brains do.
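For the technically curious, here is a minimal sketch of the leaky integrate-and-fire (LIF) model, a common abstraction behind the kind of digital neurons described above: the membrane potential integrates input, leaks over time, and emits a spike when it crosses a threshold. This is plain Python/NumPy, not tied to any particular chip, and all parameter values are illustrative.

```python
import numpy as np

def lif_step(v, input_current, leak=0.9, threshold=1.0):
    """One timestep of a leaky integrate-and-fire neuron.

    The membrane potential v decays (leaks) each step, integrates
    the incoming current, and emits a spike when it crosses the
    threshold, after which it resets to zero -- a crude digital
    analogue of a biological neuron firing.
    """
    v = leak * v + input_current
    spikes = v >= threshold
    v = np.where(spikes, 0.0, v)  # reset the neurons that fired
    return v, spikes.astype(int)

# Drive 3 neurons with random input for 20 timesteps (illustrative only).
rng = np.random.default_rng(0)
v = np.zeros(3)
for t in range(20):
    v, spikes = lif_step(v, rng.uniform(0.0, 0.4, size=3))
    if spikes.any():
        print(f"t={t:2d} spikes={spikes}")
```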
It’s an area of academic research that’s been around for a few decades now, but only recently has it started to make real progress and gain more attention. In fact, buried in the wave of tech industry announcements that have been made over the last few weeks was news that Intel had released the second generation of its neuromorphic chip, named Loihi 2, along with a new open-source software framework for it that they’ve dubbed Lava.
To put realistic expectations around all of this, Loihi 2 is not going to be made commercially available—it’s termed a research chip—and the latest version offers 1 million neurons, a far cry from the approximately 100 billion found in a human brain. Still, it’s an extremely impressive, ambitious project that offers 10x the performance and 15x the density of its 2018-era predecessor (it’s built on the company’s new Intel 4 chip manufacturing process technology), along with improved energy efficiency. In addition, it provides better (and easier) means of interconnecting its unique architecture with other more traditional chips.
Intel clearly learned a great deal from the first Loihi, and one of the biggest realizations was that software development for this radically new architecture is extremely hard. As a result, another essential part of the company’s news was the debut of Lava, an open-source software framework and set of tools that can be used to write applications for Loihi. The company is also offering tools that can simulate its operation on traditional CPUs and GPUs so that developers can create code without having access to the chips.
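To give a flavor of what that looks like in practice, the sketch below is patterned on the introductory tutorials published with the open-source lava-nc project: two populations of LIF neurons joined by a dense synaptic block, run on the CPU simulator rather than Loihi hardware. The module paths and parameter names (LIF, Dense, Loihi1SimCfg, bias_mant, vth) are assumptions drawn from that codebase as published around the Loihi 2 launch and may differ in later releases.

```python
import numpy as np
# Open-source Lava framework (https://github.com/lava-nc/lava); module
# paths follow its introductory tutorials and may shift between versions.
from lava.proc.lif.process import LIF
from lava.proc.dense.process import Dense
from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.run_configs import Loihi1SimCfg

# Two LIF neuron populations wired through a dense synaptic block.
pre = LIF(shape=(3,), vth=10, du=0, dv=0, bias_mant=2)  # biased so it fires
syn = Dense(weights=5 * np.eye(3))                      # one-to-one weights
post = LIF(shape=(3,), vth=10, du=0, dv=0)

pre.s_out.connect(syn.s_in)   # spike output of `pre` feeds the synapses
syn.a_out.connect(post.a_in)  # weighted activations feed `post`

# Run 10 timesteps on the CPU behavioral simulator -- no Loihi needed.
post.run(condition=RunSteps(num_steps=10), run_cfg=Loihi1SimCfg())
print(post.v.get())           # membrane voltages of the target neurons
post.stop()
```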
What’s particularly fascinating about how neuromorphic chips operate is that, despite the fact that they function in a dramatically different fashion from both traditional CPU computing and parallel GPU-like computing models, they can be used to achieve some of the same goals. In other words, neuromorphic chips like Loihi 2 can provide the desired outcomes that traditional AI is shooting for, but in a significantly faster, more energy-efficient, and less data-intensive way. Through a series of event-based spikes that occur asynchronously and trigger digital neurons to respond in various ways—much as a human brain operates (vs. the synchronous, structured processing in CPUs and GPUs)—a neuromorphic chip can essentially “learn” things on the fly. As a result, it’s ideally suited for devices that must react to new stimuli in real time.
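One way to see where the efficiency comes from: a conventional dense layer multiplies every input by every weight on every timestep, while an event-based layer only touches the weights of the neurons that actually fired. A toy comparison, with sizes and spike rate made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
weights = rng.normal(size=(256, 1024))       # 256 outputs, 1024 inputs

# Dense (CPU/GPU-style): every input participates on every timestep.
activations = rng.normal(size=1024)
dense_out = weights @ activations            # ~256 * 1024 multiply-adds

# Event-based (neuromorphic-style): only spiking inputs cause any work.
spikes = rng.random(1024) < 0.02             # ~2% of neurons fire this step
event_out = weights[:, spikes].sum(axis=1)   # ~256 * 20 additions instead

print(f"active inputs this step: {spikes.sum()} of 1024")
```

With only about 2% of inputs active in this toy step, the event-driven path does roughly a fiftieth of the work, which is the intuition behind the energy and data-efficiency claims above.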
These capabilities are why these chips are so appealing to those designing and building robots and robot-like systems, which autonomous cars essentially are. The bottom line is that it could take commercially available neuromorphic chips to power the kind of autonomous cars and personal robots of our science fiction-inspired dreams.
Of course, neuromorphic computing isn’t the only new approach to advancing the world of technology. There’s also a great deal of work being done in the more widely discussed world of quantum computing. Like quantum computing, the inner workings of neuromorphic computing are extraordinarily complex, and for now the technology is primarily seen as a research project for corporate R&D labs and academia. Unlike quantum, however, neuromorphic computing doesn’t face the extreme physical challenges (temperatures near absolute zero) or the power requirements that quantum currently does. In fact, one of the many appealing aspects of neuromorphic architectures is that they’re designed to be extremely low power, making them suitable for a variety of mobile or other battery-powered applications (like autonomous cars and robots).
Despite recent advancements, it’s important to remember that commercial application of neuromorphic chips is still several years away. However, it’s hard not to get excited and intrigued by a technology that has the potential to make AI-powered devices truly intelligent, instead of simply very well-trained. The distinction may seem subtle, but ultimately, it’s that kind of new smarts that we’ll likely need in order to make some of the “next big things” really happen in a way that we can all appreciate and imagine.
Applications like autonomous driving and personal robotics are proving to be challenging for traditional computing architectures, but a relatively unknown, biology-inspired technology called neuromorphic computing could hold the key to new killer applications.
www.forbes.com