The Scientists Chasing Brain-like Neuromorphic Computing
Ronni Shendar
June 13, 2023
The world's fastest supercomputer requires 21 million watts of power. Our brains, in comparison, hum along on a mere 20 watts, roughly the energy needed for a light bulb.
For decades, engineers have been fascinated by how our brains compute. Sure, computers outperform us at raw mathematical calculation. But they struggle with tasks the human brain seems to handle effortlessly.
Why is that?
Computing like the brain
Justin Kinney, a neuroscientist, bioengineer, and technologist in Western Digital's R&D, explained, "No one understands how the brain works, really."
Kinney would know. He has spent much of his career trying to unravel the brain's secrets. An engineer turned neuroscientist, Kinney hoped to join the neuroscience community, learn what it already knew, and then apply it to computing.
"Much to my dismay, I found neuroscientists don't actually understand the brain. No one does. And that's because there's little data," he said.
The brain is regarded as one of the most complex known structures in the universe. It has billions of neurons, trillions of connections, and multiple levels ranging from cellular to molecular and synaptic. But the biggest challenge is that the brain is difficult to access.
"The brain is encased in a thick bone," said Kinney, "and if you try to access, poke, or prod it, it will get really upset and hemorrhage, and delicate neurons will die."
Nevertheless, Kinney said progress is being made on various fronts, particularly in the field of recording brain activity, which is good news for those trying to build brain-like computers.
"What we've learned is that there are similarities in computing principles when it comes to how neurons communicate and how we use electronics and circuits to do functional tasks and manipulate digital information," said Kinney.
"Ultimately, we'd like to build next-generation computing hardware utilizing all the brain's tricks for efficient computing, memory, and storage."
Neuromorphic computing
Dr. Jason Eshraghian is an assistant professor in the Department of Electrical and Computer Engineering at the University of California, Santa Cruz (UCSC), where he leads the university's Neuromorphic Computing Group.
Neuromorphic computing is an emerging field focused on designing electronic circuits, algorithms, and systems inspired by the brain's neural structure and its mechanisms for processing information.
Eshraghian emphasizes that his goal isn't about replicating biological intelligence, though. "My goal isn't to copy the brain," he said. "My goal is to be useful. I'm trying to find what's useful about the brain, and what we understand sufficiently to map into a circuit."
One area that has been a particular focus for Eshraghian is the spiking mechanism of neurons. Unlike the constant activity of AI models like ChatGPT, the brain's neurons are usually pretty quiet. They only fire when there is something worth firing about.
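The spiking behavior described here is often abstracted as a leaky integrate-and-fire (LIF) neuron: the cell accumulates input, leaks charge over time, and emits a spike only when its membrane potential crosses a threshold. The sketch below is a minimal illustration of that idea; the parameter values are arbitrary choices for demonstration, not drawn from the article or from biology.

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# beta and threshold are illustrative values, not biological constants.

def lif_neuron(inputs, beta=0.9, threshold=1.0):
    """Return a list of 0/1 spikes for a sequence of input currents."""
    membrane = 0.0
    spikes = []
    for current in inputs:
        membrane = beta * membrane + current   # leaky integration
        if membrane >= threshold:              # fire only when worth firing
            spikes.append(1)
            membrane = 0.0                     # reset after a spike
        else:
            spikes.append(0)
    return spikes

# Quiet input produces no spikes; a strong stimulus triggers a single spike.
print(lif_neuron([0.1, 0.1, 0.1, 1.5, 0.1]))
```

Note how the output stays silent for weak inputs, mirroring the "usually pretty quiet" behavior of biological neurons, which is where the energy savings come from.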
Eshraghian asked, "How many times have you asked ChatGPT to translate something into Farsi or Turkish? There's a huge chunk of ChatGPT that I personally will never tap into, and so it's kind of like saying, well, why do I want that? Why should that be active? Maybe instead, we can home in on the part of the circuit that matters and let that activate for a brief instant in time."
On his path toward brain-like computing, Eshraghian embraces another trick of the brain: the dimension of time, or the temporal dimension. "There's a lot of argument about how the brain takes analog information from the world around us, converts it to spikes, and passes it to the brain," he said. "Temporal seems to be the dominant mechanism, meaning that information is stored in the timing of a single spike: whether something is quicker or slower."
Eshraghian believes that taking advantage of the temporal dimension will have profound implications, especially for semiconductor chips. He argues that, eventually, we'll exhaust the possibilities of 3D vertical scaling. "Then what else do you do?" he asked. "What I believe is that then you have to go to the fourth dimension. And that is time."
Brain-like hardware
Building on spiking and temporal mechanisms, Eshraghian and his team have developed SpikeGPT, the largest spiking neural network for language generation. The network consumes 22 times less energy than other large deep learning language models. But Eshraghian emphasizes that new circuits and hardware will be vital to unlocking its full potential.
"What defines the software of the brain?" he asked. "The answer is the physical substrate of the brain itself. The neural code is the neural hardware. And if we manage to mimic that concept and build computing hardware that perfectly describes the software processes, we'll be able to run AI models with far less power and at far lower costs."
Since the dawn of the information age, most computers have been built on the von Neumann architecture. In this model, memory and the CPU are separated, so data is constantly cycling between the processor and memory, expending energy and time.
But that's not how the brain works. The brain is an amazingly efficient device because its neurons hold both memory and computation in the same place.
Now a class of emerging memories (resistive RAM, magnetic memories like MRAM, and even memories made of ceramic) is showing potential for this type of neuromorphic computing by executing the basic multiplications and additions in the memory itself.
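The multiply-and-add in memory that these technologies enable is often pictured as a resistive crossbar: each cell stores a weight as a conductance, input voltages drive the rows, and each column's summed current is, by Ohm's and Kirchhoff's laws, a dot product computed where the data lives. The toy model below sketches that idea in software; the matrix and voltage values are purely illustrative.

```python
# Toy model of in-memory multiply-accumulate on a resistive crossbar.
# Each cell holds a weight as a conductance G[i][j]; applying voltages
# V[i] to the rows yields column currents I[j] = sum_i V[i] * G[i][j],
# so the dot product happens where the weights are stored.

def crossbar_mac(conductances, voltages):
    """Column currents of a crossbar: one multiply-accumulate per column."""
    n_cols = len(conductances[0])
    return [sum(v * row[j] for v, row in zip(voltages, conductances))
            for j in range(n_cols)]

G = [[0.5, 1.0],   # weights stay in place; no shuttling to a separate CPU
     [2.0, 0.0]]
V = [1.0, 3.0]
print(crossbar_mac(G, V))
```

In a physical device this summation is a single analog read rather than a loop, which is why it sidesteps the von Neumann shuttling of data between processor and memory described above.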
The idea isn't farfetched. Recent collaborations, such as Western Digital's work with the U.S. National Institute of Standards and Technology (NIST), have successfully demonstrated the potential of these technologies in curbing AI's power problem.
Engineers hope that in the future, they could use the incredible density of memory technology to store 100 billion AI parameters in a single die, or a single SSD, and perform calculations in the memory itself. If successful, this model would catapult AI out of massive, energy-thirsty data centers and into the palm of our hands.
Better than the brain
Neuromorphic computing is an ambitious goal. While the industry has more than 70 years of experience computing hard digital numbers through CPUs, memories are a different beast: messy, soft, analog, and noisy. But advancements in circuit design, algorithms, and architectures, like those brought about by Western Digital engineers and scientists, are showing progress that's moving far beyond research alone.
For Dr. Eshraghian, establishing the Neuromorphic Computing Group at UCSC is indicative of the field's shift from exploratory to practical pursuits, pushing the boundaries of what is possible.
"Even though we say that brains are the gold standard and the perfect blueprint of intelligence, circuit designers aren't necessarily subject to the same constraints as the brain," said Eshraghian. "Processors can cycle at gigahertz clock rates, but our brain would melt if neurons were firing that fast. So, there is a lot of scope to just blow straight past what the brain can do."
Kinney at Western Digital concurs. "We theorize that some of the details of brains may be artifacts of evolution and the fact that the brain has to build itself. Whereas the systems that we engineer, they don't have that constraint yet," he said.
Kinney hopes that by exploring computing functions through materials we can access (silicon, metal, even brain organoids in a dish), we may coincidentally uncover what happens in the brain.
"I believe the question of power efficiency will help us unlock the brain's secrets, so let's go chase that question," he said.
"How does the brain do so much with so little?"