BRN Discussion Ongoing

Tothemoon24

Top 20
Exciting News! 🌟 The Future of AI Hardware is Here! šŸ¤–šŸ’”

As we dive deeper into the fascinating world of Artificial Intelligence (AI), it's crucial to keep an eye on the ever-evolving landscape of AI hardware. šŸ•µļøā€ā™€ļøšŸ’»

šŸ”® What lies ahead for AI hardware? Let's explore some captivating trends:

1ļøāƒ£ Specialized AI chips: Brace yourselves for a wave of specialized chips designed exclusively for AI tasks! These futuristic chips, like ASICs and GPUs, will offer unparalleled computational power and energy efficiency, propelling AI capabilities to new heights. āš”šŸ”‹

2ļøāƒ£ Quantum Computing: Hold onto your hats, folks! Quantum computers are on the horizon, promising mind-boggling speeds for solving complex problems. šŸ§ šŸš€ Quantum computing's potential impact on AI is immense, with the power to revolutionize our understanding and application of AI algorithms.

3ļøāƒ£ Edge Computing: Say hello to the AI of the future, right at your fingertips! šŸŒšŸ“² Edge computing will witness a surge in AI-enabled devices like sensors and autonomous systems, benefiting from localized AI processing. Reduced latency and enhanced privacy will be the name of the game. šŸ¢šŸ 

4ļøāƒ£ Neuromorphic Computing: Inspired by the wonders of the human brain, neuromorphic computing aims to mimic neural networks' incredible parallelism and efficiency. These cutting-edge chips will expedite AI training and inference while conserving energy. It's like AI evolving to match our own cognitive abilities! šŸ§ āš™ļø

5ļøāƒ£ Cloud Infrastructure Advancements: The AI cloud is getting bigger, better, and bolder! šŸŒ©ļøāš™ļø Cloud infrastructure will continue to evolve, with powerful GPUs, FPGAs, and other accelerators fueling the AI revolution. Get ready for efficient, scalable, and lightning-fast AI processing at your fingertips! šŸ’Ŗā˜ļø

The future of AI hardware holds tremendous promise, and we're just scratching the surface. 🌟✨ It's a thrilling time to be part of this remarkable journey, where technology pushes boundaries and reshapes the possibilities of what AI can achieve.
 

  • Like
  • Fire
Reactions: 16 users

Deleted member 118

Guest
  • Haha
  • Like
Reactions: 8 users
  • Like
  • Fire
Reactions: 3 users

Tothemoon24

Top 20

The Scientists Chasing Brain-like Neuromorphic Computing

7 Min Read

Ronni Shendar
June 13, 2023
The Scientists Chasing Brain-like Neuromorphic Computing

The world's fastest supercomputer requires 21 million watts of power. Our brains, in comparison, hum along on a mere 20 watts—roughly the energy needed for a light bulb.
For decades, engineers have been fascinated by how our brains compute. Sure, computers will outperform us in their capacity for mathematical calculations. But they struggle with tasks the human brain seems to handle effortlessly.
Why is that?

Computing like the brain

Justin Kinney, a neuroscientist, bioengineer, and technologist in Western Digital's R&D, explained, "No one understands how the brain works, really."
Kinney would know. He has spent much of his career trying to unravel the brain's secrets. Engineer turned neuroscientist, Kinney hoped to join the neuroscientist community, learn what they already knew, and then apply it to computing.
"Much to my dismay, I found neuroscientists don't actually understand the brain. No one does. And that's because there's little data," he said.
The brain is regarded as one of the most complex known structures in the universe. It has billions of neurons, trillions of connections, and multiple levels ranging from cellular to molecular and synaptic. But the biggest challenge is that the brain is difficult to access.
"The brain is encased in a thick bone," said Kinney, "and if you try to access, poke, or prod it, it will get really upset and hemorrhage, and delicate neurons will die."
Nevertheless, Kinney said progress is being made on various fronts, particularly in the field of recording brain activity, which is good news for those trying to build brain-like computers.
"What we've learned is that there are similarities in computing principles when it comes to how neurons communicate and how we use electronics and circuits to do functional tasks and manipulate digital information," said Kinney.
"Ultimately, we'd like to build next-generation computing hardware utilizing all the brain's tricks for efficient computing, memory, and storage."
[Illustration: scientists inside a forest of neurons]

Neuromorphic computing

Dr. Jason Eshraghian is an assistant professor at the Department of Electrical and Computer Engineering at the University of California, Santa Cruz (UCSC) and leads the university's Neuromorphic Computing Group.
Neuromorphic computing is an emerging field focusing on designing electronic circuits, algorithms, and systems inspired by the brain's neural structure and its mechanisms for processing information.
Eshraghian emphasizes that his goal isn't about replicating biological intelligence, though. "My goal isn't to copy the brain," he said. "My goal is to be useful. I'm trying to find what's useful about the brain, and what we understand sufficiently to map into a circuit."
One area that has been a particular focus for Eshraghian is the spiking mechanism of neurons. Unlike the constant activity of AI models like ChatGPT, the brain's neurons are usually pretty quiet. They only fire when there is something worth firing about.
Eshraghian asked, "How many times have you asked ChatGPT to translate something into Farsi or Turkish? There's a huge chunk of ChatGPT that I personally will never tap into, and so it's kind of like saying, well, why do I want that? Why should that be active? Maybe instead, we can home in on the part of the circuit that matters and let that activate for a brief instant in time."
On his path toward brain-like computing, Eshraghian embraces another trick of the brain: the dimension of time—or the temporal dimension. "There's a lot of argument about how the brain takes analog information from the world around us, converts it to spikes, and passes it to the brain," he said. "Temporal seems to be the dominant mechanism, meaning that information is stored in the timing of a single spike—whether something is quicker or slower."
Eshraghian believes that taking advantage of the temporal dimension will have profound implications, especially for semiconductor chips. He argues that, eventually, we'll exhaust the possibilities of 3D vertical scaling. "Then what else do you do?" he asked. "What I believe is that then you have to go to the fourth dimension. And that is time."
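The spiking and timing ideas described above can be sketched in a few lines of plain Python. This is a textbook leaky integrate-and-fire model, not any particular chip's implementation; the threshold and leak values are purely illustrative:

```python
def lif_simulate(input_current, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: the membrane potential decays
    by `leak` each step, integrates the input, and emits a spike
    (then resets) only when it crosses the threshold."""
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i          # leak, then integrate
        if v >= threshold:        # fire only when there is
            spikes.append(1)      # something worth firing about
            v = 0.0               # reset after the spike
        else:
            spikes.append(0)
    return spikes

def time_to_first_spike(input_current, **kw):
    """Latency code: the value is carried by *when* the first spike
    arrives. A stronger stimulus crosses threshold sooner, so a
    smaller index encodes a larger input."""
    spikes = lif_simulate(input_current, **kw)
    return spikes.index(1) if 1 in spikes else None

strong = [0.6] * 10   # strong constant stimulus
weak = [0.2] * 10     # weak constant stimulus
print(lif_simulate(strong))                              # sparse binary spike train
print(time_to_first_spike(strong), time_to_first_spike(weak))
```

Note how the output is mostly zeros: unlike a dense model where every unit is active on every input, activity (and hence energy, on hardware built for this) is only spent at spike times, and the first-spike latency itself carries information, which is the "fourth dimension" point above.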

Brain-like hardware

Building on spiking and temporal mechanisms, Eshraghian and his team have developed SpikeGPT, the largest spiking neural network for language generation. The neural network impressively consumes about one twenty-second of the energy of comparable large deep learning language models. But Eshraghian emphasizes that new circuits and hardware will be vital to unlocking its full potential.
"What defines the software of the brain?" he asked. "The answer is the physical substrate of the brain itself. The neural code is the neural hardware. And if we manage to mimic that concept and build computing hardware that perfectly describes the software processes, we'll be able to run AI models with far less power and at far lower costs."
Since the dawn of the information age, most computers have been built on the von Neumann architecture. In this model, memory and the CPU are separated, so data is constantly cycling between the processor and memory, expending energy and time.
But that's not how the brain works. The brain is an amazingly efficient device because its neurons hold both memory and computation in the same place.
[Illustration: a brain scan on a neuromorphic computing display]

Now a class of emerging memories—resistive RAM, magnetic memories like MRAM, and even memories made of ceramic—is showing potential for this type of neuromorphic computing by executing the basic multiplications and additions in the memory itself.
The idea isn't farfetched. Recent collaborations, such as Western Digital's collaboration with the U.S. National Institute of Standards and Technology (NIST), have successfully demonstrated the potential of these technologies in curbing AI's power problem.
Engineers hope that in the future, they could use the incredible density of memory technology to store 100 billion AI parameters in a single die, or a single SSD, and perform calculations in the memory itself. If successful, this model would catapult AI out of massive, energy-thirsty data centers into the palm of our hands.
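A rough sketch of why in-memory computing avoids the von Neumann shuttling described above: in a resistive crossbar, weights are stored as cell conductances, inputs arrive as voltages, and each output line sums its products in place (Ohm's law plus Kirchhoff's current law), so the weights never travel to a separate processor. This is a plain NumPy illustration of the principle, not any vendor's design; the small per-cell noise stands in for the imperfect analog programming of such memories:

```python
import numpy as np

rng = np.random.default_rng(0)

# Weights programmed once into the crossbar as cell conductances.
# Analog cells are imperfect, so model a small per-cell programming error.
ideal_weights = np.array([[0.2, -0.5, 0.8],
                          [0.7,  0.1, -0.3]])
programmed = ideal_weights + rng.normal(0.0, 0.01, ideal_weights.shape)

def crossbar_mac(voltages):
    """One analog step: applied voltages times stored conductances,
    with currents summed along each output line. The whole
    matrix-vector product happens inside the memory array, with no
    weight traffic between memory and a separate processor."""
    return programmed @ voltages

x = np.array([1.0, 0.5, -1.0])     # input vector applied as voltages
analog = crossbar_mac(x)
digital = ideal_weights @ x        # exact result for comparison
print(analog, digital)             # close, but not bit-exact: analog is noisy
```

The trade-off the article's "messy, soft, analog, and noisy" line alludes to is visible here: the in-place result only approximates the exact product, which is tolerable for neural network inference but not for conventional exact arithmetic.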

Better than the brain

Neuromorphic computing is an ambitious goal. While the industry has more than 70 years of experience computing hard digital numbers through CPUs, memories are a different beast—messy, soft, analog, and noisy. But advancements in circuit design, algorithms, and architectures, like those brought about by Western Digital engineers and scientists, are showing progress that's moving far beyond research alone.
For Dr. Eshraghian, establishing the Neuromorphic Computing Group at UCSC is indicative of the field's shift from exploratory to practical pursuits, pushing the boundaries of what is possible.
"Even though we say that brains are the golden standard and the perfect blueprint of intelligence, circuit designers aren't necessarily subject to the same constraints of the brain," said Eshraghian. "Processors can cycle at gigahertz clock rates, but our brain would melt if neurons were firing that fast. So, there is a lot of scope to just blow straight past what the brain can do."
Kinney at Western Digital concurs. "We theorize that some of the details of brains may be artifacts of evolution and the fact that the brain has to build itself. Whereas the systems that we engineer, they don't have that constraint yet," he said.
Kinney hopes that by exploring computing functions through materials we can access—silicon, metal, even brain organoids in a dish—we may coincidentally uncover what happens in the brain.
"I believe the question of power efficiency will help us unlock the brain's secrets, so let's go chase that question," he said.
"How does the brain do so much with so little?"
 
  • Like
  • Fire
  • Love
Reactions: 11 users

Frangipani

Top 20
My carrots are bigger than yours! šŸ„•

Your arm muscle, however, is sorely missing an 's'! Your musculus biceps brachii, to be precise… šŸ˜‰



Although it may look like a plural, the correct singular form is actually "biceps".

 
  • Haha
  • Like
Reactions: 7 users

Boab

I wish I could paint like Vincent

The Scientists Chasing Brain-like Neuromorphic Computing
After reading that I feel even more confident in my investment in BRN.
PVDM and Anil are magicians and while others are talking about it our guys have done it.
 
  • Like
  • Love
  • Fire
Reactions: 30 users

Moonshot

Regular
Hi Moonshot,

Akida 1 in its original concept did not include CNN-to-SNN. It was pure SNN, and it had, and still has, native SNN on-chip learning.

Can you provide the actual reference which led to your assumption that Akida 1 does not do native SNN training?
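For anyone unsure what CNN-to-SNN conversion means in this exchange, here is a generic rate-coding sketch of the idea (a common conversion approach in the literature, not a description of BrainChip's actual pipeline): a trained layer's ReLU activations are mapped onto spike rates, so the converted network approximates the original's outputs using binary events spread over time.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    return np.maximum(x, 0.0)

def cnn_layer(x, w):
    """One dense ReLU layer standing in for a trained CNN layer."""
    return relu(w @ x)

def snn_layer(x, w, steps=2000):
    """Rate-coded conversion of the same layer: each neuron integrates
    a constant input current every timestep and fires when it crosses
    threshold 1.0. Its observed firing rate approximates the ReLU
    activation (saturating at one spike per step)."""
    a = w @ x                       # constant input current per neuron
    v = np.zeros_like(a)
    spike_count = np.zeros_like(a)
    for _ in range(steps):
        v += a                      # integrate
        fired = v >= 1.0
        spike_count += fired
        v[fired] -= 1.0             # soft reset keeps residual charge
    return spike_count / steps      # firing rate over the run

w = rng.normal(size=(4, 3)) * 0.3   # illustrative random weights
x = rng.random(3)                   # illustrative input
print(cnn_layer(x, w))
print(snn_layer(x, w))              # close to the ReLU activations
```

The point of contention above then becomes clearer: a converted network of this kind inherits its weights from offline CNN training, whereas a natively trained SNN learns directly in the spiking domain.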
Found it, in the open neuromorphic community chat - run by that fellow Jason Eshraghian
 
  • Like
Reactions: 3 users

Moonshot

Regular
Found it, in the open neuromorphic community chat - run by that fellow Jason Eshraghian
View attachment 38578
Think Brainchip needs to improve the way they market to this community; they've been spamming it, and the community doesn't like it…
 
  • Haha
Reactions: 1 users
Haven't had time to look into Relativity Space yet, which could obviously be another/the real reason why Nandan liked this post.

Or maybe he simply did because he is a friendly "liker", just like Rob Telson appears to be…
I read the whole post. Did I miss where Brainchip was mentioned?
 
  • Haha
Reactions: 1 users

Galaxycar

Regular
It would be great if we could have an extraordinary AGM. A few jobs would go, and we might find a direction that actually appreciates shareholders, listens, promotes revenue and gives full disclosure. Sick of reading about MOUs that no one knew about. We are just mushrooms with this lot in charge. Sick of it.
 

DK6161

Regular
I never said Akida was involved. It is possible though, is it not? Just thought it was interesting. My, my, we are all a very touchy bunch this morning aren't we? Take your eyes off the sp and have some faith and conviction in your investment decisions either way. dear lord.
So we're just going to post anything and say Akida may or may not be involved?
Ok then, post away. Keep us all excited about things that may or may not contain Akida.
Re: SP. Easier said than done, matey.
 
  • Like
Reactions: 4 users

DK6161

Regular
Just saw one of our employees that I follow on Instagram liking Kim Kardashian's post.
Now I am not saying she must be involved with Akida. But I am not saying she is NOT involved at all.
To me this is exciting news regardless.
Anyway, no need to keep your eyes on her Insta page or our SP.
DYOR YMMV
 
  • Haha
  • Like
Reactions: 23 users
Busy on the road today with clients so haven't checked much, but by the look of the SP that's probs not a bad thing I guess.

Anyway, just surfing some keywords and hadn't seen this paper from mid last year.

Nothing special per se on Akida, however it has a nice little comparison table that was interesting.

Obviously with 2.0 we now also have a few more strings to our bow, so to speak.

One author from Intel and another from BMW research.

Original paper HERE


Neuromorphic computing hardware and neural architectures for robotics
Yulia Sandamirskaya (Neuromorphic Computing Lab, Intel, Munich, Germany), Mohsen Kaboli (BMW Group, Department of Research, New Technologies and Innovation, Munich, Germany; Donders Institute for Brain, Cognition, and Behavior, Radboud University, Nijmegen, Netherlands), Jorg Conradt (Kungliga Tekniska Hƶgskolan (KTH), School of Electrical Engineering and Computer Science, Stockholm, Sweden), Tansu Celikel (Georgia Institute of Technology, Atlanta, GA, USA)



Neuromorphic hardware enables fast and power-efficient neural network-based artificial intelligence that is well suited to solving robotic tasks. Neuromorphic algorithms can be further developed following neural computing principles and neural network architectures inspired by biological neural systems. In this Viewpoint, we provide an overview of recent insights from neuroscience that could enhance signal processing in artificial neural networks on chip and unlock innovative applications in robotics and autonomous intelligent systems. These insights uncover computing principles, primitives, and algorithms on different levels of abstraction and call for more research into the basis of neural computation and neuronally inspired computing hardware.

 
  • Like
  • Fire
Reactions: 39 users

Learning

Learning to the Top šŸ•µļøā€ā™‚ļø
This may be old, but did anyone notice that there is neuromorphic hardware inside the Mercedes VISION AVTR? Although it didn't mention BrainChip back then.


"The neuro-inspired approach of the VISION AVTR, including so-called neuromorphic hardware, promises to minimise the energy requirements of sensors, chips and other components to a few watts."




The Mercedes VISION AVTR debuted in January 2020. The VISION EQXX debuted in January 2022.
What are the chances of neuromorphic computing inside the VISION One Eleven? šŸ¤”

Learning šŸ–
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 24 users

alwaysgreen

Top 20
Just saw one of our employees that I follow on Instagram liking Kim Kardashian's post.
Now I am not saying she must be involved with Akida. But I am not saying she is NOT involved at all.
To me this is exciting news regardless.
Anyway, no need to keep your eyes on her Insta page or our SP.
DYOR YMMV

Akida is carrying out vibration analysis on her butt.
 
  • Haha
  • Like
  • Fire
Reactions: 12 users

robsmark

Regular
  • Haha
  • Like
Reactions: 12 users

schuey

Regular
  • Haha
  • Like
Reactions: 9 users
  • Like
  • Fire
Reactions: 5 users

Diogenese

Top 20
Found it, in the open neuromorphic community chat - run by that fellow Jason Eshraghian
View attachment 38578
Hi Moonshot,

Thanks for digging this out.

I don't know Jason's site.

I think Yen should do what I do with Ikea flatpacks - when all else fails, read the instructions.
 
  • Like
  • Haha
Reactions: 4 users