BRN Discussion Ongoing

If the von Neumann system has a bottleneck in limited computation, then in-memory processing has an even bigger bottleneck, and that is limited memory.
How is processing going to evolve beyond a certain point with limited memory?
With neuromorphic, the memory is not only limited but ultra-limited.
Dyor
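To put rough numbers on the two bottlenecks being contrasted here, below is a minimal back-of-the-envelope sketch: on a von Neumann machine the limit shows up as low arithmetic intensity (the memory bus is busier than the ALUs), while an in-memory or neuromorphic design removes the bus but is capped by whatever weight memory sits on the chip. The layer size, on-chip capacity and model size are illustrative assumptions, not figures for Akida or any specific product.

```python
# Illustrative sketch only; all sizes and bandwidths are assumptions.

def arithmetic_intensity(flops, bytes_moved):
    """FLOPs of useful work per byte that has to cross the memory bus."""
    return flops / bytes_moved

# Von Neumann side: one dense N x N matrix-vector multiply in fp16.
N = 4096
flops = 2 * N * N          # one multiply + one add per weight
bytes_moved = 2 * N * N    # each fp16 weight streamed from memory once
ai = arithmetic_intensity(flops, bytes_moved)
print(f"arithmetic intensity: {ai:.1f} FLOP/byte -> memory-bound, "
      "the bus (not the ALUs) sets the speed limit")

# In-memory / neuromorphic side: compute sits next to the weights, so the
# bus disappears, but capacity becomes the ceiling instead.
on_chip_bytes = 8 * 2**20      # assumed: ~8 MB of on-chip weight memory
model_bytes = 7e9 * 2          # a 7B-parameter model stored in fp16
print(f"share of a 7B fp16 model that fits on-chip: {on_chip_bytes / model_bytes:.2%}")
```

Both prints are order-of-magnitude illustrations of the same point: removing the data-movement problem does not remove the capacity problem.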
 

Attachments

  • Screenshot_20250921_084629_LinkedIn.jpg (280.1 KB)
Reactions: Like, Fire, Love · 8 users

rgupta

Regular
So you mean software can handle hardware bottlenecks??
Yes, to an extent, but to grow, the hardware has to be improved as well.
To develop better software you need a lot of headcount.
On the other hand, if model sizes can be reduced, that can improve the performance of the von Neumann architecture as well, as proven by DeepSeek.
Dyor
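As a hedged illustration of the model-size point: during token-by-token generation on ordinary hardware, every weight is streamed across the memory bus once per token, so a smaller or lower-precision model directly raises the throughput ceiling. The bandwidth figure and model sizes below are assumptions for the sake of the arithmetic, not a claim about DeepSeek's actual techniques.

```python
# Rough ceiling on decode speed for a memory-bandwidth-limited machine.
# All numbers are illustrative assumptions.

def tokens_per_second_ceiling(params, bytes_per_param, bandwidth_gb_s):
    """Upper bound when every weight must be read once per generated token."""
    bytes_per_token = params * bytes_per_param
    return (bandwidth_gb_s * 1e9) / bytes_per_token

BANDWIDTH_GB_S = 100  # assumed memory bandwidth of a mid-range system

for label, params, bytes_pp in [
    ("70B params, fp16", 70e9, 2.0),
    ("7B params, fp16",   7e9, 2.0),
    ("7B params, int4",   7e9, 0.5),
]:
    tps = tokens_per_second_ceiling(params, bytes_pp, BANDWIDTH_GB_S)
    print(f"{label}: ~{tps:.1f} tokens/s ceiling")
```

The same bus serves all three cases; only the bytes per token change, which is why shrinking the model helps even without new hardware.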
 
Reactions: Like · 1 user

Labsy

Regular
Look buddy,
I don't think the SP will get any lower. You might as well buy back in now rather than risk missing the boat.
 
Reactions: Haha, Like · 12 users

manny100

Top 20
Neuromorphic Chips: Brain-Inspired AI Is Coming Faster Than You Think | InvestorPlace

"The Final Word on the Silent Revolution That Could Outperform Nvidia​

Neuromorphic computing isn’t just the next chip upgrade; it’s a radical leap forward. These brain-inspired systems promise to make machines smarter, faster, and far more energy-efficient.

If they deliver, they won’t just improve AI… they’ll redefine it.

And like every breakthrough before it, the biggest gains go to those who get in before the crowd catches on.

This is the kind of opportunity that could turn small-cap pioneers into market leaders – and supercharge the incumbents building tomorrow’s AI infrastructure.

It may be early days for neuromorphic computing, but it’s no longer theoretical.

The seeds are planted. The architecture is real. And the use cases are arriving fast.

In fact, there’s one we’re particularly bullish on… humanoid robotics.

Humanoid robotics is no longer sci-fi … it’s the next frontier of AI. These machines require real-time, low-power intelligence that only neuromorphic and next-gen AI chips can deliver. As adoption accelerates, the companies building this tech could anchor a trillion-dollar disruption.

That’s why we’re zeroed in on this space.

Today’s breakthroughs in robotics are possible because AI chips are finally fast enough to simulate robots entirely in virtual space: testing millions of actions digitally before a single move happens in the real world.

Imagine a robot learning to sew by practicing millions of stitches overnight, without ever touching fabric.

This shift is reaching a tipping point, revealing the kind of opportunity that could turn small-cap pioneers into market leaders. And we've got our sights set on one little-known firm ready to step into the limelight."
 
Reactions: Like, Love · 12 users

HopalongPetrovski

I'm Spartacus!
The lovely 'Anastasi in Tech' presents an interesting 30 minutes on the criticality of ecosystems.
Worth a look on a Sunday arvo.

 
Reactions: Like, Fire, Love · 5 users

rgupta

Regular
Look buddy,
I don't think the SP will get any lower. You might as well buy back in now rather than risk missing the boat.
Thanks Labsy, I actually never left the boat.
Cheers
 
Reactions: Like · 2 users

Labsy

Regular
Reactions: Haha · 2 users

7für7

Top 20
Everyone keeps talking about getting on the boat or leaving it…
Meanwhile I’m already sitting alone in the rocket.
No wonder it hasn’t launched yet — it’s still waiting for the hillbillies taking their sweet time paddling a boat to the launch pad.

Come on boys and girls, hurry up! 🚀


Rocket Launch GIF
 
Reactions: Haha, Like · 5 users

rgupta

Regular
Why always so pessimistic then? Are you superstitious?
Sorry, it is not pessimism but realism. To attract a wider public they use sophisticated words, but understanding what those words actually mean tells you what the real issue is.
Take the 'von Neumann bottleneck': the term gets used thousands of times to claim that Akida is free of it. But if you think about what that bottleneck is and what Akida's bottleneck is, you will realise Akida has a bigger bottleneck than the von Neumann architecture.
Recognising your limitations is not pessimism but a sign of strength. BrainChip knows that, and that is why they are targeting edge devices, where not much memory is required. But competing with the von Neumann architecture and GPUs under that big limitation is a big task. That is why BrainChip does not consider NVIDIA a competitor, yet many posters keep comparing Akida with GPUs, which to an extent is no match.
Yes, Akida could make GPUs ten times faster if it could be integrated into a GPU for event-based decision making, but no such work or proposal has been put forward yet.
I like Akida as an accelerator, where using the technology alongside existing hardware is more beneficial than running it stand-alone.
Just my view, and I'm always happy to be corrected.
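On the "event-based decision making as an accelerator" idea, here is a minimal toy sketch of why event-driven processing saves work: a dense layer touches every weight for every input, while an event-driven layer only touches the weights of inputs that actually fired. This is a generic illustration of sparsity with made-up sizes, not a description of how Akida or any GPU integration actually works.

```python
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_OUT = 1024, 256
weights = rng.standard_normal((N_IN, N_OUT))

# A mostly silent input: only ~5% of inputs carry an event (a "spike").
x = np.zeros(N_IN)
active = rng.choice(N_IN, size=N_IN // 20, replace=False)
x[active] = 1.0

# Dense path: every weight is read and multiplied, spikes or not.
dense_out = x @ weights
dense_macs = N_IN * N_OUT

# Event-driven path: accumulate only the weight rows of inputs that fired.
event_out = weights[active].sum(axis=0)
event_macs = len(active) * N_OUT

assert np.allclose(dense_out, event_out)   # same result, far fewer operations
print(f"dense MACs: {dense_macs:,}  event-driven MACs: {event_macs:,} "
      f"(~{dense_macs / event_macs:.0f}x fewer)")
```

With binary spikes the multiplications reduce to additions, which is the kind of saving an event-based stage could, in principle, hand to a conventional pipeline.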
 
Reactions: Like · 1 user
None of that made any sense.
 
Reactions: Like · 2 users