BRN Discussion Ongoing

Baneino

Regular
According to my reading they are best suited to different market segments.
I cannot find any peer-reviewed papers comparing them.
Innatera does not use the words 'on-chip learning' as BrainChip does, but talks about 'real-time intelligence' and 'adaptation'. According to CEO Kumar (see the EE Times article): "the main limitation of the Innatera fabric is that it is not self learning, Kumar said, noting that the neuron types are fixed, chosen for their suitability for a wide range of pattern recognition. While functions cannot be changed, parameters can be, he said."
Interesting, the different methods used by both.
It would be great to see a peer-reviewed comparison.
Until then I am a bit uncertain as to the extent of the competition Innatera offers.
The EE Times article is a good read.
The two companies hardly compete directly because they are targeting different areas of application. While BrainChip aims to deliver the "universal AI brain" for edge computing, Innatera focuses on ultra-specialized sensor solutions with minimal energy consumption.
There is no clear "winner". I believe both can be successful if they play to their strengths:
BrainChip via broad partner networks and more universal architecture.
Innatera through radical energy efficiency and sensor integration in niche markets.
 
  • Like
  • Thinking
Reactions: 18 users

manny100

Top 20
I'm not doubting that our technology is superior, Manny, but when it comes to the low-end applications that Pulsar is aimed at, that doesn't really matter.

They are going for the "low-hanging fruit", something BrainChip has never really been focused on.
We've always been looking at the Big End of Town.

We now have the AKIDA E and Pico (although AKIDA 1.0 IP was always available with a minimum number of "Nodes"; Renesas only licensed 2 or something?)

But that requires more investment and commitment from an OEM (and more time) to design in and "tape out" a chip etc., than Innatera is offering with an OTSC.

Actual performance comparisons, or some extra features such as on-chip learning (which probably aren't as necessary for low-end applications), don't really mean much at that end, when you are looking at the investment and commitment differences for the OEMs.

I'm heavily invested here and am on BrainChip's and your "side". I'm just being impartial and honest about this.

We don't have any idea of how much commercial progress Innatera is making with Pulsar.
They may be facing as many market penetration and acceptance issues as us, or more.
And if that's the case, it may end up being a big mistake for them to be mass producing their chips.
Yep, you are right, Innatera appear to be aiming at the low-hanging fruit. Probably a much cheaper chip that does the job where 'BrainChip-style on-chip learning' is not required, e.g. doorbells, basic wearables and IoT.
Agree, we are the obvious choice for serious health, space, DoD and cybersecurity etc. On-chip learning 'BrainChip style' is a big part of that.
IMO on-chip learning is a big part of the future of AI, and we have it.
 
  • Like
  • Love
  • Thinking
Reactions: 16 users

7für7

Top 20
No matter what… yes, competition stimulates business… and this way the technology also finds broader application and acceptance. But it doesn’t make me happy if the competitors manage to offer market-ready products with their customers faster while we’re still talking about test phases, studies, demos and trials… it’s almost midnight… we’re on the brink…
 
  • Like
  • Sad
Reactions: 7 users

Boab

I wish I could paint like Vincent
Every one of these trades is exactly 15 seconds apart. Nothing to see here. Carry on.
1754266478179.png
 
  • Like
  • Haha
  • Wow
Reactions: 27 users

HopalongPetrovski

I'm Spartacus!
Every one of these trades is exactly 15 seconds apart. Nothing to see here. Carry on.
View attachment 89277
It's that good old .1975 bot back in business. May work in our favour today on what is predicted to be a bloody Monday.
At least the condition of the perpetrator is clear. They are a complete CXXT. 🤣
 
Last edited:
  • Haha
  • Like
  • Fire
Reactions: 18 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Screenshot 2025-08-04 at 10.25.47 am.png






My Question:

Can you please describe the significance of this research in layman's terms and explain what this means for companies like BrainChip?

ChatGPT's Response:

Sure — here’s a layman’s summary of the significance of this research:




🧠 What It’s About (Simple Terms)


Scientists and engineers often need to predict and control complex, unpredictable systems — like how a drone balances itself in the wind, or how a robot adjusts to tricky terrain. One powerful mathematical tool for doing this is called Koopman operator theory, which helps them track and forecast how these dynamic systems evolve over time.


Problem? The math and AI models normally used to do this — like traditional neural networks — are heavy, power-hungry, and slow, especially when you want to run them in real time on small devices (like drones, robots, or cars).




⚡ What They Did


This study shows that spiking neural networks (SNNs) — the kind used in neuromorphic chips that mimic how real brains work — can do the same job with way less power and complexity.


Specifically:


  • They needed 43× fewer multiplications and 4× fewer additions than traditional AI models.
  • They only needed 20% of the input data to make predictions.
  • The system could still analyze, learn, and predict the behavior of complex systems — just much more efficiently.
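The efficiency numbers above come from the paper, but the underlying Koopman idea can be illustrated with a toy sketch (plain NumPy, not the paper's spiking implementation). Extended Dynamic Mode Decomposition (EDMD) lifts a nonlinear system into a space of observables where the dynamics become linear; the system, dictionary and constants below are a standard textbook-style example of my own, not anything from the article.

```python
import numpy as np

# Toy discrete-time nonlinear system (a standard Koopman demo, not the paper's
# setup). In the lifted coordinates (x1, x2, x1**2) the dynamics are exactly
# linear, so EDMD can recover a finite Koopman matrix.
a, b = 0.9, 0.5

def step(x):
    x1, x2 = x
    return np.array([a * x1, b * x2 + (a**2 - b) * x1**2])

def lift(x):
    return np.array([x[0], x[1], x[0]**2])  # dictionary of observables

# Collect snapshot pairs (lifted state now, lifted state one step later)
rng = np.random.default_rng(0)
X, Y = [], []
for _ in range(20):
    x = rng.uniform(-1, 1, size=2)
    for _ in range(30):
        y = step(x)
        X.append(lift(x))
        Y.append(lift(y))
        x = y
X, Y = np.array(X).T, np.array(Y).T

# EDMD: least-squares Koopman matrix K with Y ≈ K @ X
K = Y @ np.linalg.pinv(X)

# Forecast purely in the lifted *linear* space and compare with the truth
x0 = np.array([0.8, -0.3])
z, x_true = lift(x0), x0.copy()
for _ in range(10):
    z = K @ z
    x_true = step(x_true)
print(np.allclose(z[:2], x_true, atol=1e-6))  # → True
```

Once `K` is learned, forecasting is just repeated matrix-vector products, which is exactly the kind of cheap, multiply-light workload the article argues spiking hardware can compress even further.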



🚀 Why It Matters


  1. Smarter, Greener AI at the Edge:
    This could make it possible to run advanced predictive models directly on low-power edge devices — like autonomous vehicles, wearables, or even satellites — without needing a big server or draining batteries.
  2. Huge Power Savings for Real-Time Systems:
    Drones that stabilize themselves, robots that adapt to the environment, or cars that need to make split-second decisions — all could benefit from this kind of low-energy, high-speed computing.
  3. Pushes Neuromorphic AI Forward:
    It proves that brain-inspired computing isn’t just cool — it’s practical, and in many cases better than traditional AI for real-world applications.



🧩 What This Means for Companies Like BrainChip


  • Validation: It reinforces the value of spiking neural networks and neuromorphic hardware like BrainChip’s Akida.
  • Opportunity: More demand for these kinds of ultra-efficient chips in robotics, autonomous vehicles, aerospace, and IoT.
  • Differentiation: Shows how neuromorphic tech can do things that even large neural networks struggle with — in a faster, leaner, and more power-conscious way.
 
  • Like
  • Love
  • Fire
Reactions: 21 users

Diogenese

Top 20
What BrainChip is selling:

First we have silicon and/or silicon IP for the following:

Akida 1 SoC/IP

Akida 1500 SoC/IP

Pico IP

Akida/TENNs 2 FPGA/IP Developer Platform access

Akida GenAI FPGA in Development

Akida 3 SoC in development

TENNs is a special case: it is part of the Akida 2 SoC, but is also available as an algorithm which seems to be usable with a CPU/GPU?

TENNs Bing Videos
(TENNs is an) algorithm (not only to run in our hardware, but a very efficient edge-based algorithm that allows you to do LLMs at the edge). SSM ~ S4/S5 (Tod Vierra, 20240105)


Then we have the "off-the-shelf" models and the POA advanced TENNs models to be run on the on-line Akida 2 FPGA.

Ready to use:

AkidaNet Object Detection/Recognition

AkidaNet Regression (Facial Age prediction)

AkidaNet/KWS

AkidaNet/Point Cloud Classification

AkidaNet/TENN Gesture Recognition (DVS)

AkidaNet/TENN Eye Tracking (90% activation sparsity)



ADVANCED TENNS MODELS – Available to run on Akida 2 FPGA Developer Platform – By Request

AkidaNet/TENN Audio Denoising

AkidaNet/TENN Automatic Speech Recognition (ASR)

AkidaNet/TENN LLM




The advanced TENNs models can only be tested on the on-line Akida 2 FPGA Developer Platform by prior arrangement. I guess this is to control access so the system does not get overloaded.

The use of TENNs with LLMs and with ASR opens the door for spoken LLM queries at the edge.
 
  • Like
  • Fire
  • Love
Reactions: 43 users

Diogenese

Top 20
View attachment 89278





Hi Bravo,

I see our friend Dylan Muir from Synsense is in there pitching.
 
  • Sad
  • Wow
Reactions: 2 users

ketauk

Emerged
That's me :) Thanks for sharing Stable Genius!

I am currently lobbying the wife to let me put in a camera too; combined with the gas sensors I would get a lot more useful data.

It does not (yet) use BrainChip Akida, but that is on the roadmap. I want to process the data on the edge in the future so that users' embarrassing toilet data is not hackable. But for now I am collecting data, analysing and building algorithms on top of models to measure & monitor gut health.

For those wondering why I started this project, I have diverticulitis (see attached infographic for those who don't know). I'm 54 and that condition started 4 years ago for me; it is really, really unpleasant - so I want an early warning. I want AI to predict a flare-up is coming so I can start mitigation procedures (no solids, liquids only) and avoid the worst of its symptoms. I think of how my business monitors motors/engines with vibration sensors & predicts future maintenance activities - I want the same for my colon.

Still very early stages as I am currently testing the market: would people want to buy such a product, & can I raise $'s from investors to fund the many clinical trials I need to run?

It's actually a really interesting project for me as it is pushing me hard. I know nothing about biology & I am leveraging AI a lot; it is incredibly impressive (I am an engineer).

Right now I am looking for:
- gut health experts: gastroenterologists looking to improve gut health with AI; I need some on my advisory board and to help with clinical trials
- potential investors: rich people & venture capitalists looking to invest in healthtech & AI
- people keen to put a prototype sensor in their toilet and help me collect more data from as many individuals as possible

So if you lot know of any and think they might be interested in this project, tell them about it :) I'd be very appreciative of any support, even if you just register for our newsletter - that helps me convince investors that there is a big market for this product!
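The vibration-sensor analogy lends itself to a tiny sketch: flag sensor readings that drift outside a calibration baseline, the same "learn normal, alarm on deviation" pattern used in predictive maintenance. Everything below is made up for illustration (synthetic readings, arbitrary calibration window and threshold); it is not based on any real device or clinical data.

```python
import numpy as np

# Hypothetical daily gas-sensor readings (arbitrary units).
rng = np.random.default_rng(42)
baseline = rng.normal(100.0, 5.0, size=60)    # two months of "normal" days
drift = 130.0 + rng.normal(0.0, 5.0, size=5)  # readings trending high
readings = np.concatenate([baseline, drift])

def early_warning(series, calib=30, z_threshold=3.0):
    """Flag readings that deviate strongly from a fixed calibration baseline."""
    ref = series[:calib]                  # calibration period defines "normal"
    mu, sigma = ref.mean(), ref.std()
    z = (series[calib:] - mu) / sigma     # z-score of every later reading
    return z > z_threshold

flags = early_warning(readings)
print(flags[-5:].any())  # the drifted readings should trip the alarm
```

A real system would use a more robust baseline (median/MAD, seasonality) and clinically validated thresholds, but the shape of the problem is the same as monitoring an engine's vibration signature.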
 

Attachments

  • PXL_20250228_093853119.jpg
    PXL_20250228_093853119.jpg
    2 MB · Views: 62
  • PXL_20250205_082555800.jpg
    PXL_20250205_082555800.jpg
    2.6 MB · Views: 69
  • The progression of a flare up.png
    The progression of a flare up.png
    1.9 MB · Views: 64
  • ChatGPT Image Mar 30, 2025, 01_58_52 PM.png
    ChatGPT Image Mar 30, 2025, 01_58_52 PM.png
    1.7 MB · Views: 68
  • Love
  • Like
  • Fire
Reactions: 14 users

7für7

Top 20
What Brainchip is selling:

First we have silicon and/or silicon IP for the following:

Akida 1 SoC/IP

Akida 1500 SoC/IP

Pico IP

Akida/TENNs 2 FPGA/IP Developer Platform access

Akida GenAI FPGA in Development

Akida 3 SoC in development

TENNS is a special case where it can is part of Akida 2 SoC, but is also available as an algorithm which seems to be available for use with a CPU?GPU?

TENNs Bing Videos
(TENNS is an) Algorithm (not only to run in our hardware but a very efficient edge-based algorithm that allow you to do LLMs at the edge. SSM ~ S4/S5 (Tod Vierra 20240105)


Then we have the "off-the-shelf models and the POA advanced TENNs models to be run on the on-line Akida 2 FPGA.

Ready to use:

AkidaNet Object Detection/Recognition

AkidaNet Regression (Facial Age prediction)

AkidaNet/KWS

AkidaNet/Point Cloud Classification

AkidaNet/TENN Gesture Recognition (DVS)

AkidaNet/TENN Eye Tracking (90% activation sparsity)



ADVANCED TENNS MODELS – Available to run on Akida 2 FPGA Developer Platform – By Request

AkidaNet/TENN Audio Denoising

AkidaNet/TENN Automatic Speech Recognition (ASR)

AkidaNet/TENN LLM




The advanced TENNs models can only be tested on the on-line Akida 2 FPGA Developer Platform by prior arrangement. I guess this is to control access so the system does not get overloaded.

The use of TENNs with LLMs and with ASR opens the door for spoken LLM queries at the edge.

Hmm… selling? At the moment we are more in borrowing mode, I would say… Would be happy to see us sell that stuff 😂
 
  • Like
  • Sad
  • Haha
Reactions: 4 users

Doz

Regular

Ketauk, if you’re looking for -

Right now I am looking for:
- gut health experts: gastroenterologists looking to improve gut health with AI, I need some on my advisory board and to help with clinical trials


Maybe reach out to Professor Barry Marshall, ex-BrainChip Advisory Board member; he knows a thing or two about gut health.
1754271820311.png
 
  • Like
  • Fire
  • Love
Reactions: 16 users
What a great project..... How do I register for your newsletter?
 
  • Like
Reactions: 2 users

ketauk

Emerged


Thanks for the tip - I will try to connect with him on Linkedin - https://www.linkedin.com/in/barry-marshall-3241b756/
 
  • Like
  • Love
Reactions: 8 users

ketauk

Emerged
  • Like
  • Love
Reactions: 3 users
No matter what… yes, competition stimulates business… and this way the technology also finds broader application and acceptance. But it doesn’t make me happy if the competitors manage to offer market-ready products with their customers faster while we’re still talking about test phases, studies, demos and trials… it’s almost midnight… we’re on the brink…
Only a couple of minutes to, by my reckoning.

 
  • Fire
  • Like
  • Wow
Reactions: 5 users
Every one of these trades is exactly 15 seconds apart. Nothing to see here. Carry on.
View attachment 89277
What the actual fuck, that's so blatant. How is this not immediately punished? Like, I would think they have software to track that easily? Whaaaat?
 
  • Like
Reactions: 5 users

IloveLamp

Top 20
What the actual fuck thats so blatant, how is this not immediately punished? Like I would think they head software to track that easily? Whaaaat?
You must be new to BRN 😂
 
  • Haha
  • Like
  • Love
Reactions: 22 users

Baneino

Regular
Every one of these trades is exactly 15 seconds apart. Nothing to see here. Carry on.
View attachment 89277
That's exactly what you see when the market is not allowed to decide anything. Micro-trades every 15 seconds, always 1 share, always the same price - that's not an investor, that's a bot making its rounds.
Such behaviour is often seen when either:
- the order book is being tested for liquidity
- artificial volume is to be generated (keyword: wash trading)
- the price is to be kept within a narrow range in a controlled manner
Is this illegal? Not necessarily. Is it transparent? Not necessarily.
But it is a clear indication that the 'free market' does not rule here, but that an algorithm is 'playing for time'.
Such patterns are nothing new at BRN - especially in phases in which the news situation is calm but movement is being prepared under the surface. You don't have to be a tinfoil-hat wearer to recognise this: it's not a coincidence, it's a system.
As always, only the next major movement will decide whether this is good or bad. Until then, don't get nervous - just recognise the game.
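The 15-second pattern described above is easy to test for programmatically: compute the gaps between consecutive trade timestamps and check whether the spacing is suspiciously uniform. The timestamps below are fabricated for illustration, not real BRN market data.

```python
from datetime import datetime, timedelta

# Fabricated trade timestamps: one trade every 15 seconds, like the screenshot.
start = datetime(2025, 8, 4, 10, 0, 0)
trades = [start + timedelta(seconds=15 * i) for i in range(20)]

def looks_algorithmic(timestamps, tolerance_s=0.5):
    """Flag a burst of trades whose spacing is suspiciously regular.

    Returns (flag, mean_gap): flag is True when every inter-trade gap sits
    within tolerance_s of the average gap - near-zero variance in spacing
    is the tell-tale of a scheduled bot rather than human order flow.
    """
    gaps = [(b - a).total_seconds() for a, b in zip(timestamps, timestamps[1:])]
    mean = sum(gaps) / len(gaps)
    return all(abs(g - mean) <= tolerance_s for g in gaps), mean

flagged, interval = looks_algorithmic(trades)
print(flagged, interval)  # → True 15.0
```

Surveillance systems run far more sophisticated versions of this (across accounts, venues and price levels), but the core signal really is this simple.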
 
  • Like
  • Fire
  • Thinking
Reactions: 26 users

The Pope

Regular
Anil Mankar likes this. 🥳
 
  • Haha
  • Like
  • Love
Reactions: 7 users
1754306300407.gif
 
  • Like
  • Fire
Reactions: 4 users