BRN Discussion Ongoing

manny100

Regular
Does anyone know the answer to this Question:
From the news release today:
" BrainChip will also demonstrate its edge LLM model based on Temporal Enabled-Neural Networks (TENNs) at the event."
Was the LLM model included in the original TENNs release, added shortly after, or is it a 'brand new' addition we were not aware of?
Is it the "new invention" Tony Lewis referred to in his LinkedIn post?
 
  • Like
  • Thinking
Reactions: 6 users

rgupta

Regular
Hi Slade,

It's a roller-coaster, mate... are we in or are we out, are we?

So much speculation, and I wonder why. We have always had so many questions about how and why our company makes certain
decisions. Let's be honest here: Sean, as far as he is concerned, is only 3.5 years into building a company's credibility with customers,
potential or otherwise. I think the Board is strong enough to pull Sean into line if they think he's leading the company down a dead-end
alley, but like you, I'd suggest we all wish to see something more solid by 4 years into a 5-year plan, so I guess he's expecting
fireworks over the next 9 months!

On a very positive note, Peter told me a while ago, when Tony was about to take over his role, that Tony was a go-getter who doesn't
mess around. This is exactly what I've witnessed: he is pretty active on LinkedIn, some of his comments are quite detailed, he
is clearly proud of where Peter and Anil left off, and he has taken the reins in a no-mucking-around approach.

I like his style, and tomorrow, later in the day, he will be making a presentation at Nuremberg. He has a different type of personality than
Peter or Anil, who are both reserved, and I say that with the utmost respect; I love Peter and Anil, true gentlemen and the true face of
BrainChip forever.

I have a question for some who have commented about the move to the US market: if it's not a Nasdaq listing, why bother?

My views are clear; convince me why this move is key by year's end or early 2026. I'd like Antonio to sell me the idea. Let's be honest,
he's the gun IP salesman! Apparently!

No Regerts (bloody tattoo).......Tech :ROFLMAO::ROFLMAO:(y)
The more I think, the more I feel this company is very good for traders and equally bad for investors. You could have made millions by trading this stock, but you will end up frustrated as an investor. To be frank, the stock was not ready for listing 10 years ago. Since then there have been a lot of assurances but very few achievements. But see, it goes from 20 cents to 90 cents in a few days and then back to 30 cents; again from 70 cents to $2.34 in a couple of weeks and then back to 13 cents, and in between there was nothing but news and momentum created by traders.
The last announcement of moving to the US without a plan almost broke the long-term investors in this company. It only shows they have no regard for those who believe in them.
So let us stay tuned and see what else they are saying. Definitely, exiting at these low SP levels would only be suicide, but who knows what is in the future.
DYOR
 
  • Like
  • Fire
Reactions: 8 users

manny100

Regular
Does anyone know the answer to this Question:
From the news release today:
" BrainChip will also demonstrate its edge LLM model based on Temporal Enabled-Neural Networks (TENNs) at the event."
Was the LLM model included in the original TENNs release, added shortly after, or is it a 'brand new' addition we were not aware of?
Is it the "new invention" Tony Lewis referred to in his LinkedIn post?
OK, I will make a call.
The TENNs-based LLM model very much looks like an addition to BRN's original TENNs framework. The initial release of TENNs focused on efficiently processing spatiotemporal data; LLMs, to my knowledge, were not on the menu at that time, in a big way anyway.
So it fits in with TL's "new invention" comments.
Correct me if I am wrong.
 
  • Thinking
  • Like
Reactions: 3 users

manny100

Regular
Or is the 'new invention' the "Akida 2 FPGA platform" that will be working with Prophesee at the demonstration?
 
  • Like
  • Thinking
Reactions: 4 users

7für7

Top 20
Or is the 'new invention' the "Akida 2 FPGA platform" that will be working with Prophesee at the demonstration?
Let me make a call…

WE HAVE NO FU…IN IDEA WHATS GOING OoOOONNN WE LOST THE BAAAAALLLLLL


 
  • Like
  • Haha
Reactions: 5 users
OK, I will make a call.
The TENNs-based LLM model very much looks like an addition to BRN's original TENNs framework. The initial release of TENNs focused on efficiently processing spatiotemporal data; LLMs, to my knowledge, were not on the menu at that time, in a big way anyway.
So it fits in with TL's "new invention" comments.
Correct me if I am wrong.
"Keep going Manny I'm still here!"


BrainChip forum.
 
Last edited:
  • Haha
  • Like
Reactions: 9 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Well Larry bought more today rounding off nicely with 300k shares now...........



Happy as Larry
 
  • Like
  • Love
  • Fire
Reactions: 34 users

manny100

Regular
Well Larry bought more today rounding off nicely with 300k shares now...........


Happy as Larry
300 thousand shares, really is a nice round number Larry.

That's a "Let's get serious" holding.

Isn't that what you have 7? 🤔..
 

TheDrooben

Pretty Pretty Pretty Pretty Good
300 thousand shares, really is a nice round number Larry.

That's a "Let's get serious" holding.

Isn't that what you have 7? 🤔..
Not as big a holding as some on here but for Larry......



Happy as Larry
 
Last edited:
  • Like
  • Haha
  • Love
Reactions: 14 users

7für7

Top 20
300 thousand shares, really is a nice round number Larry.

That's a "Let's get serious" holding.

Isn't that what you have 7? 🤔..
No, 50k less... still... who the fu... knows where I will end up
 
  • Haha
  • Fire
  • Like
Reactions: 3 users
Yes, already posted a lot earlier but stay with me....


BrainChip Demonstrates Event-based Vision at Embedded World 2025


Excerpt:

BrainChip’s Akida technology demonstrates the possibilities of embedded AI. As part of its exhibition at Embedded World, the company will showcase the benefits of low-latency and ultra-low power consumption for gesture recognition using the Akida 2 FPGA platform in conjunction with the Prophesee EVK4 development camera. Unlike other approaches, the combination of Prophesee’s event-based vision sensors with Akida’s event-based computing can capture extremely high-speed movement with high sparsity so that only information relevant to the gesture is processed, enabling faster response times. These computer vision systems open new potential in areas such as autonomous vehicles, industrial automation, IoT, security and surveillance and AR/VR.

Integrating Prophesee event-based vision sensors with Akida's event-based processing will enable the development of new, compact SWaP (Size, Weight, and Power) form factors, unlocking fresh product opportunities in the market.

“By combining our technologies, we can achieve ultra-high accuracy in a small form factor, empowering wearables and other power-constrained platforms to incorporate advanced video detection, classification, and tracking capabilities,” said Etienne Knauer, VP Sales & Marketing at Prophesee. “Processing our event-based sensor data streams efficiently leverages their sparse nature, reducing computational and memory demands in the final product.”
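The sparsity point in the excerpt above is easy to sketch in a few lines. This is a toy illustration only, not Prophesee's or BrainChip's actual pipeline: it assumes an "event" is simply a pixel whose brightness changed beyond a threshold between two frames, so downstream compute scales with the number of events rather than the full frame size.

```python
import numpy as np

def events_between(prev_frame, next_frame, threshold=10):
    """Return (row, col, polarity) for pixels that changed by more than threshold."""
    # Compute signed brightness change in a wider dtype to avoid uint8 wrap-around.
    diff = next_frame.astype(np.int16) - prev_frame.astype(np.int16)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[rows, cols])  # +1 brighter, -1 darker
    return np.stack([rows, cols, polarity], axis=1)

# A static 480x640 scene with one small moving object: only its pixels fire.
frame0 = np.full((480, 640), 128, dtype=np.uint8)
frame1 = frame0.copy()
frame1[100:110, 200:210] += 50  # 10x10 patch brightens

events = events_between(frame0, frame1)
sparsity = 1 - len(events) / frame0.size
print(f"{len(events)} events from {frame0.size} pixels "
      f"({sparsity:.2%} of pixels carry no information)")
```

With only the 10x10 patch changing, 100 events are emitted out of 307,200 pixels, which is the "only information relevant to the gesture is processed" idea in miniature.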



If only we were here in the background processing...maybe a near term update opp down the track :unsure:


IDS Unveils uEye EVS: High-Speed Industrial Camera Powered by Prophesee Event-Based Vision

Visit https://en.ids-imaging.com/ueye-evs-cameras.html for further information
New uEye EVS camera integrates the Prophesee-Sony IMX636 sensor, enabling ultra-fast, efficient machine vision with event-based sensing.

03/06/25, 01:22 PM | Industrial Robotics, Factory Automation | IDS Imaging Development Systems Inc.

IDS Imaging Development Systems GmbH, market leader in industrial machine vision, and Prophesee SA, inventor of the most advanced neuromorphic vision systems, today announced that IDS' new uEye EVS camera line incorporates the high-speed, dynamic range and data efficiency of the Prophesee-Sony IMX636HD event-based vision sensor to offer new capabilities for industrial machine vision applications.

The result of extensive collaboration between the two companies, the solution features Prophesee's proven neuromorphic approach to capturing fast-moving objects with significantly less data processing, power and blur than traditional frame-based methods. With these capabilities, the uEye EVS camera is the ideal solution for applications that require real-time machine vision processing at very high speed, such as optical monitoring of vibrations or high-speed motion analyses.
 
  • Like
  • Thinking
  • Fire
Reactions: 16 users

Diogenese

Top 20
Ok, i will make a call.
The LLM model based TENNs very much looks like an addition to BRN's original TENNs framework. The initial release of TENNs focused on efficiently processing spatiotemporal data. LLM's to my knowledge was not on the menu at that time - in a big way anyway.
So it fits in with TL's " new invention " comments.
Correct me if I am wrong.
Hi Manny,

The LLM is equivalent to a database of information which Akida "consults". It is software/data, not hardware. I'm not up to speed on the ins and outs of how TENNs works, but, leaving TENNs aside, if the LLM were used by Akida, Akida would be "configured" into a number of layers with an optimal number of SNN NPUs per layer, each NPU being "programmed" with a weight derived from the LLM.

MAC-addicted Intel uses "transformers" to operate LLMs, as does OpenAI.

https://en.wikipedia.org/wiki/Transformer_(deep_learning_architecture)

TENNs does use MACs, but very small ones. TENNs' superpower is doing the temporal calculations, previously done in CPU software, in silicon. TENNs also does inference in silicon, so combining temporal with inference (classification) in silicon enables NLP as well as video object tracking.
 
  • Like
  • Love
  • Fire
Reactions: 26 users

7für7

Top 20
Is anyone from Germany at the exhibition in Nuremberg this week by any chance?
 
  • Fire
  • Like
Reactions: 3 users
@Diogenese you are the best qualified to answer this. What are the chances the new Onsemi product to be launched at Embedded World is the product Tony Lewis is referring to? The podcast with Onsemi talked about how Akida assisted with Time of Flight in Onsemi tech. Here they are on LinkedIn just launching a new product with ToF.
 
  • Like
  • Fire
  • Love
Reactions: 20 users

The Pope

Regular
Please let us know if you receive a reply. Would be greatly interested in the response.
I sent my detailed enquiry to Australian super tonight and should have a response within 5 business days.
 
  • Like
Reactions: 6 users

manny100

Regular
Hi Manny,

The LLM is equivalent to a database of information which Akida "consults". It is software/data, not hardware. I'm not up to speed on the ins and outs of how TENNs works, but, leaving TENNs aside, if the LLM were used by Akida, Akida would be "configured" into a number of layers with an optimal number of SNN NPUs per layer, each NPU being "programmed" with a weight derived from the LLM.

MAC-addicted Intel uses "transformers" to operate LLMs, as does OpenAI.

https://en.wikipedia.org/wiki/Transformer_(deep_learning_architecture)

TENNs does use MACs, but very small ones. TENNs superpower is doing the temporal calculations, previously done in CPU software, in silicon. TENNs also does inference in silicon, so combining temporal with inference (classification) in silicon enables NLP as well as video object tracking.
Thanks Dio, cheers
 

Esq.111

Fascinatingly Intuitive.
Evening Chippers,

This mob would be worth keeping an eye on.

German-based and dabbling in a bit of everything.

Purely on my radar due to looking at the German exchange...biggest movers.

Apparently picked up a contract with the Australian gov recently.

If we are not in bed with them yet, then why not?


Regards,
Esq.
 
Last edited:
  • Like
  • Thinking
  • Fire
Reactions: 8 users

manny100

Regular
Yes, already posted a lot earlier but stay with me....


BrainChip Demonstrates Event-based Vision at Embedded World 2025


Excerpt:

BrainChip’s Akida technology demonstrates the possibilities of embedded AI. As part of its exhibition at Embedded World, the company will showcase the benefits of low-latency and ultra-low power consumption for gesture recognition using the Akida 2 FPGA platform in conjunction with the Prophesee EVK4 development camera. Unlike other approaches, the combination of Prophesee’s event-based vision sensors with Akida’s event-based computing can capture extremely high-speed movement with high sparsity so that only information relevant to the gesture is processed, enabling faster response times. These computer vision systems open new potential in areas such as autonomous vehicles, industrial automation, IoT, security and surveillance and AR/VR.

Integrating Prophesee event-based vision sensors with Akida's event-based processing will enable the development of new, compact SWaP (Size, Weight, and Power) form factors, unlocking fresh product opportunities in the market.

“By combining our technologies, we can achieve ultra-high accuracy in a small form factor, empowering wearables and other power-constrained platforms to incorporate advanced video detection, classification, and tracking capabilities,” said Etienne Knauer, VP Sales & Marketing at Prophesee. “Processing our event-based sensor data streams efficiently leverages their sparse nature, reducing computational and memory demands in the final product.”



If only we were here in the background processing...maybe a near term update opp down the track :unsure:


IDS Unveils uEye EVS: High-Speed Industrial Camera Powered by Prophesee Event-Based Vision

Visit https://en.ids-imaging.com/ueye-evs-cameras.html for further information
New uEye EVS camera integrates the Prophesee-Sony IMX636 sensor, enabling ultra-fast, efficient machine vision with event-based sensing.

03/06/25, 01:22 PM | Industrial Robotics, Factory Automation | IDS Imaging Development Systems Inc.

IDS Imaging Development Systems GmbH, market leader in industrial machine vision, and Prophesee SA, inventor of the most advanced neuromorphic vision systems, today announced that IDS' new uEye EVS camera line incorporates the high-speed, dynamic range and data efficiency of the Prophesee-Sony IMX636HD event-based vision sensor to offer new capabilities for industrial machine vision applications.

The result of extensive collaboration between the two companies, the solution features Prophesee's proven neuromorphic approach to capturing fast-moving objects with significantly less data processing, power and blur than traditional frame-based methods. With these capabilities, the uEye EVS camera is the ideal solution for applications that require real-time machine vision processing at very high speed, such as optical monitoring of vibrations or high-speed motion analyses.
Prophesee and BrainChip partnered in June 2022. Plenty of time for deals to have been arranged. Hopefully we will get some news in the coming months.
 
  • Like
  • Wow
Reactions: 11 users
Remember this corker of a quote from Spencer Huang, Chief Revenue Officer at Edge Impulse when he was being interviewed by Nandan Nayampally, CMO at BrainChip in the January 2024 podcast.

"I really applaud BrainChip for your technology and your intellectual property and I see every silicon vendor, every device will have your technology or neuromorphic-type technology in it. AI accelerate. This is going to be the norm."

11.54 mins



Then there was also the January 2025 podcast where Spencer Huang talked about how AKIDA is pushing the boundaries of what is possible in edge AI and "making science fiction a reality".

So, if AKIDA is pushing the boundaries of what is possible in edge AI and Qualcomm want to dominate the edge AI market, then wouldn't it make sense for Qualcomm to want to collaborate with us?






B O O M 💥 @Bravo !
 
  • Like
  • Fire
  • Thinking
Reactions: 13 users