BRN Discussion Ongoing

Frangipani

Top 20
Another recent addition to the BrainChip team is Ali Kayyam.

I like his ambition to contribute his expertise in machine learning and computer vision in order to solve real-world challenges (cf. his past work experience in diverse fields). That’s exactly what we need to show the world!


[3 screenshots attached]
 
Reactions: 20 users

Rach2512

Regular

Prophesee and Tobii partner to make event-based eye-tracking glasses. Could this involve Akida?

Screenshot_20250523_205427_Samsung Internet.jpg
 
Reactions: 15 users

TanCA

Emerged
It is not being released. The company has confirmed it is not being released because they do not have attendee permission (just an excuse not to release). They clearly realise the damage it would do if everyone could see Antonio's disgraceful behaviour at the AGM.

Sometimes meetings are streamed, but it is also common for them not to be. I therefore don't see it as a red flag as such.
 

TanCA

Emerged
Another recent addition to the BrainChip team is Ali Kayyam.

I like his ambition to contribute his expertise in machine learning and computer vision in order to solve real-world challenges (cf. his past work experience in diverse fields). That’s exactly what we need to show the world!


View attachment 85098 View attachment 85099 View attachment 85100

Yes, potentially. I find it difficult to gauge ambition from LinkedIn profiles, though. Everyone is putting their best foot forward and trying to sell themselves.
 
Reactions: 1 user

wilson

Emerged
Hi @wilson,

sorry, I don’t have any technical background, so I need to rely on what we’re being told by our company or others in this field.

In that same CES 2025 interview I quoted from in my previous post, Steve Brightfield said the following:

“… and we announced this fall Akida Pico, which is microwatts, so we can use a hearing aid battery and have this run for days and days doing detection algorithms [?]. So it is really applicable to put on wearables, put it in eye glasses, put it in earbuds, put it in smart watches.” (from 4:18 min)
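As a rough sanity check on the "days and days" figure, here is a back-of-envelope calculation. The battery capacity and power draw below are my own assumptions, not BrainChip numbers:

```python
# Back-of-envelope check of the "days and days" claim.
# Assumed figures (not from BrainChip): a size-312 zinc-air
# hearing-aid cell stores roughly 100 mWh, and the chip plus
# sensor draw an average of 500 microwatts.
battery_mwh = 100.0   # assumed cell capacity, mWh
draw_mw = 0.5         # assumed average draw, mW (500 uW)

hours = battery_mwh / draw_mw
print(f"{hours:.0f} hours, i.e. about {hours / 24:.1f} days")
```

Even with these deliberately generous assumptions on the draw side, the order of magnitude (roughly a week of continuous operation) is consistent with the claim; at "microwatts" rather than hundreds of microwatts, it would stretch to months.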







View attachment 85016


You may also want to have a look at this:



Hi

Thanks for the reply. I also do not come from a technical background. Seems like there is certainly potential in the technology though.
 
Reactions: 2 users

manny100

Regular
Another recent addition to the BrainChip team is Ali Kayyam.

I like his ambition to contribute his expertise in machine learning and computer vision in order to solve real-world challenges (cf. his past work experience in diverse fields). That’s exactly what we need to show the world!


View attachment 85098 View attachment 85099 View attachment 85100
BRN attracts talented staff for the same reason we hold.
At the Pitt Street presentation, during question time, Sean said staff want to be part of something that changes the world and, of course, make a bit of money.
The talent is why we are the leader.
 
Reactions: 6 users

Frangipani

Top 20
Only Akida actually has on-device learning. There has never been any proof of Loihi having the capability, and IBM's tech has never been stated to have it.

Hi @jtardif999,

that is simply not true.

On a side note: wouldn't you expect neuromorphic researchers outside of Intel Labs to have called out Mike Davies and his team over the past 6+ years if your claim were accurate?

Also, have any BrainChip staff ever contested the claim that Loihi has on-chip learning capabilities? (https://redwood.berkeley.edu/wp-content/uploads/2021/08/Davies2018.pdf)
No? I wonder why.

But let the facts speak for themselves…

Here is a video by fortiss, the Research Institute of the Free State of Bavaria for Software-Intensive Systems, whose researchers have been working with Loihi and SpiNNaker for years and recently also partnered with both BrainChip and Innatera (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-454015).

The gesture recognition demo shown is run on Loihi 1.

“What we do here is on-the-fly learning from few examples - so, in the gesture case just a few seconds of presenting time - and as we want to implement this on neuromorphic hardware, we make use of the neuromorphic hardware’s plasticity processor, which enables learning with local signals, so implementation of local learning rules. In order to prevent catastrophic forgetting in the network, we only update the weights of the last layer of the network.

(…) So using this learning rule, we can implement on-chip adaptation to new classes. And as this happens directly on the neuromorphic processor, it can be implemented with only a very low latency and energy overhead. And this means that we can learn new classes in real-time during inference without interrupting the inference for specific learning procedure [sic].


(…) The third stage is the rapid on-chip learning layer, which is neuromorphic processor’s plasticity layer, and it enables learning only from short demonstrations, which can enhance accuracy and also enable personalisation of gestures.”
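The scheme described in the quote — keep the trained feature layers fixed and apply a local learning rule only to the last layer — can be sketched in plain NumPy. This is a hypothetical illustration of the general idea (the names, shapes, and the simple delta rule are my assumptions), not fortiss's or Intel's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

class LastLayerLearner:
    """Frozen feature extractor + plastic last layer (illustrative only)."""

    def __init__(self, n_in, n_feat, n_classes, lr=0.1):
        # Stand-in for the pre-trained part of the network: frozen.
        self.W_fixed = 0.1 * rng.standard_normal((n_feat, n_in))
        # Only this layer is updated, which avoids overwriting the
        # learned features (the "catastrophic forgetting" concern).
        self.W_out = np.zeros((n_classes, n_feat))
        self.lr = lr

    def features(self, x):
        return np.tanh(self.W_fixed @ x)

    def predict(self, x):
        return int(np.argmax(self.W_out @ self.features(x)))

    def learn(self, x, label):
        # Local delta rule: the update uses only signals available at
        # this layer (its input h and its output error), mirroring the
        # "local learning rules" mentioned in the quote.
        h = self.features(x)
        err = -(self.W_out @ h)
        err[label] += 1.0
        self.W_out += self.lr * np.outer(err, h)

# "On-the-fly" learning from a few noisy presentations per class,
# without touching the frozen layers and without a separate
# training phase interrupting inference.
net = LastLayerLearner(n_in=8, n_feat=16, n_classes=3)
protos = [rng.standard_normal(8) for _ in range(3)]
for _ in range(30):
    for label, p in enumerate(protos):
        net.learn(p + 0.05 * rng.standard_normal(8), label)

print([net.predict(p) for p in protos])
```

Restricting plasticity to the readout layer is what makes this cheap enough to run during inference: the expensive feature layers never change, and each update is a single outer product.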

[screenshot attached]
As for the others, they have never been touted as having on-device learning. Just because they may have been touted as neuromorphic doesn't automatically mean they have any learning on chip! The learning associated with Akida is due to its architecture incorporating the JAST learning rule with STDP. Nothing else on the planet currently has this capability, and BrainChip owns the IP and patents. As far as I'm aware, every technology other than BrainChip's does nothing more than brute-force training off chip followed by inference on chip; that is why I'm fully invested in BrainChip and will continue to be for as long as it takes. Albeit it's taking bloody long… 🫤

This claim of yours is equally false.

While the individual on-chip learning capabilities differ from chip to chip (some of them seem to be very basic, although this is all far too technical for me to understand) and none of these capabilities are exactly like those of Akida (which certainly makes it unique in that sense), there are plenty of neuromorphic chips around that are claimed to have the capability to learn on-the-fly during inference, without having to be retrained in the cloud.

A Google search will bring up numerous examples, such as…


[9 screenshots attached]
 
Reactions: 2 users
Also, Sean put the roadmap slide up and said every customer asks to see a roadmap showing plans going forward.
The customer is betting their business on AKIDA as a differentiator. Once you are in, you stay in.
He said BRN was the Edge leader right now, and the vision was to remain so for the next 20 years.
By the time quantum computing is big, we will all be dead and buried.
Incidentally, Sean said BRN's biggest risk was that we do not get the roadmap right.
The roadmap follows trends in models that need support. At the moment the trend is LLMs and SSMs.
Pick the wrong horse and it's a problem.
Thanks Manny
That's the most informative post I've seen in a while. That statement from Sean explains a lot about the background negotiations going on. Hope the customers are interested in cortical columns. Looks like that is where we are heading next. The good thing is it is what is called a sticky tech. Once you have a customer, you have them for a long time.

SC
 
Reactions: 4 users

IloveLamp

Top 20

[image attached]
 
Reactions: 7 users

Frangipani

Top 20
MISSING PARTNERS’ REPORT!

While the new website was being built, MulticoreWare must have suddenly said “Let’s go(,) MYW.AI” and must have taken the NEUROBUS to the extreme EDGX…


Under Webpage “Partners”:

[screenshot attached]





vs. Landing Page:


[screenshot attached]
 

Reactions: 3 users