BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
Mercedes is still listed as a customer… the question is whether they want to improve their own AI with this move, or whether we are in the game… time will tell.

I personally think Akida 2 fits the Mercedes + Samsung + LG direction exceptionally well.

If Mercedes is shifting toward a heterogeneous compute architecture for next-gen AI-defined vehicles (which the article strongly suggests), then they’ll need both heavyweight compute for generative models and an ultra-low-power layer for continuous sensing. That second layer is exactly where Akida excels.

In-cabin AI will be multimodal and partly generative, combining voice, gaze, gesture, sound events and cabin behaviour. A single large SoC running all of that 24/7 isn’t practical because it would drain power, create heat and require the system to stay constantly awake.

With Akida 2 + TENNs now being positioned as “edge GenAI,” the fit becomes even clearer IMO. Future cabins will rely on small, efficient SLMs that can run locally for privacy, latency and energy reasons. Natural voice interaction, context-aware assistance or conversational feedback from a Samsung or LG in-cabin system won’t require huge LLMs; they'd rely on compact generative or state-space models, which is precisely what Akida 2 is built for.
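
To make the state-space idea concrete, here is a minimal sketch of the recurrence such models are built on (plain NumPy; the matrices, sizes and decay factor are invented for illustration and are not TENNs internals): a fixed-size state summarises the whole history, so each new input is processed in constant time and memory, which is what makes streaming in-cabin inference cheap.

```python
import numpy as np

# Minimal linear state-space recurrence: h_t = A h_{t-1} + B x_t, y_t = C h_t.
# All shapes and values are illustrative placeholders, not BrainChip's TENNs.
rng = np.random.default_rng(0)
d_state, d_in, d_out = 16, 8, 8
A = np.eye(d_state) * 0.95                      # decaying, stable state transition
B = rng.normal(scale=0.1, size=(d_state, d_in))
C = rng.normal(scale=0.1, size=(d_out, d_state))

def run_stream(xs):
    """Process a stream one sample at a time with constant memory."""
    h = np.zeros(d_state)
    for x in xs:              # no attention over the whole history:
        h = A @ h + B @ x     # the fixed-size state h summarises the past
        yield C @ h

stream = rng.normal(size=(100, d_in))   # stand-in for audio/sensor features
outputs = list(run_stream(stream))
print(outputs[-1].shape)                # (8,)
```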

So while Samsung and LG handle the heavy compute and platform electronics, Akida could take on a unique role: self-learning, sub-watt, always-on intelligence that can watch, listen and adapt without waking the main processor. That’s ideal for wake-word detection, anomaly detection, driver and cabin monitoring, and personalised behaviour.
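
As a rough illustration of that gating pattern (purely hypothetical logic with invented names and thresholds, not BrainChip's API), the always-on layer scores incoming sensor data continuously and only interrupts the big SoC when something crosses a confidence threshold:

```python
import random
import time

WAKE_THRESHOLD = 0.8   # invented confidence cut-off

class DummySensor:
    """Stand-in for a low-rate microphone/IMU feed."""
    def read(self):
        return [random.uniform(-1.0, 1.0) for _ in range(16)]

def score_event(samples):
    """Stand-in for the tiny always-on model (wake-word/anomaly score)."""
    return max(abs(s) for s in samples)        # placeholder heuristic

def wake_main_soc(reason):
    """Stand-in for the interrupt that powers up the heavy compute."""
    print(f"waking main SoC: {reason}")

def always_on_loop(sensor, steps=100):
    # The small model runs continuously within a tiny power budget;
    # the power-hungry SoC stays asleep until it is actually needed.
    for _ in range(steps):
        score = score_event(sensor.read())
        if score > WAKE_THRESHOLD:
            wake_main_soc(f"candidate event, score={score:.2f}")
        time.sleep(0.02)                       # ~50 Hz polling, illustrative

always_on_loop(DummySensor())
```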

If Mercedes does move to a multi-vendor AI stack, there’s potential for BrainChip’s neuromorphic tech to become the specialised piece that ties the whole system together. At least, this is what I'm hoping.

IMO. DYOR.
 
Here is a comment from LinkedIn regarding heterogeneous compute and BrainChip.
 

[Attachment: Screenshot_20251109_174915_LinkedIn.jpg]


7für7


It will also be interesting to see what they bring out here


And this, in which Mercedes has invested and is a stakeholder

 

Bravo


Blaize is being evaluated by Mercedes for higher-end ADAS L4-type platforms, with an automotive-grade chip not expected in production before 2028 (see the excerpt from EE Times below).

With Blaize there doesn't seem to be any sign of on-chip learning, whereas BrainChip explicitly markets on-device learning/continual learning as a core feature.
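
For intuition on what lightweight on-device learning can look like (a generic sketch with invented class names and shapes, not BrainChip's actual mechanism or API), one common edge approach freezes the feature extractor and updates only per-class prototype vectors, so a single enrolment example is enough and no backpropagation is needed:

```python
import numpy as np

class PrototypeLearner:
    """Toy continual learner: nearest-class-mean over frozen embeddings.

    Illustrative only -- it shows why prototype/last-layer updates are cheap
    enough for the edge (no backprop, no gradient storage), not how Akida's
    on-chip learning is actually implemented.
    """

    def __init__(self, dim):
        self.dim = dim
        self.protos = {}   # label -> (mean embedding, sample count)

    def learn(self, embedding, label):
        # One-shot/incremental update: running mean per class.
        mean, n = self.protos.get(label, (np.zeros(self.dim), 0))
        self.protos[label] = ((mean * n + embedding) / (n + 1), n + 1)

    def predict(self, embedding):
        if not self.protos:
            return None
        return min(self.protos,
                   key=lambda c: np.linalg.norm(embedding - self.protos[c][0]))

rng = np.random.default_rng(1)
learner = PrototypeLearner(dim=32)
learner.learn(rng.normal(size=32), "driver_A")   # single enrolment example
print(learner.predict(rng.normal(size=32)))      # -> "driver_A"
```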

I suppose Mercedes is hedging across different layers of their future compute stack, but if they need something truly self-learning and event-driven, then BrainChip is looking good IMO.



Extract: [Screenshot 2025-11-15 at 7.05.07 pm.png — EE Times excerpt]



 

7für7

Fingers crossed they realise that they need us, no matter what they want to achieve in this segment! And let’s hope they don’t pull some rabbit out of the hat 🎩
 
 

7für7

It would be nice if they could see a use case for optimising their processes via Akida in this field too… we know Akida is fantastic at object recognition, and we could also help improve manufacturing processes and so on… thoughts on that?



 

7für7

It’s old but interesting though


Event-based Vision Sensors (“EVS”) are said to have been modeled after human optic nerves. Completely different from previous frame-based image sensors, EVS can perceive only changes that have occurred in subjects, in a form like a path or trajectory, so they do not use the concept of “frames” in which sensing targets are captured as images, and have the characteristic of being able to monitor even objects moving at high speed in real time. Also, in keeping with their origin of being modeled after optic nerves, EVS are anticipated to function as the eyes of AI or robots, and it is believed that if they can be commercialized, they will allow robots to make judgments and decisions even more quickly. These are the EVS that have been developed by the Sony Semiconductor Solutions Group (“SSS Group”) in collaboration with PROPHESEE, which are expected to breathe new life into various fields including the industrial machinery sector. Their development involved difficulties and discoveries that were only possible precisely because they are entirely new types of vision sensors.
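
A simple way to see the “changes only, no frames” principle in action (an emulation sketch with an invented threshold, not Sony/PROPHESEE's actual sensor pipeline) is to generate events from the per-pixel log-intensity difference between two ordinary frames: static pixels emit nothing, so data volume scales with motion rather than with resolution × frame rate.

```python
import numpy as np

CONTRAST_THRESHOLD = 0.2   # invented; real sensors use per-pixel analog thresholds

def frames_to_events(prev_frame, next_frame, t):
    """Emit (x, y, t, polarity) events where log intensity changed enough."""
    eps = 1e-6
    delta = np.log(next_frame + eps) - np.log(prev_frame + eps)
    ys, xs = np.nonzero(np.abs(delta) > CONTRAST_THRESHOLD)
    return [(x, y, t, 1 if delta[y, x] > 0 else -1) for x, y in zip(xs, ys)]

rng = np.random.default_rng(2)
a = rng.uniform(0.1, 1.0, size=(4, 4))   # "previous" frame
b = a.copy()
b[1, 2] *= 2.0                           # one pixel brightens (something moved)
print(frames_to_events(a, b, t=0.0))     # -> [(2, 1, 0.0, 1)]
```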

 
  • Thinking
  • Fire
  • Like
Reactions: 3 users

manny100

For auto safety features focused on driver monitoring and adaptive sensing, Akida with on-chip learning is far more suitable than Blaize.
For other features, Blaize may be more suitable.
It may be that Mercedes is not yet ready to put AKIDA safety features on the road and is still testing them.
I would not in any way expect AKIDA to be the only chip in the car. It's likely to be horses for courses.
I think 'adaptable on-chip learning' would be compulsory for auto safety.
 

7für7


It would already be enough if they implemented Akida just to reduce energy-hungry systems so EVs and plug-ins can achieve greater range. We can support many different use cases – it depends on how each customer wants to use Akida. Akida doesn't need to sit in safety-critical systems like autonomous driving and the like…
For example:

Reducing the compute load for sensors (cameras, radar, lidar, interior monitoring) – I don't mean running the whole system, just optimising the process

Event-based processing instead of continuous full-load processing

“Guardian” function: Akida runs lightweight and only wakes up CPU/GPU when needed

Less, and more efficient, cloud/data communication (more on-device AI)

Lower overall power consumption of ECUs and SoCs

Less heat generation… less cooling effort for electronics

More efficient energy management in the vehicle (driving profile, recuperation, HVAC)

Ability to use smaller/more efficient power and on-board-network components

Overall reduction of continuous electronic power draw… more range for EVs (see the rough estimate below the list)

I think if the automotive sector used us for just these points, we would win a lot of trust in the industry.
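
To put a rough number on the range point above (every figure below is an invented placeholder assumption, not measured data), the back-of-envelope arithmetic looks like this:

```python
# Back-of-envelope: how continuous electronic draw eats EV range.
# Every number here is an assumed placeholder, not a measured figure.

always_on_soc_watts = 30.0     # assumed continuous draw of an always-awake AI SoC
gated_watts = 3.0              # assumed average draw with a sub-watt sentinel
                               # plus the big SoC duty-cycled on demand
consumption_wh_per_km = 180.0  # assumed EV consumption (Wh/km)
hours_per_day = 2.0            # assumed daily driving time

saved_wh = (always_on_soc_watts - gated_watts) * hours_per_day
extra_km = saved_wh / consumption_wh_per_km
print(f"saved {saved_wh:.0f} Wh/day ≈ {extra_km:.2f} km extra range/day")
# With these assumptions: 54 Wh/day ≈ 0.30 km/day -- modest per trip, but the
# same gating also cuts heat, cooling effort and component sizing.
```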
 