BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
Mercedes is still listed as a customer… the question is whether they want to improve their own AI with this move or whether we are in the game… time will tell.

I personally think Akida 2 fits the Mercedes + Samsung + LG direction exceptionally well.

If Mercedes is shifting toward a heterogeneous compute architecture for next-gen AI-defined vehicles (which the article strongly suggests), then they’ll need both heavyweight compute for generative models and an ultra-low-power layer for continuous sensing. That second layer is exactly where Akida excels.

In-cabin AI will be multimodal and partly generative, combining voice, gaze, gesture, sound events and cabin behaviour. A single large SoC running all of that 24/7 isn’t practical because it would drain power, create heat and require the system to stay constantly awake.

With Akida 2 + TENNs now being positioned as “edge GenAI,” the fit becomes even clearer IMO. Future cabins will rely on small, efficient SLMs that can run locally for privacy, latency and energy reasons. Natural voice interaction, context-aware assistance or conversational feedback from a Samsung or LG in-cabin system won’t require huge LLMs; they’d rely on compact generative or state-space models, which is precisely what Akida 2 is built for.

So while Samsung and LG will handle the heavy compute and platform electronics, Akida could provide a unique, self-learning, sub-watt, always-on intelligence layer that can watch, listen and adapt without waking the main processor. That’s ideal for wake-word detection, anomaly detection, driver and cabin monitoring, and personalised behaviour.
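To make that split concrete, here's a minimal conceptual sketch in Python of the control flow I mean (every class and function name here is made up for illustration, not an actual Akida or vendor API): a tiny always-on detector screens the sensor stream continuously, and the heavyweight main SoC is only woken when something relevant is detected.

```python
import random

# Hypothetical stand-ins -- NOT real BrainChip / Akida / vendor APIs.
# The only point is the event-gated, two-tier control flow.

class AlwaysOnDetector:
    """Tiny always-on model (wake-word / anomaly screening) on a sub-watt NPU."""
    def detect(self, frame) -> bool:
        # Placeholder: a real detector would classify the frame on-device.
        return random.random() < 0.01   # pretend ~1% of frames contain an event

class MainSoC:
    """Heavyweight multimodal / generative stack, normally asleep."""
    def wake_and_handle(self, context_frames):
        print(f"Main SoC woken: running full pipeline on {len(context_frames)} frames")

def cabin_loop(detector: AlwaysOnDetector, soc: MainSoC, frames):
    buffered = []
    for frame in frames:
        buffered.append(frame)
        buffered = buffered[-16:]           # short rolling context window
        if detector.detect(frame):          # cheap, continuous screening
            soc.wake_and_handle(buffered)   # expensive path only on demand
            buffered.clear()

if __name__ == "__main__":
    # Simulated frames standing in for a continuous cabin sensor stream.
    cabin_loop(AlwaysOnDetector(), MainSoC(), frames=[b"frame"] * 1000)
```

The expensive path stays asleep almost all the time, which is the whole power argument for an always-on layer.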

If Mercedes does move to a multi-vendor AI stack, there’s potential for BrainChip’s neuromorphic tech to become the specialised piece that ties the whole system together. At least, this is what I'm hoping.

IMO. DYOR.
 
Here is a comment from LinkedIn regarding heterogeneous compute and BrainChip.
 

Attachments

  • Screenshot_20251109_174915_LinkedIn.jpg

7für7

It will also be interesting to see what they bring out here.

And this, in which Mercedes has invested and is a stakeholder.

 

Bravo

If ARM was an arm, BRN would be its biceps💪!

Blaize is being evaluated by Mercedes for higher-end, L4-type ADAS platforms, with an automotive-grade chip not expected in production before 2028 (see the EETimes excerpt below).

With Blaize there doesn't seem to be any sign of on-chip learning, whereas BrainChip explicitly markets on-device learning/continual learning as a core feature.

I suppose Mercedes is hedging across different layers of its future compute stack, but if they need something truly self-learning and event-driven, then BrainChip is looking good IMO.
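Purely as an illustration of what "on-device learning" can mean in principle, here's a generic nearest-class-mean sketch in Python (this is not BrainChip's actual algorithm, and every name here is made up): a tiny edge model personalises itself from a handful of labelled samples, with no backprop and no cloud retraining.

```python
import numpy as np

# Generic illustration of on-device incremental learning: a simple
# nearest-class-mean classifier whose prototypes are updated from a few
# new samples at the edge. NOT BrainChip's actual algorithm.

class EdgePrototypeClassifier:
    def __init__(self, dim: int):
        self.dim = dim
        self.prototypes = {}   # label -> (mean embedding, sample count)

    def learn(self, embedding: np.ndarray, label: str):
        """Fold one labelled example into the running class prototype."""
        mean, n = self.prototypes.get(label, (np.zeros(self.dim), 0))
        self.prototypes[label] = ((mean * n + embedding) / (n + 1), n + 1)

    def predict(self, embedding: np.ndarray):
        if not self.prototypes:
            return None
        # Pick the class whose prototype is closest to the new embedding.
        return min(self.prototypes,
                   key=lambda lbl: np.linalg.norm(embedding - self.prototypes[lbl][0]))

if __name__ == "__main__":
    clf = EdgePrototypeClassifier(dim=8)
    rng = np.random.default_rng(0)
    # Personalise on a few samples per driver, entirely on-device.
    for _ in range(5):
        clf.learn(rng.normal(0.0, 1.0, 8), "driver_a")
        clf.learn(rng.normal(3.0, 1.0, 8), "driver_b")
    print(clf.predict(rng.normal(3.0, 1.0, 8)))   # likely "driver_b"
```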



Extract (EETimes): Screenshot 2025-11-15 at 7.05.07 pm.png



 