BRN Discussion Ongoing

Hi @SERA2g,

no, we’re not only talking about semantics here, and neither do I agree with you on what you’ve termed “the overall outcome”. The way you suggestively worded it, naming Intel’s neuromorphic processor first (“Mercedes have looked at both Loihi and Akida and as we know, utilised akida in the EQXX”), implies a deliberate decision against Loihi in favour of Akida, which does not appear to align with what actually occurred.

Frankly speaking, I find it rather bizarre that you replied to me without even taking the time to re-read your 26 January 2022 HC post (which you reposted here on TSE on 7 February 2022), and which you say you recall “being very sequential and clear that Intel came before Akida with respect to Mercedes”, even though I had specifically pointed out that the post in question did NOT contain any convincing evidence of that. All it contained was a far-fetched assumption on your part that simply doesn’t hold water.

Feel free to provide proper proof, though: hard evidence that the MB - Intel collaboration on NC predates the MB - BrainChip collaboration.

As far as I can tell, all available evidence points to Mercedes-Benz having worked with Akida since at least October 2019 (for in-car gesture recognition in combination with event-based cameras) and, from at least mid-2020 onwards, on powering hotword detection in the VISION EQXX. They only got access to Loihi months after they had already started working on the concept car that utilised Akida to make the “Hey, Mercedes” voice control system five to ten times more efficient than conventional voice control.

And if my timeline is correct, the prevailing narrative apparent in the FF quote I shared in my previous post is fundamentally flawed, as it suggests that the Mercedes-Benz engineers favoured cool new kid on the block Akida over Loihi even though they had already invested years of research into Intel’s neuromorphic processor prior to being introduced to BrainChip (read: wasted a lot of money).

And yet you claim “But the narrative is still materially unchanged”?
No, @SERA2g, absolutely not.

When we apply Ockham’s razor, the simple and obvious reason why MB went with Akida to optimise the energy efficiency of keyword spotting in the EQXX appears to have been that it was the neuromorphic processor that was already available to them at the time - and they evidently ended up happy with what it accomplished and have verifiably shown continued interest in Akida over the following years.

Would they have been more/equally/less happy with Loihi at the time? We will never know. We do know, however, that consortium lead Mercedes-Benz picked Loihi over Akida (and over other neuromorphic processors) for the NAOMI4Radar project last year, and that they are happy with those results, too.

Project lead Gerrit Ecke, who as I mentioned earlier this week has now left MB to embark on a new adventure with German defense-tech startup Project Q, said the following about the concluded multi-partner project that ran from June 2024 to August 2025 and was partially funded (56%) by the German government:


View attachment 92172






Here is a (non-exhaustive) compilation of verifiable facts surrounding MB & NC, which some readers may find useful (additional links in my previous posts on this topic):

- By the time MB engineers started working on the EQXX drawing board around mid-2020 (roughly: date of reveal minus 18 months), they had already been evaluating Akida for at least 9 months or so (cf. Gunjan Gupta’s LinkedIn profile: https://www.linkedin.com/in/guptagunjan19/).

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-476278


- Approximately half a year after embarking on the EQXX concept car project, Mercedes-Benz joined the Intel Neuromorphic Research Community (INRC), which gave them access to Loihi.

Intel Labs publicly announced MB as a new corporate member of the INRC on 3 December 2020. There is no conclusive evidence that they were evaluating Loihi through a collaborating partner any earlier than that.
In the following LinkedIn post, Magnus Östberg confirms that Mercedes-Benz did not become an INRC Research Member until 2020:

https://www.linkedin.com/posts/magnus-%C3%B6stberg_neuromorphic-ml-ai-activity-7159123320410423297-Y-aX?


- On New Year’s Eve 2023, @Pom down under spotted the CV of Arizona State University PhD student and former BrainChip ML intern Vishnu Prateek Kakaraparthi, which revealed that one of the things he had done between June 2023 and August 2023 while interning at BrainChip was that he “spearheaded the development of distracted driving technology, achieving energy and processing gains, positioning for potential project collaboration with Mercedes, and showcasing the capabilities of the AKD1000 in the automotive safety domain.”

When I revisited his LinkedIn profile to check whether he had ever added information on whether this potential collaboration eventually came to fruition (the answer is no, which to me suggests we may have lost out to a competitor at the time), I newly discovered that he had also “worked as the lead on neuromorphic anomaly detection research for Mercedes Vision EQXX Concept, achieving 4x energy efficiency”.
He even lists this as one of his career highlights!

Hmhhh, that was presumably also something he accomplished during his 2023 BRN summer internship (cf. what I marked in yellow in his LinkedIn profile), which means said anomaly detection research would have taken place 1.5 years after the EQXX had been revealed on the world stage. Interesting…

At the same time, it gives us another hint that the potential collaboration project with MB on distracted driving technology may not have eventuated, as surely he would also have listed that under career highlights?!

https://www.linkedin.com/in/prateekvishnu/


View attachment 92173
View attachment 92174


- We also know that between February 2024 and February 2025, two working students from Uni Stuttgart (employed consecutively), Sreelakshmi Rameshan and Krishnaprasad Thoombayil Satheesh, wrote their Master’s theses on topics involving comparisons between Akida, Loihi 2 and SynSense Speck. The more recent thesis experimented with converting ANNs to SNNs for child presence detection, as well as with direct training of SNNs, and with deploying all those (I assume) SNNs to BrainChip, Intel and SynSense neuromorphic hardware. Unfortunately, that thesis hasn’t (yet?) been uploaded to the Uni Stuttgart online publications server, so we can’t check the results of this benchmarking.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-476278
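As an aside for readers unfamiliar with the technique: converting an ANN to an SNN typically means mapping each ReLU activation to the firing rate of a spiking neuron. Here is a minimal, generic rate-coding sketch of that idea (my own illustration, not code from either thesis or from any vendor SDK):

```python
def relu(x):
    return max(0.0, x)

def spike_rate(x, t_steps=1000):
    """Firing rate of a simple integrate-and-fire neuron driven with
    constant input x for t_steps timesteps (reset by subtraction)."""
    v, spikes = 0.0, 0
    for _ in range(t_steps):
        v += x            # integrate the input current
        if v >= 1.0:      # threshold crossed: emit a spike
            spikes += 1
            v -= 1.0      # subtractive reset preserves the residual
    return spikes / t_steps

# The spiking neuron's rate approximates the ReLU activation:
# spike_rate(0.3) is close to relu(0.3) = 0.3, and negative inputs never spike.
```

In a real conversion pipeline, weights are additionally normalised so that activations stay within the neurons’ firing range; that detail is omitted here.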


- In addition, we know that Mercedes has also been evaluating neuromorphic processors by Innatera and very likely also by Applied Brain Research (Chris Eliasmith’s University of Waterloo spin-out), and that they are aware of Akida 2.0 as well.

Plus, we know that Neurobus was working with Mercedes-Benz on ADAS sometime last year (Akida? Loihi? Possibly benchmarking both in combination with Prophesee event-based sensors? BrainChip and Intel were both listed as partners, alongside Prophesee, on the Neurobus website before it was redesigned earlier this year).
In a July 2025 Neurobus job ad, however, Mercedes-Benz no longer showed up in the list of partners they were working with at that time. Maybe the project had already concluded by the time Gregor Lenz stepped down from his role as CTO?

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-440033

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-441454




- After a BRN retail shareholder had publicly expressed his hope on LinkedIn that MB would be implementing BrainChip technology into their consumer vehicles soon, Magnus Östberg responded by saying “We look at all possible solutions!”



- In October 2024, Mercedes-Benz not only announced
a) the partially government-funded NAOMI4Radar project (06/24-08/25) to optimise radar data processing in autonomous vehicles using NC and (according to project partner TWT GmbH Science & Innovation) to demonstrate the industrial applicability of Loihi 2 (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-473794)
as well as
b) a cooperation with Karlsruhe University of Applied Sciences focusing on event-based cameras (Project EVSC = Event Vision Stream Compression, cf. https://www.h-ka.de/die-hochschule-.../kameratechnologien-im-neuromorphic-computing), but they also shared
c) that they had signed an MoU with the University of Waterloo “to collaborate on research led by Prof. Chris Eliasmith in the field of neuromorphic computing. The focus is on the development of algorithms for advanced driving assistance systems […] While preserving vehicle range, safety systems could, for example, detect traffic signs, lanes and objects much better, even in poor visibility, and react faster. Neuromorphic computing has the potential to reduce the energy required to process data for autonomous driving by 90 percent compared to current systems.”
https://group.mercedes-benz.com/company/news/open-innovation-canada.html

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-443917


- In December 2024, Mercedes-Benz invited journalists to an Open Day at their Future Technologies Lab, where they briefed them on promising innovations from their “early-tech kitchen”, including the potential benefits of neuromorphic computing. In this context, they made it clear that they consider NC part of a research project and that they expect such neuromorphic hardware to become available in the 2030s…

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-442024

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-442165


This time frame was also confirmed by Mercedes-Benz to a TSE forum member on LinkedIn on 29 January 2025:

“… we’re currently looking into neuromorphic computing as part of a research project. Depending on the further development progress, integration could become possible within a timeframe of 5 to 10 years”.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-449452



- October 2025: According to Athos Silicon Co-Founder and CTO François Piednoël, who was the mSoC Chief Architect at Mercedes-Benz North America until (Northern hemisphere) spring, Akida does not pass minimum requirements for Level 3 and Level 4 automated driving.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-476147


- We do not (yet) know what neuromorphic technology is hiding under the bonnet (or in the trunk) of the Vision Iconic, the show car revealed at Shanghai Fashion Week on Tuesday. But since it is described as having Level 4 capability, it seems rather unlikely to me that it will be Akida, at least with regard to any safety-critical functions (cf. François Piednoël’s above comment).



Did I miss anything of importance? And please correct me should anyone spot any mistakes.
I could be wrong, but why would Akida need to be Level 4 certified? We are IP. Wouldn't the chip developed by MB (if we are in it) be required to achieve certification?

SC
 

Frangipani

Top 20
I could be wrong, but why would Akida need to be Level 4 certified? We are IP. Wouldn't the chip developed by MB (if we are in it) be required to achieve certification?

SC

Good question. I suppose you would need to ask François Piednoël directly whether he means there is an Akida-inherent problem, which would also concern the IP. (The BRN shareholder on LinkedIn was asking “Why isn’t Mercedes using Akida technology, for example?”, referring to Akida technology in general, not only to physical chips such as AKD1000 or AKD1500.)

Apart from that, since Athos Silicon has so far not signed any IP deal with us, we can’t be in Polaris anyway, the first mSoC silicon he was referring to in this recent video:






Athos Silicon: Multiple System-on-Chip for Safe Autonomy​

By Luke Forster
Published 1 October 2025






Building functional safety into compute for autonomy​

Athos Silicon, a spin-out from Mercedes-Benz, is addressing one of the most pressing challenges in autonomous systems: how to eliminate the single point of failure in compute architectures. Unlike traditional monolithic processors that can collapse if a single component fails, Athos Silicon’s Multiple System-on-Chip (mSoC) integrates redundancy directly into silicon.

The result is a functionally safe processor platform designed to meet ISO 26262 and other standards required for safety-critical applications.

Why safety-first design is essential​

Conventional computing platforms – with a CPU, GPU, and NPU working together – were never built for automotive safety. If a processor crashes or a transient error occurs, the entire system may fail. In a consumer PC this means a reboot; in a self-driving vehicle or industrial robot, it could mean disaster.
Athos Silicon has rethought this architecture from the ground up. By focusing on functional safety as a primary design constraint, its mSoC avoids the patchwork redundancy of external systems and instead bakes resilience into the hardware itself.

The mSoC architecture explained​

Athos Silicon’s mSoC integrates multiple chiplets into one package, each containing CPUs, controllers, and memory. Instead of a single supervisor chip that itself could fail, the mSoC operates through a voting mechanism — what Athos calls a “silicon democracy.”

Each chiplet executes tasks in parallel, and their outputs are compared in real time. If one diverges from the others, it is overruled and reset. This ensures continuous operation without interruption and prevents cascading system failures.

By embedding this redundancy, Athos Silicon enables high-reliability computing suitable for Level 3 and Level 4 autonomy while maintaining predictable performance.
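The “silicon democracy” described above is, in essence, classic N-modular redundancy with majority voting. A minimal sketch of the idea in generic Python (my own illustration; names and structure are hypothetical, not Athos Silicon’s actual design):

```python
from collections import Counter

def vote(lane_outputs):
    """Majority-vote over redundant compute lanes (chiplets).

    Returns the agreed output plus the indices of any diverging
    lanes, which a supervisor would overrule and reset.
    """
    counts = Counter(lane_outputs)
    winner, n = counts.most_common(1)[0]
    if n <= len(lane_outputs) // 2:
        # No majority: the fault cannot be masked, so fail safe instead.
        raise RuntimeError("no majority among lanes")
    diverging = [i for i, out in enumerate(lane_outputs) if out != winner]
    return winner, diverging

# Three chiplets compute the same command; lane 2 suffers a transient fault.
# vote([0x3F, 0x3F, 0x7A]) returns (0x3F, [2]): lane 2 is overruled and
# reset while the other two keep the system running without interruption.
```

With three lanes, any single fault is masked; identifying which lane diverged is what enables the reset-and-rejoin behaviour the article describes.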

Power efficiency for EVs and robotics​

Safety is not the only benefit. In electric vehicles, compute power directly affects range. Athos Silicon highlights that every 100 watts of compute load can reduce EV range by as much as 15 miles. By designing a chiplet system optimised for low-power efficiency, the mSoC reduces unnecessary energy consumption and makes autonomy more practical for battery-powered platforms.
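The range figure above can be sanity-checked with a back-of-the-envelope model: a constant electrical load P at average speed v adds P/v watt-hours to every mile driven. Using assumed figures (100 kWh pack, 250 Wh/mile; these are my own illustrative numbers, not from the article), a 100 W load costs only a couple of miles at highway speed but approaches the quoted 15 miles in slow stop-and-go driving:

```python
def range_loss_miles(batt_wh, wh_per_mile, speed_mph, extra_watts):
    """Range lost to a constant auxiliary electrical load: the load
    adds extra_watts / speed_mph Wh to every mile driven."""
    base_range = batt_wh / wh_per_mile
    loaded_range = batt_wh / (wh_per_mile + extra_watts / speed_mph)
    return base_range - loaded_range

# 100 kWh pack, 250 Wh/mile, constant 100 W compute load:
# at 60 mph the loss is roughly 2.6 miles; at 10 mph (stop-and-go) it is
# roughly 15 miles, which is where the "up to 15 miles per 100 W" figure lands.
```

The takeaway is that the headline number is a worst case dominated by slow driving, where the fixed compute load is spread over few miles.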

From Mercedes-Benz R&D to startup scale​

The technology behind Athos Silicon was incubated within Mercedes-Benz before the company was spun out to bring the platform to the wider market.
Its first silicon, codenamed Polaris, is designed to deliver Level 3 and Level 4 autonomous capability in a footprint comparable to current Level 2 hardware.
Working with chiplet-packaging partners, Athos Silicon has accelerated validation and plans to deliver silicon to early customers soon. With no competitors currently offering integrated voting redundancy in a chiplet-based compute platform, Athos Silicon is carving out a unique position in the AI ecosystem.

Applications beyond cars​

While autonomous driving is the most visible use case, Athos Silicon’s architecture also applies to Robotics, avionics, and even Medical devices where safety and reliability are paramount. Any system requiring certifiable, functionally safe compute stands to benefit.

By combining chiplet redundancy, real-time voting, and safety-first design, Athos Silicon’s Multiple System-on-Chip may prove to be the missing hardware foundation for truly certifiable autonomy.


This is what the Polaris mSoC will roughly look like sizewise (watch from around 10:50 min):



According to François Piednoël, “Project mSoC” as such started in 2020 (still under Mercedes-Benz North America R&D).

I’m not sure of the exact date the interview was recorded, but given that Athos Silicon has been around as a Mercedes-Benz spin-off since April 2025, and that in the video the company is said to be about four months old, it must have been recorded sometime between late July and early September.

So when François Piednoël says “In fact, there is silicon coming back shortly. By the end of the summer we’ll have the chiplets in hands” (from 9:06 min), this means they would have received them by now, if everything went according to plan. (“We think we are in good shape for a startup - getting your silicon after, you know, five six months (…) With no visible competition, by the way.”)

He also says they invented the IP.
 
- On New Year’s Eve 2023, @Pom down under spotted the CV of Arizona State University PhD student and former BrainChip ML intern Vishnu Prateek Kakaraparthi, which revealed that one of the things he had done between June 2023 and August 2023 while interning at BrainChip was that he “spearheaded the development of distracted driving technology, achieving energy and processing gains, positioning for potential project collaboration with Mercedes, and showcasing the capabilities of the AKD1000 in the automotive safety domain.”




Did I miss anything of importance? And please correct me should anyone spot any mistakes.
Yes you missed the part that I can’t remember anything I posted today let alone 2 years ago 😂😂
 

Diogenese

Top 20
Hi Frang

I haven't looked back at my original post but do recall it being very sequential and clear that Intel came before Akida with respect to Mercedes.

Your find of Gunjan Gupta could well flip the timeline and therefore change the narrative slightly. Ie. They tried akida, then Loihi.... But the narrative is still materially unchanged.

That narrative being: Mercedes have looked at both Loihi and Akida and as we know, utilised akida in the EQXX.

Correct me if I'm wrong but it seems to me we're talking about semantics here and agree on the overall outcome?
Hi SERA,

The Akida 1 evaluation boards started shipping in November 2020, the shared wafer:

BrainChip Ships Akida™ Evaluation Boards - BrainChip


This was the 1-bit version, which was upgraded to 4-bit in the production version which began production in April 2021:

BrainChip Begins Volume Production of Akida AI Processor - BrainChip

San Francisco, April 13, 2021 — BrainChip Holdings Ltd. (ASX: BRN), a leading provider of ultra-low power, high-performance AI technology, announced today that it has begun volume manufacturing of its Akida™ AKD1000 neuromorphic processor chip for edge AI devices.

Testing was completed in November 2021:

https://brainchip.com/brainchip-completes-testing-production-akida-chips/

BrainChip Completes Testing Production Version of the Akida Chip​

Latest iteration has been optimized for lower power consumption than the original engineering samples

Aliso Viejo, Calif. – 8 November 2021 – BrainChip Holdings Ltd (ASX: BRN), (OTCQX: BCHPY), a leading provider of ultra-low power, high-performance artificial intelligence technology and the world’s first commercial producer of neuromorphic AI chips, today confirmed that functionality and performance testing of the AKD1000 production chips has been completed, which showed better performance than the original engineering samples.

It's possible that some chips were rushed through testing for EAPs.

However, all that was pre-TENNs.

Given that EQXX development started about 18 months before 2022, say mid-2020, and came out in early 2022, MB could not have had a production version much before mid-2021. They could have had the simulation software, and may have had the evaluation version at the end of 2020. I don't have any info on a FPGA version, but the Akida Development Environment was released in July 2018 using software simulation.


https://brainchip.com/brainchip-unveils-the-akida-development-environment-brainchip-240718-01/

SAN FRANCISCO, July 24, 2018 (GLOBE NEWSWIRE) — BrainChip Holdings Ltd. (“BrainChip” or the “Company”) (ASX:BRN), the leading neuromorphic computing company, today announced the availability of the AkidaTM Development Environment. The Akida Development Environment is a machine learning framework for the creation, training, and testing of spiking neural networks (SNNs), supporting the development of systems for edge and enterprise products on the Company’s Akida Neuromorphic System-on-Chip (NSoC).
...
Akida Execution Engine
The Akida Execution Engine is at the center of the framework and contains a software simulation of the Akida neuron, synapses, and the multiple supported training methodologies. Easily accessed through API calls in a Python script, users can specify their neural network topologies, training method, and datasets for execution. Based on the structure of the Akida neuron, the execution engine supports multiple training methods, including unsupervised training and unsupervised training with a labelled final layer.


In 2018, BRN released an update for the Brainchip Studio software, the precursor for the Akida simulation software, the BRN Accelerator chip having been released in late 2017:

https://brainchip.com/new-release-o...rt-and-an-api-for-easier-systems-integration/

SAN FRANCISCO, April 04, 2018 (GLOBE NEWSWIRE) — BrainChip Holdings Ltd. (“BrainChip” or the “Company”) (ASX:BRN), a leading developer of software and hardware accelerated solutions for advanced artificial intelligence (AI) and machine learning applications, today announced an upgraded release of BrainChip Studio, version 2018.1. BrainChip Studio is an AI-powered video analysis software suite delivering high-speed object search and facial classification for law enforcement, counter terrorism and intelligence agencies. New features of the 2018.1 release make it easier to find objects from a variety of camera views, enable large-scale Linux deployments, and add an API that simplifies integration with other applications.

BrainChip Studio 2018.1 auto-generates rotated models. The software’s one-shot object training, a unique characteristic of spiking neural networks, creates a spiking neural network model of an object in its initial captured orientation. With the new auto-rotation feature, BrainChip Studio will automatically create multiple rotated models, improving the ability to locate the object in other camera views, where the orientation may vary depending on the installation.

It is also worth noting that Intel is part of the NAOMI4Radar consortium:

NAOMI4Radar project develops energy-efficient radar sensors - AEEmobility Naomi4Radar

NAOMI4Radar project develops energy-efficient radar sensors

2 December 2024

In the NAOMI4Radar project, a research team from the University of Lübeck led by Prof. Sebastian Otte is working together with Mercedes-Benz AG, TWT GmbH Science & Innovation, Intel Deutschland GmbH and the Technical University of Munich on an energy-efficient radar sensor system. The use of neuromorphic computing and spiking neural networks (SNNs) is intended to optimize battery life, shorten reaction times and increase safety. The project is being supported by the project sponsor TÜV Rheinland.
 

CHIPS

Regular
Didn't Tony Lewis say somewhere that something big is coming at the end of summer? Which summer, Australian or US?
Does somebody have a screenshot of this statement? I can't find it.
 

Frangipani

Top 20
Yes you missed the part that I can’t remember anything I posted today let alone 2 years ago 😂😂

Well, that probably explains the photo of that negative COVID test you posted back in late April, when you must have forgotten that the coloured band showing up in the control region C does not mean you have COVID, but merely tells you whether or not the antigen rapid test was done correctly! 😉
(Sorry, only saw that days later at the time - by that time it was too late to let you know…)

For it to be positive for SARS-CoV-2, it would have shown a second coloured band in the N section. (The brands I use here in Germany have the letter T instead, which stands for test region. A/B in your combo test stands for Influenza A/B; quite often, these combo tests also test for RSV.)

Then again, you can already have COVID symptoms and not yet have a positive test, so testing repeatedly when you have symptoms or know you’ve been in contact with someone who has COVID is always a good idea. Same goes for the flu.

Just thought it would be worthwhile posting for the benefit of everyone, now that the common cold / COVID / flu season in the Northern hemisphere is upon us…


 

CHIPS

Regular
Well, that probably explains the photo of that negative COVID test you posted back in late April, when you must have forgotten that the coloured band showing up in the control region C does not mean you have COVID, but merely tells you whether or not the antigen rapid test was done correctly! 😉
(Sorry, only saw that days later at the time - by that time it was too late to let you know…)

For it to be positive for SARS-CoV-2, it would have shown a second coloured band in the N section. (The brands I use here in Germany have the letter T instead, which stands for test region. A/B in your combo test stands for Influenza A/B; quite often, these combo tests also test for RSV.)

Then again, you can already have COVID symptoms and not yet have a positive test, so testing repeatedly when you have symptoms or know you’ve been in contact with someone who has COVID is always a good idea. Same goes for the flu.

Just thought it would be worthwhile posting for the benefit of everyone, now that the common cold / COVID / flu season in the Northern hemisphere is upon us…


View attachment 92178


View attachment 92180

It always helps to read the instructions 😂😂😂
 
Hiring: PhD position.

 
 