BRN Discussion Ongoing

Rskiff

Regular
A good rundown of where AI and the chips are going, including on the edge.
 
  • Like
  • Fire
Reactions: 7 users

manny100

Top 20
I pointed out in my earlier post that Onsor demonstrates the lengthy adoption cycle, as outlined on the Onsor website.
No doubt it's been frustrating for all holders, but unfortunately there are no shortcuts.
Directors.
"In practical terms, acting in a responsible and ethical manner towards stakeholders (such as customers, employees, community and the environment) is necessary for the promotion of the interests of the company and its sustainability, and builds long-term value for shareholders.
Of course, this does not mean that all stakeholders will be satisfied with every decision taken, given stakeholder interests will be diverse.
"
My bold above.
The BOD has to prioritise long-term value for holders over short-term value - it's the horse we are riding.
As the timeframes to revenue are lengthy (we are closing in on the future, though), the BOD has had to remain firmly focused on the long term. This is the reason for the LDA capital raisings - the alternative was, and is, going broke.
Another aspect is that case law is very firm that Directors' fiduciary duty is owed to the entity and not to individual holders.
The cornerstone case is Percival v Wright (1902) - a UK case, but it applies here as much as it did in 1902.
Briefly, a group of individual holders of Nixon's Navigation Co. approached the Directors to buy their shares, and after negotiations the Directors purchased them. Unbeknownst to the holders, during those negotiations the Directors were also negotiating to sell the entire company to a third party (Holden), which would have substantially increased the share value - though those negotiations subsequently fell through.
The shareholders later sued, alleging non-disclosure of the negotiations.
The shareholders lost their case on the basis that Directors' duties are owed to the entity and not to individual holders.
The case pretty much lets holders know where they stand - it's basically 'like it or lump it' when it comes to the BOD setting a course for the business.
 
  • Like
  • Wow
Reactions: 5 users

Rach2512

Regular

Screenshot_20251122_123204_Samsung Internet.jpg
 
  • Like
  • Haha
  • Love
Reactions: 15 users
As expected (and predicted), Markus Schäfer has posted one more time about neuromorphic computing in the context of the NAOMI4Radar project before stepping down as the Mercedes-Benz CTO at the end of this month:


7bab10c6-101b-4ec1-8d84-fd66a0238ad8-jpeg.93216
Isn’t this on Loihi 2?

 
  • Like
Reactions: 3 users

Fiendish

Regular
Would it be possible for car manufacturers to use both akida and loihi in a single vehicle?

Yes, it's not only possible but increasingly likely for a car manufacturer like Mercedes-Benz to integrate both the BrainChip Akida and Intel Loihi 2 neuromorphic processors into a single production vehicle, particularly for specialized edge AI tasks in advanced driver-assistance systems (ADAS), autonomous driving, and in-cabin monitoring.

This approach aligns with the automotive industry's trend toward heterogeneous computing architectures, where multiple specialized chips handle distinct workloads to optimize for power efficiency, latency, and performance—much like how modern vehicles already combine NVIDIA GPUs for high-level perception, Qualcomm SoCs for infotainment, and dedicated MCUs for sensors.

Why This Makes Sense for Different Purposes

Mercedes has demonstrated active use of both chips in separate projects, with complementary strengths that lend themselves to modular deployment:
Akida for Ultra-Low-Power In-Cabin and Edge Inference: Deployed in the 2022 Vision EQXX concept vehicle, the Akida processor excels at always-on, battery-sensitive tasks like keyword spotting ("Hey Mercedes"), voice authentication, and contextual in-cabin monitoring. It achieves 5–10x energy efficiency over traditional AI by processing spiking neural networks (SNNs) at microwatt-to-milliwatt levels, ideal for non-safety-critical, continuous operations that don't drain EV range. BrainChip positions Akida for broader sensor-edge integration, such as with partners like Prophesee for event-based vision.
Loihi 2 for Radar and Real-Time Safety Processing: Integrated into the ongoing NAOMI4Radar project (a German-funded initiative announced in 2024), Loihi 2 focuses on event-driven radar data optimization for forward-facing sensors in ADAS. It enables up to 90% energy reduction and microsecond-latency responses for detecting objects in poor visibility, mimicking brain-like spiking for faster, more adaptive autonomous driving decisions. This is geared toward safety-critical, high-reliability tasks requiring on-chip learning.

These aren't mutually exclusive; Mercedes' research collaborations (e.g., with Intel and the University of Waterloo) emphasize neuromorphic computing's scalability across vehicle subsystems. In a full vehicle stack:
Akida could handle distributed, low-power nodes near microphones and interior cameras.

Loihi 2 could sit closer to radar/LiDAR fusion modules for external perception.
Evidence from Industry Trends
Heterogeneous Integration in AVs: Autonomous vehicles routinely use multi-chip setups today—e.g., NVIDIA's DRIVE platforms combine Orin SoCs with dedicated accelerators for redundancy and specialization. Neuromorphic chips fit this model, with projections for the neuromorphic AV market reaching $59 billion by 2034 (20.7% CAGR), driven by sensor fusion and edge processing. Research from Infosys and IBM highlights neuromorphic modules embedded directly in sensors (e.g., one chip per radar unit), enabling parallel use without central bottlenecks.

Mercedes' Multi-Partner Approach: As of late 2025, Mercedes continues exploring Akida for voice/sensor AI while advancing Loihi in radar R&D, with no reported conflicts. Broader partnerships (e.g., BrainChip with Renesas/Valeo for automotive IP, Intel's Hala Point scaling Loihi) suggest OEMs like Mercedes will mix vendors to avoid single-supplier risks.

Precedents and Future Outlook: While no public confirmation exists for both in one Mercedes vehicle yet (the EQXX remains a concept), 2025 reports from AI News Hub and Global Market Insights note accelerating commercialization, with OEMs like Renault/Bosch adopting multiple neuromorphic types for fleet adaptability. By 2026–2027, as Level 3+ autonomy rolls out in Mercedes' EQS/EQV lines, combined deployment could become standard for 90%+ efficiency gains in data processing.

Challenges like software standardization (e.g., MetaTF for Akida vs. Lava for Loihi) and ASIL safety certification remain, but Mercedes' CTO has publicly endorsed neuromorphic for "breaking new ground" in architectures. If energy and latency demands intensify with EV adoption, this dual-chip strategy could extend range by 10–20% while enhancing safety.
 
  • Like
Reactions: 8 users

Guzzi62

Regular
Would it be possible for car manufacturers to use both akida and loihi in a single vehicle?

Nice ChatGPT response👍
 
  • Like
Reactions: 2 users

Fiendish

Regular
Someone's triggered 🤣

It's not Chatgpt

Cars are made up of thousands of different components. To think that because a car uses a chip from one company it automatically excludes the car from using tech from any other company is ludicrous.
 
  • Like
  • Haha
Reactions: 3 users

Guzzi62

Regular
Someone's triggered 🤣

It's not Chatgpt

Cars are made up of thousands of different components. To think that because a car uses a chip from one company it automatically excludes the car from using tech from any other company is ludicrous.
But I do agree with your ChatGPT response: the Akida chips are better for some tasks than Loihi's, so of course you would use different chips in the car if you want the best results. Cars have thousands of chips in them nowadays - around 3,000 in an electric car - so why not deploy some Akida in there?

Keep those ChatGPT answers coming.

Some interesting info about the Loihi research chip, as they call it. Note: research! If they plan on making them commercially, they are keeping a tight lid on it!

 
  • Like
Reactions: 4 users

Rach2512

Regular
  • Like
Reactions: 4 users

TECH

Regular
Would it be possible for car manufacturers to use both akida and loihi in a single vehicle?


A number of years ago - 3 or 4, from memory - we at BrainChip understood the difference when driving a car at night or in poor visibility (heavy rain, snow showers, isolated rural roads, etc.), to the point that Akida could distinguish between a dangerous projectile heading towards our windscreen and an innocent plastic bag. Years on, we now have TENNs, state-space models, and Akida 2.0 advancing to Akida 3.0 in late 2026 or early 2027 (my opinion). Mercedes-Benz remain a customer, and we complement their future vision. This isn't a one-size-fits-all solution and never has been; technology involves many players coming together and burying their own egos for the betterment of the customer's ultimate product. Anything short of that goal is an arrogant waste of time and of others' funds, and the guilty party will never be invited to the next board meeting to sell their technology.

Mercedes-Benz will tell BrainChip that they are no longer a customer when they decide. We are still in the game because we keep our mouths shut, can be trusted, and have advanced our technology since they first discovered, tested and validated Akida.

My personal views not the .........

Tech ❤️ Akida
 
Last edited:
  • Like
Reactions: 10 users

Frangipani

Top 20
Are we involved in NAOMI4Radar? Because I was reading an article that they have Loihi on board?


Hi @7für7,

No, we were not involved in the NAOMI4Radar project (1 June 2024 - 31 August 2025), and I never said so.

You evidently no longer have me on ignore (although you’ve repeatedly claimed I was on your ignore list), otherwise you wouldn’t have been able to read my post.
But at the same time, it appears you haven’t read any of my other recent posts on this topic, so let me give you a rundown.

I have been saying for ages that I do not believe that we will see the release of any Mercedes-Benz serial cars in the near future that already have neuromorphic technology integrated. And if you search for my username in combination with “Mercedes-Benz” via 🔎 , you will find plenty of supporting evidence in favour of my argument.

In early September, a few BRN shareholders were speculating that Akida may have been integrated into the then-about-to-be-released all-new electric Mercedes-Benz GLC. So I reminded everyone that MB had made it abundantly clear that this was not gonna happen, given that in January they had literally stated on LinkedIn: “…we’re currently looking into neuromorphic computing as part of a research project. Depending on the further development progress, integration could become possible within a timeframe of 5 to 10 years”. Something that I had also pointed out back in January, when a few forum members got overly excited about the then-upcoming CLA release and were speculating about an Akida reveal with regard to the first Mercedes-Benz vehicle to run entirely on MB.OS.
(https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-449452)


So on 3 September, in the context of speculation about the GLC release, I wrote that the next thing I personally expect to hear from MB with regard to neuromorphic computing will instead be a LinkedIn post by Markus Schäfer sharing with his network the results of the NAOMI4Radar project that evaluated Loihi 2 for radar data processing.

Well, it turns out I wasn’t entirely spot-on, as on 14 October, the reveal of the show car Vision Iconic dropped (“Infused with the latest innovations including Solar Paint, Level 4 automated driving, steer-by-wire, and neuromorphic computing, Vision Iconic perfectly blends beauty and technology, setting new standards for the electric and digital age.”), but a fleeting mention of “neuromorphic computing” was all MB would reveal, nothing specific, such as which event-based sensor and/or neuromorphic processor, what use case etc.

On 26 September, I predicted we would hear from Markus Schäfer about the NAOMI4Radar results by no later than 30 November (as it had become public that he would step down as CTO), which - as you can see - proved to be the case.

I also drew attention to a promotional video that got commissioned in the context of NAOMI4Radar - it was “loved” on LinkedIn by project lead Gerrit Ecke, who has since left MB to join German defense tech start-up Project Q.
This video, however, has not yet been publicly shared by any of the consortium partners.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-475232

1C15782A-A936-4A04-AFDD-D4B7860CD5A5.jpeg


82A0EE97-5ABA-4130-ADEC-8A7E94FEFEBF.jpeg



Shortly after, Mercedes-Benz themselves clarified that the new GLC does not have Akida inside:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-475563


2F3EC72D-062D-4A5A-BBC5-8FC6BA71DBBB.jpeg


419674A8-2A12-49C5-8F87-850619B33767.jpeg


C1481924-D695-4D9A-BB4F-0083D7FA581D.jpeg


Last week, I also shared the link to the official NAOMI4Radar “Schlussberichte” that consortium partners Mercedes-Benz, TWT GmbH Science & Innovation and Uni Lübeck had submitted to the German Federal Ministry they had received public funding from:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-478042

51426F02-1C79-41F8-A263-A8F794DADB6A.jpeg



So as you can see, I am very well aware that the NAOMI4Radar project did not involve Akida.
 
  • Like
  • Wow
  • Fire
Reactions: 9 users

Flenton

Regular
I still remember when LdN said something along the lines of we've found our sweet spot with LIDAR. Seems so long ago now. Back then I definitely thought in 2025 we'd be in a better position than we are. The future still seems very positive but it's taking its bloody time.
 
  • Like
  • Love
Reactions: 12 users

Fiendish

Regular
It's funny reading these forums. Five years ago we didn't even know if this tech worked. Now it's deployed in space, in a kick-ass Mercedes concept car, and being utilized by US defense contractors.

From my very basic understanding of technology, that's the first place you find it being used before it ends up in your refrigerator.

Things seem to be moving along just fine. Lock them away in your super and relax. Let it breathe.
 
  • Like
  • Love
  • Haha
Reactions: 12 users

Diogenese

Top 20
This is impressive:

This new technology allows drones to move heavy loads in a perfectly coordinated manner​


Scientists at Delft University of Technology (TU Delft) in the Netherlands have developed a new algorithm that allows multiple autonomous drones to work together to lift and transport heavy and bulky loads. The system seeks to overcome the capacity limitations of individual drones and is considered ideal for maintaining hard-to-reach infrastructure, such as offshore wind turbines, even in windy conditions.


https://www.msn.com/en-au/news/othe...S&cvid=6921c9e9ed39467395da5401d37eb2d1&ei=33
 
  • Like
  • Love
  • Fire
Reactions: 12 users

Frangipani

Top 20

Pasteur’s Magic Quadrant in AI: The Fusion of Fundamental Research and Practical Impact


Author: Tony Lewis, CTO BrainChip, Inc.
Tonys-Blog-Graphic.png

In the world of scientific research, few figures embody the fusion of theory and application like Louis Pasteur. His legacy inspired Pasteur’s Quadrant, a framework Donald E. Stokes defined in his 1997 book Pasteur’s Quadrant: Basic Science and Technological Innovation. Stokes showed that the most transformative discoveries arise when research pursues both fundamental understanding and practical impact.

Some research is pursued for pure mental enjoyment or curiosity. Group theory was created in a single night in 1832 by 20-year-old Évariste Galois, scribbled down in desperation before he faced a fatal duel over a romantic rival. Nearly two centuries later, it powers the cryptographic algorithms securing global commerce.

In business, research must generate outcomes. But when we fuse deep theory with those outcomes, the impact becomes outsized.

No one illustrates this better than Pasteur. His drive to understand microorganisms produced the germ theory of disease and, with it, vaccines and pasteurization. These discoveries turned invisible microbes into the root cause of disease, birthing modern medicine and hygiene. They have saved hundreds of millions of lives and remain the foundation of global health systems today. This is the power of theory and application working in unison.

At BrainChip, this philosophy lives in our Research and Development team. We are driven by curiosity. We are guided by purpose.

Our neuromorphic work begins with the brain’s sparse computation: only about 1% of neurons in the brain are active at any time. BrainChip’s Temporal Event-based Neural Networks (TENNs) introduce a second biological principle: system state. State is a compact representation of everything the network has seen since activation. It must be useful for tasks, easy to compute, and simple to maintain. Traditional AI largely ignores it. Our scientists solved the puzzle of how to make state work, and it yields enormous benefits.

These two ideas, sparsity and state, form the theoretical foundation of everything we do at BrainChip.
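The two principles can be caricatured in a few lines of code. The sketch below is purely illustrative (a leaky integrate-and-fire style update, with invented names and parameters), not BrainChip's actual TENN implementation: each unit carries a compact state that integrates its input history, and only the few units crossing a threshold emit events, mimicking the brain's roughly 1% activity.

```python
def sparse_state_step(state, inputs, decay=0.9, threshold=1.0):
    """Update state from inputs; return the new state and the sparse event list."""
    new_state = []
    events = []  # indices of units that "fired" this step
    for i, (s, x) in enumerate(zip(state, inputs)):
        s = decay * s + x          # state compactly integrates everything seen so far
        if s >= threshold:         # only the few active units emit an event
            events.append(i)
            s = 0.0                # reset after firing
        new_state.append(s)
    return new_state, events

state = [0.0] * 8
state, events = sparse_state_step(state, [0.2, 1.5, 0.0, 0.1, 2.0, 0.0, 0.0, 0.3])
# units 1 and 4 crossed the threshold; the rest carry decayed state forward
```

Downstream computation then touches only the (few) event indices rather than all units, which is where the energy savings of sparsity come from.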

We didn’t wait two centuries for theory to pay off. Below, I share the breakthroughs our team delivered in the summer of 2025. I hope you find them as exciting as I do.

Turning Theory into Tangible Progress​


During the summer of 2025, our Research and Development team achieved several milestones that advanced both the science and business of edge AI. Working at the intersection of hardware and algorithms, we’ve made strides that will shape the future of BrainChip’s Akida neuromorphic platforms and our customers’ success across industries.

Radar and Electronic Warfare: Real-World Defense Applications​


1.-Banner-1-thumbnail.png


We developed and deployed advanced machine learning models for radar and electronic warfare. One key achievement was denoising Synthetic Aperture Radar (SAR) images using Akida 1500 and 2.x hardware. We discovered a method that dramatically reduces the power and latency of the system by orders of magnitude.

We also applied our newest TENN algorithms to classify radar signals, achieving state of the art accuracy, again with orders of magnitude less computation. This algorithm will run on our newest generation of hardware now in development.

Our innovations in radar could help bring complex sensors to lightweight aircraft. They might form part of systems that provide early warning and can confuse hostile actors.

Generative AI at the Edge: Efficiency Meets Creativity​


We’re also bringing the power of Generative AI closer to where data lives. Through innovative quantization techniques, our team achieved near-identical model performance at 4-bit levels versus 32-bit floating point, reducing memory and bandwidth by a remarkable factor of eight. This compression technique is patent pending, and support for it will be baked into new Akida hardware.

Our novel technique uses powerful optimization and a quantizing scheme for each model. These breakthroughs make it possible to run large language models (LLMs), speech recognition, and text-to-speech on low-power devices without sacrificing quality.
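For intuition on where the 8x figure comes from, here is a toy symmetric 4-bit quantizer. This is a generic textbook scheme (one per-tensor scale, integers in [-8, 7], so 4 bits per weight instead of 32), not the patent-pending per-model technique described above, which has not been published:

```python
def quantize4(weights):
    """Map floats to 4-bit signed integers with a single shared scale."""
    scale = max(abs(w) for w in weights) / 7.0 or 1.0
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize4(q, scale):
    """Recover approximate floats from the 4-bit codes."""
    return [v * scale for v in q]

w = [0.31, -0.92, 0.05, 0.77]
q, s = quantize4(w)
w_hat = dequantize4(q, s)  # close to w, at one eighth the storage
```

The rounding error per weight is bounded by half the scale; the harder research problem, which per-model optimized schemes address, is keeping that error from compounding across the layers of a large model.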

We were also successful in implementing our PreCog algorithm which dramatically speeds up retrieval in Retrieval Augmented Generation (RAG) systems, a ubiquitous method for reducing hallucinations and improving accuracy in LLMs.

Transformers vs. State-Space Models: Shaping the Next Generation​


We built a comprehensive simulation framework comparing Transformer and State-Space Model (SSM) architectures, providing critical insights into memory, bandwidth, and power trade-offs. These results directly inform future hardware co-design, paving the way for high-performance, low-energy accelerators that bring LLMs and advanced audio processing to the edge.

Anomaly Detection and Health Monitoring: Always-On Intelligence​


6D9767B2-B683-4720-BF52-2D96F5F95907.jpeg



Our work with SSMs also opened new frontiers in time-series anomaly detection for industrial IoT and healthcare. Compact, efficient models can now detect subtle mechanical faults or cardiac irregularities in real time, delivering actionable insights where and when they matter most. Our algorithms will run on Akida Pico, a sub-milliwatt computing device.
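For a flavour of what compact streaming anomaly detection looks like, here is a minimal residual-threshold detector built on a one-pole smoother. This is a generic baseline technique for illustration, not BrainChip's SSM models:

```python
def detect_anomalies(samples, alpha=0.2, threshold=3.0):
    """Flag indices whose deviation from the running estimate exceeds a threshold."""
    estimate = samples[0]
    anomalies = []
    for t, x in enumerate(samples):
        residual = abs(x - estimate)
        if residual > threshold:
            anomalies.append(t)  # e.g. a sudden vibration spike or arrhythmic beat
        estimate = (1 - alpha) * estimate + alpha * x  # compact one-number "state"
    return anomalies

signal = [10.0, 10.2, 9.9, 10.1, 18.5, 10.0, 10.1]
detect_anomalies(signal)  # flags index 4, the spike
```

The detector keeps only a single running estimate per channel, which is the kind of constant, tiny memory footprint that makes always-on monitoring feasible on sub-milliwatt hardware.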

Investing in the Future: People and Potential​


Innovation at BrainChip is powered by people. Our internship program embodies the spirit of Pasteur’s Quadrant, bridging theory and practice. This summer, interns contributed breakthroughs in fall detection, large-image segmentation, RF signal classification, and power estimation tools, strengthening our foundation for applied neuromorphic research.

For future employees, this is your invitation: join a team where fundamental science meets real-world impact, and where ideas evolve into technologies that reshape industries. For customers and investors, it’s proof that our innovation is not just theoretical; it’s operational, delivering measurable results today while paving the path for tomorrow’s breakthroughs.

Looking Ahead​


At BrainChip, we live in Pasteur’s Quadrant where science meets purpose and innovation meets impact.

As hardware and algorithms continue to co-evolve, BrainChip’s Applied Research team remains committed to advancing low-power AI that learns and acts like the brain itself. From defense to healthcare, from the factory floor to autonomous systems, we’re building technologies that make intelligence ubiquitous, efficient, and accessible, one breakthrough at a time.

Author:
Dr. Tony Lewis is the Chief Technology Officer at BrainChip, specializing in brain-inspired AI and robotics. Tony was Global Head of the Artificial Intelligence and Emerging Compute Lab at HP, Inc. At Qualcomm, Inc., Tony headed Neuromorphic, Deep Learning and Robotics efforts. Tony has held faculty or leadership positions at UCLA, UIUC, and the University of Arizona. Tony holds a Doctor of Philosophy degree in Electrical Engineering from the University of Southern California and a Bachelor of Science in Cybernetics/Applied Math from UCLA.
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 23 users

Frangipani

Top 20
Bremen is the place to be this week when you want to engage with customers in the space tech business…


View attachment 93140



6D3367DA-D67A-4C00-B34F-C5A64268FECD.jpeg
 
  • Like
  • Love
  • Fire
Reactions: 15 users

7für7

Top 20
  • Like
  • Fire
  • Love
Reactions: 4 users

IloveLamp

Top 20

1000014045.jpg
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 17 users

7für7

Top 20
This video, however, has not yet been publicly shared by any of the consortium partners.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-475232

View attachment 93250

View attachment 93251


Shortly after, Mercedes-Benz themselves clarified that the new GLC does not have Akida inside:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-475563


View attachment 93253

View attachment 93254

View attachment 93255

Last week, I also shared the link to the official NAOMI4Radar “Schlussberichte” that consortium partners Mercedes-Benz, TWT GmbH Science & Innovation and Uni Lübeck had submitted to the German Federal Ministry they had received public funding from:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-478042

View attachment 93252


So as you can see, I am very well aware that the NAOMI4Radar project did not involve Akida.
You’re still on my ignore list. Sometimes there are responses where you’re tagged… and there is also a “show ignored content” tool.
Thank you for the detailed explanation, which I did not read… (your posts are always too long and you repeat yourself way too much…), but my post was more metaphorical… I know full well that there is no Akida involved…

But since we’re here in the BrainChip forum and you only wrote…

“As expected (and predicted), Markus Schäfer has posted one more time about neuromorphic computing in the context of the NAOMI4Radar project before stepping down as the Mercedes-Benz CTO at the end of this month:”

it gives the impression that this is about Akida.
Otherwise you’re usually very detailed, but in this case you were very vague.

However… have a nice Sunday…
 
Last edited:
  • Like
Reactions: 1 users

Fiendish

Regular
You’re still on my ignore list. Sometimes there are responses where you’re tagged… and there is also a “show ignored content” tool.
Thank you for the detailed explanation, which I did not read… (your posts are always too long and you repeat yourself way too much…), but my post was more metaphorical… I know full well that there is no Akida involved…

But since we’re here in the BrainChip forum and you only wrote…

“As expected (and predicted), Markus Schäfer has posted one more time about neuromorphic computing in the context of the NAOMI4Radar project before stepping down as the Mercedes-Benz CTO at the end of this month:”

it gives the impression that this is about Akida.
Otherwise you’re usually very detailed, but in this case you were very vague.

However… have a nice Sunday…
A simple and novel use for Akida just occurred to me as i was reading your post.

In the English language there are many examples of words which pertain to a common subject and are often mixed up due to their very similar pronunciation and spelling.

A good example would be "sewage" and "sewerage". Both look and sound similar and are often mixed up: sewage is the waste water from homes and businesses, including urine and feces etc., while sewerage is the pipes and infrastructure it travels through to reach the treatment plant.

Kind of like your username 7für7 and 7for7: you are the 'sewerage' and your posts are 'sewage', and people often get you and your posts mixed up because both you and your posts are 'shit', or 'Scheiße'.

Happy Sunday, shitmouth!
 
  • Haha
  • Thinking
  • Like
Reactions: 6 users
Top Bottom