BRN Discussion Ongoing

Flenton

Regular
View attachment 91969

View attachment 91970


Wow, BrainChip’s Akida is featured in the new Mercedes show car! I’m not entirely sure whether the Level 4 automated driving system with LiDAR is powered by Akida, but it seems highly likely; otherwise, BrainChip probably wouldn’t have suddenly published that LiDAR point cloud.
Consider that Mercedes stated a few years ago that they used Akida technology, that Akida is (I think, and happy to be corrected) the only commercially available technology that can do this type of compute, and that many years ago the CEO at the time, Louis DiNardo, made a comment along the lines of 'we've found our sweet spot with LiDAR'. Taken together, these make a fairly good argument for why Akida might be being used.
 
  • Like
  • Fire
  • Love
Reactions: 14 users

FJ-215

Regular
Important to remember that we are not alone!!!


Mercedes taps Intel Loihi2 for neuromorphic AI

Mercedes is leading a project in Germany to use neuromorphic computing to improve the performance of forward-facing automotive radar systems.​

Mercedes is using the Intel Loihi 2 neuromorphic event-driven AI processor in the Naomi4radar project, says Markus Schäfer, Member of the Board of Management of Mercedes-Benz Group and Chief Technology Officer for Development & Procurement. Using neuromorphic, event driven algorithms and processors can increase the speed of response of the radar systems, he says.
 
  • Like
  • Sad
Reactions: 8 users

Tothemoon24

Top 20
IMG_1619.jpeg




🌌 Today at EDHPC 2025, Evgenios Tsigkanos presented our approach to Satellite-as-a-Service, orchestrating ML workloads across various edge hardware platforms like a cloud cluster. 🚀
Our demo highlighted applications such as vessel, cloud, flood, and fire detection. We harnessed the power of different hardware accelerators, including Xilinx Kria DPU, BrainChip Akida, Coral TPU, and Myriad 2, for representative on-board inference and performance. Our in-house AI models, along with compatibility for third-party models, enable on-board inference, reducing latency for quicker decisions, conserving bandwidth by transmitting processed data, and ensuring operation through communication gaps.
We extend our gratitude to our industry and academic collaborators, as well as ESA, whose efforts helped us navigate the hardware landscape.

📅 Join us at our next presentation on Thursday at 14:40 in the Conference Room, where Evgenios Tsigkanos will present our work on "Synthetic Data Generators for Enhanced Space-based Network Traffic Modeling." We look forward to exchanging insights with…
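The “orchestrating ML workloads across various edge hardware platforms like a cloud cluster” idea in the post above can be sketched as a toy scheduler that routes each detection task to an idle on-board accelerator advertising that capability. This is a minimal illustration of the concept only; the device names and capability tags below are my own guesses, not OHB Hellas’ actual system.

```python
# Toy "satellite as a cloud cluster" scheduler: route each inference task
# to the first idle on-board accelerator that supports it.
# Device names and capability sets are illustrative, not the real platform.

ACCELERATORS = {
    "kria_dpu":  {"supports": {"vessel", "cloud"}, "busy": False},
    "akida":     {"supports": {"fire", "vessel"},  "busy": False},
    "coral_tpu": {"supports": {"cloud", "flood"},  "busy": False},
    "myriad2":   {"supports": {"fire", "flood"},   "busy": False},
}

def dispatch(task):
    """Return the first idle accelerator that can run `task`, or None."""
    for name, dev in ACCELERATORS.items():
        if task in dev["supports"] and not dev["busy"]:
            dev["busy"] = True
            return name
    return None

if __name__ == "__main__":
    # Each task lands on a distinct idle device as the cluster fills up.
    for t in ["vessel", "fire", "cloud", "flood"]:
        print(t, "->", dispatch(t))
```

In a real system the scheduler would also weigh power budgets, model formats per accelerator, and downlink windows; the point here is only the cluster-style dispatch.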


IMG_1618.jpeg
 
  • Like
  • Fire
Reactions: 9 users

Frangipani

Top 20
The revised GIASaaS (Global Instant Satellite as a Service, formerly Greek Infrastructure for Satellite as a Service) concept website by OHB Hellas is finally back online - and Akida now no longer shows up as UAV1 (see my post above) but as one of three SATs! 🛰

AKIDA, the envisioned SAT with the AKD1000 PCIe Card on board, is slated to handle both Object Detection and Satellite Detection, while the planned KRIA and CORAL satellites (equipped with a Xilinx KRIA KV260 Vision AI Starter Kit and a Google Coral TPU Dev Board, respectively) are tasked with Vessel Detection, Cloud Detection and Fire Detection (for some reason OHB Hellas removed Flood Detection from the list of applications).

Please note that this is currently a concept website only.

The idea behind GIASaaS as well as mock 3D-printed versions of the yet-to-be-launched satellites were presented at DEFEA 2025 - Defense Exhibition Athens earlier this month, as @Stable Genius had spotted on LinkedIn:







View attachment 84868 View attachment 84869 View attachment 84870

View attachment 84867

Evgenios Tsigkanos from OHB Hellas presented GIASaaS 👆🏻 at EDHPC 2025 today:



3AC418CF-31B1-4B50-A238-EED96A3CD376.jpeg



Gilles Bézard and Alf Kuchenbuch were also going to refer to GIASaaS in their two-part tutorial at EDHPC 2025 yesterday:

Here is more info on the two-part tutorial Gilles Bézard and Alf Kuchenbuch will be presenting on 13 October at EDHPC25 in Elche, Spain:

[All the weird extra “o”s under “Use cases for Neuromorphic Processing in space” are evidently bullet points gone rogue in the website layout… When I copy and paste that passage from the website below, the combination “space:” followed by “o” shows up as “space😱” here on TSE 😂]



Neuromorphic AI in Space Tutorials (by BrainChip)

Introduction

In space applications, every milliwatt matters. Satellites rely on ultra-low-power chips to process data on-board, where sending everything back to Earth is often impossible or inefficient. This makes efficient machine learning deployment on embedded hardware not just useful, but essential.

The goal of the tutorials is to show how advanced AI capabilities can be packed into tiny, power-constrained devices, enabling smarter, faster, and more autonomous satellite systems.

Audience

Project managers (particularly part 1) and engineers working on embedded low-power AI applications.
Level: from beginner to expert.


What will you learn?

In part 1: Akida in Space – Bringing Autonomy, Robustness and Efficient Data Transmission to Space Vehicles, you will learn:

  • Why AI in Space?
  • Why neuromorphic AI in Space?
  • BrainChip Akida IP is the only Event-based AI on the commercial market – IP and silicon
  • Use cases for Neuromorphic Processing in space:
      ◦ Lunar landing
      ◦ Docking in space
      ◦ Earth observation
      ◦ Space Situational Awareness
      ◦ Satellite detection (use case OHB GIASaaS)

In part 2: Bringing AI to the Edge – End-to-End Machine Learning Deployment on an Embedded Low-Power AI Neuromorphic HW, you will learn:

  • Fetch and prepare a dataset suitable for your target application.
  • Design and train a machine learning model using TF/Keras.
  • Apply quantization techniques to reduce model size and optimize it for embedded hardware.
  • Convert the trained model into a format compatible with Akida hardware toolchain.
  • Export the model as a C source file that can be included and compiled together with your C main application.
  • Integrate the Akida model binary into a baremetal C application running on an STM32 microcontroller.
  • Run inference on an Akida ultra-low-power hardware device, demonstrating efficient on-board processing for space-constrained environments such as satellites.
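The deployment pipeline above can be sketched in miniature. Since BrainChip’s actual MetaTF conversion tools and the STM32 integration are hardware-specific, the stdlib-only Python below illustrates just the quantization step: uniform int8 quantization of a float weight tensor, the basic arithmetic behind shrinking a model for embedded targets. Function names here are illustrative, not the real toolchain API.

```python
# Minimal sketch of post-training uniform quantization (the "apply
# quantization techniques" step of the tutorial pipeline). Real Akida
# deployment would use BrainChip's tooling; this only shows the arithmetic.

def quantize_int8(weights):
    """Map float weights to int8 codes in [-127, 127] with a per-tensor scale."""
    max_abs = max(abs(w) for w in weights) or 1.0
    scale = max_abs / 127.0
    q = [max(-127, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 representation."""
    return [v * scale for v in q]

if __name__ == "__main__":
    w = [0.12, -0.5, 0.33, 0.99, -0.77]
    q, s = quantize_int8(w)
    w_hat = dequantize(q, s)
    max_err = max(abs(a - b) for a, b in zip(w, w_hat))
    print(q)        # int8 codes: 4x smaller than float32 weights
    print(max_err)  # reconstruction error stays below one quantization step
```

Each weight now needs one byte instead of four, at the cost of a bounded rounding error, which is the trade-off the tutorial's quantization step manages.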

Outline of hands-on tutorial
In the first tutorial, we teach why there is a paradigm shift in Space from purely deterministic classic programming to the use of low power AI in specific use cases. We show the vast improvement in capabilities coming with the use of AI in Space.

In the second tutorial, we demonstrate how to take an ML model from dataset preparation all the way to baremetal deployment on a microcontroller with a hardware accelerator. By the end, you will know how to take an ML project from concept to fully optimized embedded deployment, step by step.

Speakers:
  • Gilles Bézard (BrainChip) – Part 1 & 2
  • Alf Kuchenbuch (BrainChip) – Part 1




As you can see, Gilles Bézard and Alf Kuchenbuch will also be referring to the GIASaaS (Global Instant Satellite as a Service) concept by OHB Hellas, which I first posted about a year ago yesterday…

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-438109

… and gave an update on in May:



To the best of my knowledge, this is the first time BrainChip will be making a reference to GIASaaS.
 
  • Like
  • Love
Reactions: 7 users

Terroni2105

Founding Member
View attachment 91961
View attachment 91962


Learning
Thanks Learning.
Hopefully no idiot shareholder will go on there and leave an embarrassing comment about BrainChip.
 
  • Like
  • Fire
  • Love
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Love
  • Fire
  • Like
Reactions: 4 users

Diogenese

Top 20
Important to remember that we are not alone!!!


Mercedes taps Intel Loihi2 for neuromorphic AI

Mercedes is leading a project in Germany to use neuromorphic computing to improve the performance of forward-facing automotive radar systems.​

Mercedes is using the Intel Loihi 2 neuromorphic event-driven AI processor in the Naomi4radar project, says Markus Schäfer, Member of the Board of Management of Mercedes-Benz Group and Chief Technology Officer for Development & Procurement. Using neuromorphic, event driven algorithms and processors can increase the speed of response of the radar systems, he says.

Hi FJ,

The Mercedes Loihi 2 news is from 2024, and I think we've seen it before, so hopefully it won't cause too much despair.

Mercedes Startup Autobahn member Zendar has a 2025 project to provide radar with lidar precision, one advantage being that radar is less affected by rain.
https://expo2025.pnptc.events/airta...25-project_teaser-mercedes_benz-zendar-v1.pdf




MERCEDES-BENZ X ZENDAR

Lidar-level Precision through Software-defined Radar for ADAS

Zendar could enable Mercedes-Benz to develop enhanced automated driving features and expand its Operational Design Domain (ODD) with a scalable, software-defined radar solution.



Espacenet – search results

This patent sounds a bit like some sort of version of micro-Doppler:

WO2020222948A1 SYSTEMS AND METHODS FOR COMBINING RADAR DATA 20190430

1760447927860.png


[0074] … The processor 140 may be configured to aggregate and process the second sets of radar signals received respectively from the plurality of radar modules, by

(a) coherently combining the second sets of radar signals,

(b) computing a property of a target, or

(c) generating an occupancy grid or a radar image, using at least

(i) phase information associated with the second sets of radar signals and

(ii) timestamp information associated with the second sets of radar signals.

In some cases, the processor 140 may be operatively coupled to and in communication with the frequency generator 110 and the timing module 120. In such cases, the processor 140 may be configured to provide feedback data to the frequency generator 110 and the timing module 120. The frequency generator 110 and the timing module 120 may be configured to receive feedback data from the processor 140. The feedback data may comprise one or more signals derived in part from the second sets of radar signals received by the processor 140 from the plurality of radar modules. The frequency generator 110 and the timing module 120 may be configured to use the feedback data from the processor 140 to adjust, correct, and/or modify the reference frequency signal, the shared clock signal, and/or a timing signal of the plurality of timing signals.
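The core idea in (a), coherently combining radar signals from multiple modules using shared phase and timing references, can be illustrated with a toy phasor model (my own sketch, not Zendar's implementation): when each module's known phase offset is removed before summation, the echoes add in amplitude; without that correction they partially cancel.

```python
# Toy model of coherent vs. incoherent combination of radar returns.
# Illustrative only - not Zendar's patented method.

import cmath

def combine(returns, phase_offsets, coherent=True):
    """Sum complex radar returns from several modules.

    If coherent, each module's known phase offset (from the shared clock/
    frequency reference) is removed first, so the echoes add in amplitude.
    """
    total = 0j
    for r, phi in zip(returns, phase_offsets):
        total += r * cmath.exp(-1j * phi) if coherent else r
    return abs(total)

if __name__ == "__main__":
    offsets = [0.0, 1.3, 2.6, 0.7]                    # per-module phase errors
    echoes = [cmath.exp(1j * p) for p in offsets]     # unit-amplitude echoes
    print(combine(echoes, offsets, coherent=True))    # full array gain (~4.0)
    print(combine(echoes, offsets, coherent=False))   # smaller: misaligned phases
```

This is why the patent stresses the shared frequency generator and timing module: without a common phase/time reference, the per-module offsets are unknown and the array gain is lost.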


US12360202B1 System for centralized processing of loosely synchronized radars 20240112

1760446083812.png


[0003] Embodiments of the present disclosure are generally directed to systems and methods that determine a three-dimensional (3D) scene representation of an environment by processing incoming radar signals based on a model fitting approach.

[0004] Recognized herein are various limitations with radar systems currently available. In order to take advantage of radar, vehicles may be equipped with multiple radar and other type sensors to detect obstacles and objects in the surrounding environment. However, the multiple radar sensors in current radar systems may typically process data independently of one another. Provided herein are systems and methods for processing and combining radar data as well as data received from other sensors (e.g., imaging, LIDAR, and the like). The performance and robustness of radar systems may be improved by combining data from multiple sensors or modules prior to the perception, detection, or classification of objects or obstacles in a surrounding environment. Further, the radar systems disclosed herein may be configured to resolve computational ambiguities involved with processing and coherently combining radar data from multiple radar sensors or modules, in order to identify nearby objects or obstacles and generate one or more local maps of the surrounding environment.

Neuromorphic processors and ML are things they've heard of but don't seem to know much about.

[0157] As described above, machine learning algorithms are employed herein to build a model to determine a set of output predictions for the environment. Examples of machine learning algorithms may include a support vector machine (SVM), a naïve Bayes classification, a random forest, a neural network, deep learning, or other supervised learning algorithm or unsupervised learning algorithm for classification and regression. The machine learning algorithms may be trained using one or more training datasets. For example, previously received contextual data may be employed to train various algorithms. Moreover, as described above, these algorithms can be continuously trained/retrained using real-time user data as it is received. In some embodiments, the machine learning algorithm employs regression modeling where relationships between variables are determined and weighted. In some embodiments, the machine learning algorithms employ regression modeling, wherein relationships between predictor variables and dependent variables are determined and weighted.

It looks like Akida could help Zendar, but maybe Mercedes will push them towards Loihi 2 or Eliasmith.
 
  • Sad
  • Like
Reactions: 2 users

Frangipani

Top 20
Important to remember that we are not alone!!!


Mercedes taps Intel Loihi2 for neuromorphic AI

Mercedes is leading a project in Germany to use neuromorphic computing to improve the performance of forward-facing automotive radar systems.​

Mercedes is using the Intel Loihi 2 neuromorphic event-driven AI processor in the Naomi4radar project, says Markus Schäfer, Member of the Board of Management of Mercedes-Benz Group and Chief Technology Officer for Development & Procurement. Using neuromorphic, event driven algorithms and processors can increase the speed of response of the radar systems, he says.

Totally agree, @FJ-215!

I personally think it’s highly likely that the reference to “neuromorphic” in the description of the Mercedes-Benz Vision Iconic show car revealed today relates to the NAOMI4Radar project that was wrapped up in August.

According to project partner TWT GmbH Science & Innovation, the goal of NAOMI4Radar was “to optimize radar data processing in autonomous vehicles using #NeuromorphicComputing” and “Within the approximately one-year project duration, we aim to demonstrate the industrial applicability of the neuromorphic chip #Loihi2.”

cf. https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-473794


By the way: The NAOMI4Radar project was not fully funded by the German government. What was formerly known as Germany’s Federal Ministry for Economic Affairs and Climate Action apparently funded 56% of the 900,000 € project costs, so its total contribution must have amounted to 504,000 €, then. The document below does not state how much was paid to each project partner and when, but we know that Mercedes-Benz received between 20,000 and 30,000 € of this payment for the calendar year 2024 (see my link above).



133E6D7B-C9F8-4C63-BFBC-14C33468CA8C.jpeg




I recently posted that I was expecting Markus Schäfer to share the results of the NAOMI4Radar project before stepping down as CTO in about six weeks from now and that Mercedes-Benz even had a promotional video commissioned:

Can’t be long now, given that Markus Schäfer has less than 10 weeks left as CTO before passing the baton on to his successor Jörg Burzer.

Turns out Mercedes-Benz even had a promotional video commissioned to wrap up the NAOMI4Radar project, starring the S-Class test vehicle that combined an Infineon front radar with Loihi 2.

Project lead (and supporting actor) Gerrit Ecke from MB’s neuromorphic research team and the interdisciplinary innovation department “PIONEERING NeXt” loves the production company’s recent post about their day out filming on the test track:


View attachment 91531


View attachment 91533

So stay tuned…

My above post was dated 26 September.

Meanwhile, neuromorphic researcher Gerrit Ecke, who was the project lead at Mercedes-Benz for the NAOMI4Radar project and was also featured in the video referred to above, has left the German luxury carmaker and embarked on a new adventure as Principal Engineer AI / ML with Project Q, a defense-tech company founded in 2024 with offices in Munich and Berlin. Interesting startup…



AFCBFD75-21A2-4A06-BA9E-1F0B1F7A10D2.jpeg



880A3CBE-6998-49FC-8B57-04382CD8C1AD.jpeg


B8B8B16C-EF25-495D-945F-E0233700D717.jpeg





D3E8FAB5-C136-4F53-873F-1E3A2ECBF5EE.jpeg



 
  • Like
Reactions: 1 users

Frangipani

Top 20
EDHPC25, the European Data Handling & Processing Conference for space applications organised by ESA, is less than three weeks away.

Turns out BrainChip will not only be exhibiting, but that Gilles Bézard and Alf Kuchenbuch will also be presenting a two-part tutorial on 13 October:



View attachment 91405


Adam Taylor from Adiuvo Engineering, who recently posted about ORA, “a comprehensive sandbox platform for edge AI deployment and benchmarking” (whose edge devices available for evaluation also include Akida), will be presenting concurrently.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-474283

Three days later, on 16 October, ESA’s Laurent Hili will be chairing a session on AI Engines & Neuromorphic, which will include a presentation on Frontgrade Gaisler’s GRAIN product line, whose first device, the GR801 SoC, will - as we know - combine Frontgrade Gaisler’s NOEL-V RISC-V processor and Akida.

View attachment 91406



View attachment 91408

The session’s last speaker will be AI vision expert Roland Brochard from Airbus Defense & Space Toulouse, who has been involved in the NEURAVIS proposal with us for the past 18 months or so:


https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-429615

View attachment 91409


In April, Kenneth Östberg’s poster on GRAIN had revealed what NEURAVIS actually stands for: Neuromorphic Evaluation of Ultra-low-power Rad-hard Acceleration for Vision Inferences in Space.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-459275

View attachment 91410


It appears highly likely that Roland Brochard will be giving a presentation on said NEURAVIS proposal in his session, given that a paper co-authored by 15 researchers from all consortium partners involved (Airbus Toulouse, Airbus Ottobrunn, Frontgrade Gaisler, BrainChip, Neurobus, plus ESA) was uploaded to the conference website.

This paper has the exact same title as Roland Brochard’s EDHPC25 presentation, namely “Evaluation of Neuromorphic computing technologies for very low power AI/ML applications” (which is also almost identical to the title of the original ESA ITT, except that the words “…in space” are missing, cf. my July 2024 post above):


View attachment 91411


Full paper in next post due to the upload limitation of 10 files per post…

Daniel Andersson, Project Manager at Frontgrade Gaisler, will be the one presenting “GRAIN – towards event-based AI in space” at EDHPC 2025 tomorrow:


9688E9AA-5C50-4AD1-83CA-7FCED119920E.jpeg
 
Last edited:

Frangipani

Top 20
😳

Mercedes-Benz spins out Silicon Valley chip group into new company​




By Stephen Nellis

SAN FRANCISCO (Reuters) -Mercedes-Benz (MBG.DE) on Friday spun out into a new company a group of chip experts in Silicon Valley that is working on creating a new generation of computing brains for self-driving cars, drones and other vehicles.

Athos Silicon, based in Santa Clara, California, will house a group of engineers who for five years worked at Mercedes-Benz Research & Development North America to develop the new chips, which aim to be safe enough for use in cars while using less energy than existing chips.


As part of the spinout, Athos is receiving intellectual property developed by the group and what Mercedes-Benz described as a "significant" investment, though neither the carmaker nor Athos disclosed the value of the transaction.

For chips used in cars, reliability is key, so critical self-driving functions are often handled by two or more separate chips in order to have backups in case of a failure. The Athos team developed a way to get the same kind of reliability using "chiplets," which are tiny pieces of chips that can be bound together in a single package.

Keeping the chips in a single package can use 10 to 20 times less power than having separate chips that must communicate with one another across a circuit board, Athos Silicon Chief Executive Charnjiv Bangar said in an interview on Friday. Those power savings are important in electric vehicles where the car's computing brains must compete with its wheels for limited battery power.


"For an electric future, electricity is a new currency," Bangar said.

Athos Silicon intends to raise venture capital from other investors. Bangar declined to disclose Mercedes-Benz's precise stake, but said the carmaker will be a minority shareholder and the chip firm will have an independent board.

"Independence is important for Athos, so that we can reach out to other (carmakers), competitors of Mercedes. We need to make sure we have a neutral approach," Bangar said.

(Reporting by Stephen Nellis in San Francisco; Editing by Nick Zieminski)


According to Athos Silicon Co-Founder and former Mercedes-Benz North America mSoC Chief Architect François Piednoël, Akida does not pass the minimum requirements for Level 3 or Level 4 automated driving.


999663B0-3FD9-43F6-88D1-DB2DF6E4D0B6.jpeg
 