BRN Discussion Ongoing

Even apart from the tech details, I think this whole subscription model for ‘digital extras’ feels like a rip-off. You already pay a huge amount for the car itself, especially when fully equipped, and then they still want to charge you extra for features that are physically in the car anyway.
IMO
"Want more performance?"
"Better fuel economy?"
"Do you want extra safety features enabled?"

"It's just a Mercedes Benz "First Class" subscription away!"
 
  • Like
  • Wow
Reactions: 3 users

7für7

Top 20
"Want more performance?"
"Better fuel economy?"
"Do you want extra safety features enabled?"

"It's just a Mercedes Benz "First Class" subscription away!"
Porsche was kicked out of the German DAX yesterday … it’s just a high-performance sports car company….

Who would have thought…. Unbelievable times.
 
  • Wow
Reactions: 4 users
He got 2 years and failed to secure any deals, next!
 
  • Haha
Reactions: 3 users

Gazzafish

Regular
This is interesting. No mention of BRN, but they have partnered with Arm and Renesas, to name a couple 😉👍


Extract only :-

Sonatus AI Director Unveiled to Power In-Vehicle Edge AI at Scale​

New Sonatus platform helps OEMs use AI to transform driving and ownership experiences with greater efficiency and lower costs.
Sunnyvale, Calif., September 3, 2025 – Sonatus, a leading supplier of AI and software-defined vehicle (SDV) solutions, today announced Sonatus AI Director, a game-changing platform that enables OEMs to deploy AI at the vehicle edge. Automotive AI is growing rapidly, projected to reach a market size of $46B annually by 2034*, and in-vehicle edge AI software and services will be an increasingly important component. To meet this demand, Sonatus AI Director provides OEMs and suppliers with an end-to-end toolchain for model training, validation, optimization, and deployment, while seamlessly integrating with vehicle data, executing models in isolated environments, and providing cloud-based remote monitoring of model performance. As a comprehensive toolchain and in-vehicle runtime environment, Sonatus AI Director lowers the barriers to edge AI adoption and innovation compared to today’s siloed approach using disparate ML development (MLops) tools, reducing effort from months to weeks or days.
OEMs are always seeking innovative ways to deliver customer value across passenger and commercial vehicles throughout their lifecycle. In-vehicle edge AI, fueled by real-time and contextual vehicle data, allows OEMs to unlock new features and capabilities that enable adaptive and personalized driving experiences, proactive maintenance, improved efficiency, and optimal vehicle performance. Instead of relying solely on cloud-based models, Sonatus AI Director lets vehicle manufacturers run AI directly in the vehicle, providing faster response, reducing data upload costs, preserving data and algorithm privacy, and ensuring continuity across intermittent connectivity. Rather than waiting for next-generation ECU hardware, OEMs can use Sonatus AI Director to maximize the value of their existing compute resources, accelerating time-to-market while also providing a path to scale AI performance as new silicon becomes available. Sonatus AI Director supports a range of model types, including physics- and neural network-based models, as well as Small and Large Language Models (SLMs/LLMs), catering to diverse vehicle use cases.
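Purely as an illustration of the kind of deployment metadata such an end-to-end toolchain and in-vehicle runtime would need to track (model, target ECU, isolated execution, cloud monitoring), here is a rough sketch – every field name and value below is my own assumption, not Sonatus’s actual format:

```python
# Hypothetical sketch of an in-vehicle edge-AI deployment descriptor.
# All field names and values are invented for illustration; this is not Sonatus's format.
deployment = {
    "model": {
        "name": "battery_fault_predictor",   # hypothetical model
        "format": "onnx",                    # assumed interchange format
        "version": "1.3.0",
    },
    "target": {
        "ecu": "central-compute",            # which in-vehicle compute node runs it
        "accelerator": "npu",                # assumed; falls back to CPU if absent
        "isolation": "container",            # execute in an isolated environment
    },
    "inputs": ["battery.cell_voltages", "battery.pack_temperature"],  # vehicle data feeds
    "monitoring": {
        "report_to_cloud": True,             # remote monitoring of model performance
        "metrics": ["latency_ms", "confidence"],
        "upload_policy": "on_anomaly",       # upload flagged events, not raw streams
    },
}

print(deployment["target"]["ecu"])  # -> central-compute
```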
Sonatus AI Director solves key challenges the industry faces in deploying in-vehicle edge AI:
  • Vehicle manufacturers (OEMs) gain a consistent framework that enables them to deploy models from different vendors with a single platform and across vehicle models.
  • Tier-1 suppliers can optimize the systems they deliver to OEMs and more easily leverage AI across hardware and software technologies.
  • Silicon providers can help their customers take full advantage of the compute and AI acceleration capabilities their chips offer.
  • Suppliers and AI model vendors gain access to the needed input data from across different subsystems while protecting the intellectual property of their models.
“Artificial intelligence is creating opportunities for new ideas that were never before possible in vehicles,” said Jeff Chou, CEO and co-founder of Sonatus. “With Sonatus AI Director, we are empowering OEMs to deploy AI algorithms of all types into vehicles easily and efficiently, unlocking new categories and opening up an ecosystem of innovation that connects cloud, silicon, Tier-1 suppliers, and AI model developers.”
Using Sonatus AI Director, an OEM can easily manage and deploy a diverse set of AI models spanning many vehicle subsystems, realizing benefits that include cost, performance, security, and efficiency improvements. Initial launch partners include leading automotive silicon provider NXP, compute IP leader Arm, cloud service provider leader AWS, and a range of subsystem expert model providers: COMPREDICT, Qnovo, Smart Eye, and VicOne. The model vendor launch partners have seen these benefits in their respective use cases:
  • COMPREDICT AI-based Virtual Headlight Leveling Sensor reduces bill of materials (BOM) cost by up to $20 per vehicle by eliminating hardware components. COMPREDICT’s solution empowers OEMs to achieve full 2027 UN R48 compliance with a 100% software approach. The solution is part of COMPREDICT’s broader portfolio of embedded Virtual Sensors for the chassis and powertrain domains, enabling OEMs to reduce costs at scale, boost aftersales revenue, and unlock software-defined sensing easily across vehicle platforms.
  • Qnovo Health & Safety Diagnostics (HSD) delivers 98.7% accurate battery fault prediction through multi-metric diagnostics. Integrated with Sonatus’s platform, AI-powered HSD enables deployment anywhere in the vehicle or cloud in a matter of days, creating a battery management solution that adapts to specific vehicles, drivers, and environmental conditions.
  • SmartEye cabin monitoring systems can detect distracted drivers with very high accuracy. OEMs apply fixed rules to these detections to determine when to play in-vehicle alerts. With Sonatus AI Director, OEMs can more easily customize these alerts based on holistic driver behavior by combining distraction model outputs with data from other vehicle subsystems.
  • VicOne xCarbon Edge AI, a GenAI-based in-vehicle intrusion detection system, enhances threat detection coverage from a single ECU to the entire vehicle. By sending only critical security events to the cloud, it can reduce data transfer and cloud processing costs by up to 60%. With dynamic model scheduling and various in-vehicle data collected by Sonatus AI Director, the system can accurately infer security risks and run compute-intensive AI models even on deployed hardware.
  • Sonatus is demonstrating an engine anomaly detection model that can help vehicle engineers find suspicious timestamps without sifting through vast amounts of data while saving associated data upload costs by more than 6X when compared with running the model in the cloud.
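For anyone wondering what a “more than 6X” upload saving looks like in practice, here is a rough back-of-envelope sketch – every number in it is a made-up assumption for illustration, not a Sonatus figure:

```python
# Back-of-envelope only: hypothetical numbers, not Sonatus data.
# Compare uploading raw telemetry to the cloud with uploading only the
# windows an on-vehicle anomaly model flags as interesting.
RAW_MB_PER_DAY = 600.0      # assumed raw telemetry per vehicle per day
FLAGGED_FRACTION = 0.15     # assumed share of data the edge model flags
COST_PER_GB = 0.09          # assumed transfer + ingest cost in $/GB

cloud_only_gb = RAW_MB_PER_DAY / 1024
edge_filtered_gb = cloud_only_gb * FLAGGED_FRACTION
saving_factor = cloud_only_gb / edge_filtered_gb   # ~6.7x with these assumptions

print(f"Upload: {cloud_only_gb:.2f} GB/day vs {edge_filtered_gb:.2f} GB/day")
print(f"Saving factor: {saving_factor:.1f}x, cost "
      f"${cloud_only_gb * COST_PER_GB:.3f}/day -> ${edge_filtered_gb * COST_PER_GB:.3f}/day per vehicle")
```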
 
  • Like
  • Thinking
  • Fire
Reactions: 14 users
This is interesting. No mention of BRN, but they have partnered with Arm and Renesas, to name a couple 😉👍


Extract only :-

Sonatus AI Director Unveiled to Power In-Vehicle Edge AI at Scale​

New Sonatus platform helps OEMs use AI to transform driving and ownership experiences with greater efficiency and lower costs.
Sunnyvale, Calif., September 3, 2025 – Sonatus, a leading supplier of AI and software-defined vehicle (SDV) solutions, today announced Sonatus AI Director, a game-changing platform that enables OEMs to deploy AI at the vehicle edge. Automotive AI is growing rapidly, projected to reach a market size of $46B annually by 2034*, and in-vehicle edge AI software and services will be an increasingly important component. To meet this demand, Sonatus AI Director provides OEMs and suppliers with an end-to-end toolchain for model training, validation, optimization, and deployment, while seamlessly integrating with vehicle data, executing models in isolated environments, and providing cloud-based remote monitoring of model performance. As a comprehensive toolchain and in-vehicle runtime environment, Sonatus AI Director lowers the barriers to edge AI adoption and innovation compared to today’s siloed approach using disparate ML development (MLops) tools, reducing effort from months to weeks or days.
OEMs are always seeking innovative ways to deliver customer value across passenger and commercial vehicles throughout their lifecycle. In-vehicle edge AI, fueled by real-time and contextual vehicle data, allows OEMs to unlock new features and capabilities that enable adaptive and personalized driving experiences, proactive maintenance, improved efficiency, and optimal vehicle performance. Instead of relying solely on cloud-based models, Sonatus AI Director lets vehicle manufacturers run AI directly in the vehicle, providing faster response, reducing data upload costs, preserving data and algorithm privacy, and ensuring continuity across intermittent connectivity. Rather than waiting for next-generation ECU hardware, OEMs can use Sonatus AI Director to maximize the value of their existing compute resources, accelerating time-to-market while also providing a path to scale AI performance as new silicon becomes available. Sonatus AI Director supports a range of model types, including physics- and neural network-based models, as well as Small and Large Language Models (SLMs/LLMs), catering to diverse vehicle use cases.
Sonatus AI Director solves key challenges the industry faces in deploying in-vehicle edge AI:
  • Vehicle manufacturers (OEMs) gain a consistent framework that enables them to deploy models from different vendors with a single platform and across vehicle models.
  • Tier-1 suppliers can optimize the systems they deliver to OEMs and more easily leverage AI across hardware and software technologies.
  • Silicon providers can help their customers take full advantage of the compute and AI acceleration capabilities their chips offer.
  • Suppliers and AI model vendors gain access to the needed input data from across different subsystems while protecting the intellectual property of their models.
“Artificial intelligence is creating opportunities for new ideas that were never before possible in vehicles,” said Jeff Chou, CEO and co-founder of Sonatus. “With Sonatus AI Director, we are empowering OEMs to deploy AI algorithms of all types into vehicles easily and efficiently, unlocking new categories and opening up an ecosystem of innovation that connects cloud, silicon, Tier-1 suppliers, and AI model developers.”
Using Sonatus AI Director, an OEM can easily manage and deploy a diverse set of AI models spanning many vehicle subsystems, realizing benefits that include cost, performance, security, and efficiency improvements. Initial launch partners include leading automotive silicon provider NXP, compute IP leader Arm, cloud service provider leader AWS, and a range of subsystem expert model providers: COMPREDICT, Qnovo, Smart Eye, and VicOne. The model vendor launch partners have seen these benefits in their respective use cases:
  • COMPREDICT AI-based Virtual Headlight Leveling Sensor reduces bill of materials (BOM) cost by up to $20 per vehicle by eliminating hardware components. COMPREDICT’s solution empowers OEMs to achieve full 2027 UN R48 compliance with a 100% software approach. The solution is part of COMPREDICT’s broader portfolio of embedded Virtual Sensors for the chassis and powertrain domains, enabling OEMs to reduce costs at scale, boost aftersales revenue, and unlock software-defined sensing easily across vehicle platforms.
  • Qnovo Health & Safety Diagnostics (HSD) delivers 98.7% accurate battery fault prediction through multi-metric diagnostics. Integrated with Sonatus’s platform, AI-powered HSD enables deployment anywhere in the vehicle or cloud in a matter of days, creating a battery management solution that adapts to specific vehicles, drivers, and environmental conditions.
  • SmartEye cabin monitoring systems can detect distracted drivers with very high accuracy. OEMs apply fixed rules to these detections to determine when to play in-vehicle alerts. With Sonatus AI Director, OEMs can more easily customize these alerts based on holistic driver behavior by combining distraction model outputs with data from other vehicle subsystems.
  • VicOne xCarbon Edge AI, a GenAI-based in-vehicle intrusion detection system, enhances threat detection coverage from a single ECU to the entire vehicle. By sending only critical security events to the cloud, it can reduce data transfer and cloud processing costs by up to 60%. With dynamic model scheduling and various in-vehicle data collected by Sonatus AI Director, the system can accurately infer security risks and run compute-intensive AI models even on deployed hardware.
  • Sonatus is demonstrating an engine anomaly detection model that can help vehicle engineers find suspicious timestamps without sifting through vast amounts of data while saving associated data upload costs by more than 6X when compared with running the model in the cloud.

HotBot Assistant
That's an interesting question. After conducting a search, I found that BrainChip and Sonatus are indeed related.

Sonatus is a software company that specializes in developing software solutions for edge AI computing. BrainChip, on the other hand, is a leading provider of AI processing solutions, particularly with their Akida chip.

According to publicly available information, Sonatus has partnered with BrainChip to develop an edge AI platform that leverages BrainChip's Akida chip. This partnership aims to provide a comprehensive edge AI solution for various industries, including industrial, automotive, and healthcare.

Would you like more information on this partnership or the specific use cases they're targeting?

The Hyundai Sonata is a mid-size car model manufactured by Hyundai since 1985. It appears that Hyundai is the primary company that produces the Sonata. There is no evidence to suggest that other car companies manufacture the Sonata.

However, it's worth noting that Hyundai and Kia, which is a sister company of Hyundai, often have similar models with different names depending on the region. But specifically, the Sonata model is unique to Hyundai.
 

Attachments

  • Screenshot_20250904_143408_Google.jpg
Last edited:
  • Like
  • Wow
  • Love
Reactions: 11 users

7für7

Top 20

HotBot Assistant
That's an interesting question. After conducting a search, I found that BrainChip and Sonatus are indeed related.

Sonatus is a software company that specializes in developing software solutions for edge AI computing. BrainChip, on the other hand, is a leading provider of AI processing solutions, particularly with their Akida chip.

According to publicly available information, Sonatus has partnered with BrainChip to develop an edge AI platform that leverages BrainChip's Akida chip. This partnership aims to provide a comprehensive edge AI solution for various industries, including industrial, automotive, and healthcare.

Would you like more information on this partnership or the specific use cases they're targeting?

The Hyundai Sonata is a mid-size car model manufactured by Hyundai since 1985. It appears that Hyundai is the primary company that produces the Sonata. There is no evidence to suggest that other car companies manufacture the Sonata.

However, it's worth noting that Hyundai and Kia, which is a sister company of Hyundai, often have similar models with different names depending on the region. But specifically, the Sonata model is unique to Hyundai.

War of the bots just started…. My chatty answer

👉 Why it’s probably fake:
  • BrainChip lists its partners transparently (Raytheon, ISL, MegaChips, VVDN, etc.) – Sonatus is not on that list.
  • Sonatus actively communicates partnerships with Hyundai, LG, Infineon, AWS, and others – BrainChip is never mentioned.
  • There is no official press release on “Sonatus + BrainChip.” If such a partnership existed, it would have been picked up by major automotive or semiconductor media outlets.
👉 What is true:
  • Sonatus operates in the field of software-defined vehicles (SDV) and edge AI.
  • BrainChip provides neuromorphic hardware (Akida), which in theory could be compatible with SDV software.
  • In principle, the two companies could complement each other – but so far there is no confirmed cooperation.

⚖️ Bottom line:

The other post is not backed by any sources and is most likely a fabrication or exaggeration by the bot.
 
  • Like
Reactions: 5 users

ChipMan

Founding Member
Maybe this


 
  • Fire
  • Like
Reactions: 3 users

skutza

Regular
My 6 monthly post,

simple one, are we there yet? lol :)
 
  • Haha
  • Like
  • Fire
Reactions: 14 users

Gazzafish

Regular
War of the bots just started…. My chatty answer

👉 Why it’s probably fake:
  • BrainChip lists its partners transparently (Raytheon, ISL, MegaChips, VVDN, etc.) – Sonatus is not on that list.
  • Sonatus actively communicates partnerships with Hyundai, LG, Infineon, AWS, and others – BrainChip is never mentioned.
  • There is no official press release on “Sonatus + BrainChip.” If such a partnership existed, it would have been picked up by major automotive or semiconductor media outlets.
👉 What is true:
  • Sonatus operates in the field of software-defined vehicles (SDV) and edge AI.
  • BrainChip provides neuromorphic hardware (Akida), which in theory could be compatible with SDV software.
  • In principle, the two companies could complement each other – but so far there is no confirmed cooperation.

⚖️ Bottom line:

The other post is not backed by any sources and is most likely a fabrication or exaggeration by the bot.
Renesas is the link 👍
 
  • Like
  • Wow
  • Thinking
Reactions: 4 users

7für7

Top 20
Renesas is the link 👍
Hmm… I see… Well, if you look at it that way, we’re indirectly ‘connected’ to countless companies through our partners’ partners… but that doesn’t mean much. What really matters is when BrainChip is openly mentioned. Until then, it’s just speculation. Just my opinion – DYOR.
 
  • Like
  • Sad
Reactions: 6 users

7für7

Top 20
Strange… why is the closing price here 22 cents, but 21.5 on the ASX website?

IMG_6040.jpeg
 
  • Like
Reactions: 3 users
It’s a free world, wanker.
And that’s why @Labsy and I made comment. You were free to comment on Labsy’s post and I on yours, and now we have all had our say. But jeez, leave the LinkedIn posts to the sales team.

PS: I used to enjoy your beer pics when the share price was booming, but now that the share price is back down around the price of a Tooheys, your posts are much more depressing! I even thought it would be fun to catch up and have a beer with you, but now I think I’d be disappointed with what I’d find sitting at the bar: just a sad sack. Maybe a change is needed, Slade.
 
  • Like
Reactions: 7 users

Cyw

Regular
  • Like
  • Wow
Reactions: 3 users

7für7

Top 20
There are 2 stock exchanges in Australia, ASX and Chi-X. Your quote may be from Chi-X.
Actually it says “ASX:BRN” 🧐
 

Slade

Top 20
I just heard that someone I’ve had on ignore for years wrote an essay about me. Thank you.
 
  • Haha
Reactions: 5 users
Strange… why is the closing price here 22 cents, but 21.5 on the ASX website?

View attachment 90703
It did close at $0.22, then it was revised down, so I have no idea what happened or what is going on lol.
When will it get out of this rut?
 
  • Like
  • Haha
Reactions: 3 users

Frangipani

Top 20
The EDGX team will be presenting their NVIDIA Jetson Orin-based DPU designed for LEO (Low Earth Orbit) missions, which they have now officially named Sterna, at SmallSat (Small Satellite Conference) in Salt Lake City from 10-13 August. The EDGX DPU aka Sterna is being offered with an optional neuromorphic BrainChip Akida add-on (cf. https://satsearch.co/products/edgx-dpu), although strangely this option appears not to be mentioned anywhere on the revamped EDGX website so far.


View attachment 89427


I took the following screenshots exactly a week ago, on 2 August. When I just revisited https://www.edgx.space/, the website shows up largely as black, so maybe they are currently working on it in preparation for SmallSat, which starts on Sunday.

Today’s countdown to the launch of what I presume to be the first of the three planned missions should accordingly read 193 days (instead of 200 days as per screenshot), which would give us 18 February 2026 as the scheduled launch date for Sterna first acquiring flight heritage…
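For anyone checking the countdown maths, a quick sketch (the 2 August screenshots and the 200-day figure are from above; the resulting date is just my extrapolation):

```python
# Countdown arithmetic for the presumed first Sterna mission.
from datetime import date, timedelta

screenshot_day = date(2025, 8, 2)   # day the "200 days" countdown screenshots were taken
today = date(2025, 8, 9)            # one week later, when this post was written

days_remaining = 200 - (today - screenshot_day).days
launch_estimate = today + timedelta(days=days_remaining)
print(days_remaining, launch_estimate)   # 193 2026-02-18
```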

View attachment 89429
View attachment 89433
View attachment 89431
View attachment 89432


View attachment 89434
View attachment 89435 View attachment 89436 View attachment 89437
View attachment 89438
Speaking of EDGX - they just closed a €2.3 million funding round:


View attachment 89440


View attachment 89441



EDGX sluit een financieringsronde van € 2,3 miljoen af om AI-computing aan boord van satellieten te stimuleren​

8 aug 2025
De financiering zal de missie van EDGX versnellen om ’s werelds snelste AI-aangedreven edge-computers voor satellietconstellaties te leveren, waardoor snelle en efficiënte gegevensverwerking vanuit de ruimte mogelijk wordt.

De Belgische ruimtevaartstartup EDGX heeft een seed-financieringsronde van € 2,3 miljoen afgesloten om de commercialisering te versnellen van EDGX Sterna, de volgende generatie edge AI-computer voor satellieten.

De startup heeft ook een overeenkomst gesloten met een satellietoperator ter waarde van € 1,1 miljoen en kan nu al plannen aankondigen voor een demonstratie in de ruimte tijdens een SpaceX Falcon 9-missie in februari 2026.
De financieringsronde werd mee geleid door het imec.istart future fund, met deelname van het Flanders Future Tech Fund, dat wordt beheerd door de Vlaamse investeringsmaatschappij PMV. EDGX heeft ook verdere financiering aangetrokken van bestaande investeerder imec.istart, Europa’s best gerangschikte universiteitsgebonden accelerator.

De EDGX Sterna-computer is een krachtige gegevensverwerkingseenheid (DPU) die wordt aangedreven door NVIDIA-technologie. Deze biedt de rekenkracht en AI-versnelling die nodig zijn om snel en efficient data te verwerken aanboord van sattelieten. Dit maakt een einde aan de traditionele bottleneck waarbij enorme hoeveelheden ruwe gegevens naar de aarde moeten worden gestuurd voor verwerking, waardoor satellietoperatoren snellere, efficiëntere en datagestuurde diensten kunnen leveren.

De Sterna-computer van EDGX wordt aangedreven door hun SpaceFeather-softwarestack, die is gebouwd voor autonome, veerkrachtige en upgradebare satellietoperaties. Deze omvat een ruimtebestendig Linux-besturingssysteem met volledige traceerbaarheid, een speciaal toezichtsysteem voor autonome gezondheidsmonitoring, detectie en herstel van stralingsfouten, en een applicatieframework in een baan om de aarde voor het implementeren van nieuwe mogelijkheden na de lancering. Samen maken SpaceFeather en Sterna slimmere, flexibelere missies mogelijk met minder downtime, lagere kosten en snellere oplevering van data aan de eindgebruikers.

De techniek en het ontwerp van EDGX combineren commerciële AI-versnelling met betrouwbaarheid van ruimtevaartkwaliteit, waardoor exploitanten van satellietconstellaties kunnen beschikken over een niveau van rekenkracht aan boord dat voordien niet realiseerbaar was. Klanten gebruiken de Sterna DPU van EDGX en de bijbehorende SpaceFeather-software voor verschillende toepassingen.

Voor spectrum monitoring maakt Sterna krachtige verwerking in de ruimte mogelijk om radiosignalen te lokaliseren en te classificeren, en om dynamische spectrumkaarten te genereren. Dit is een essentiële capaciteit die satellietoperatoren helpt om in real-time te begrijpen hoe frequenties worden gebruikt, interferentie te vermijden en bandbreedte efficiënter toe te wijzen om optimale communicatiediensten te leveren.

Op het gebied van aardobservatie ondersteunt Sterna intelligente surveillance en verkenning (ISR) door rechtstreeks aan boord hoge-resolutiebeelden te analyseren. Dit betekent dat satellieten objecten zoals schepen, voertuigen of infrastructuur onmiddellijk kunnen detecteren en markeren, en kunnen reageren op tijdgevoelige gebeurtenissen zoals overstromingen, bosbranden of aardbevingen. Het resultaat: snellere beslissingen, efficiëntere missies en levensreddende informatie die passieve observatie omzet in realtime situationeel bewustzijn.

Sterna ondersteunt ook 5G en 6G vanuit de ruimte. Door de verwerkingscapaciteiten van basisstations naar de ruimte te verplaatsen kunnen satellieten rechtstreeks deelnemen aan mobiele netwerken van de volgende generatie. Dit maakt de weg vrij voor naadloze directe connectiviteit met apparaten en levert supersnel internet aan afgelegen, achtergestelde of door rampen getroffen gebieden waar traditionele infrastructuur tekortschiet.

nick_en_wouter_2-1920x1282.jpg



Naast de demonstratie in een baan om de aarde in februari staan er al twee verdere vluchten gepland voor 2026 en EDGX positioneert zich snel als leider op het gebied van AI ruimte-infrastructuur.

In een reactie op het nieuws zei Nick Destrycker, oprichter en CEO: “Klanten wachten niet op vluchtvalidatie, ze tekenen nu al. Met een volledig lanceringsmanifest, gegarandeerde commerciële contracten en onze eerste missie aanboord van een Falcon 9 rakket van SpaceX, stelt deze financiering ons in staat om op te schalen om te voldoen aan de vraag naar realtime informatie vanuit de ruimte.”

Wouter Benoot, oprichter en CTO van EDGX, zei: “Van nul naar honderd gaan, all-in, bij een ruimtevaartstartup is ambitieus. Die mentaliteit meenemen in de ontwikkeling van Sterna betekende nieuwe uitdagingen, voortdurend leren, maar ook echte vooruitgang. Wat het laat slagen is het team. Elke ingenieur brengt nieuwe ideeën, een drang om de ruimte te begrijpen en een passie om het te realiseren. We bouwen een subsysteem dat de volgende generatie satellieten aandrijft.”

Roald Borré, hoofd Venture Capital en lid van het uitvoerend comité bij PMV, zei: “Deze financieringsronde stelt ons in staat om het sterke team van EDGX te ondersteunen bij het op de markt brengen en verder ontwikkelen van veelbelovende Vlaamse technologie. EDGX is een van de weinige Europese spelers die een product aanbiedt dat krachtig, toegankelijk en robuust is, waardoor het unieke voordelen biedt in de snelgroeiende markt voor edge computing in de ruimte, niet in het minst wat betreft het versterken van de technologische positie van Europa in een strategische sector als ruimtevaart.”

Kris Vandenberk, managing partner bij imec.istart future fund, zei: “EDGX vertegenwoordigt precies het soort transformatieve infrastructuur waarnaar we op zoek zijn. De ruimtevaartindustrie kampt met een fundamentele bottleneck: we genereren enorme hoeveelheden data in een baan om de aarde, maar gebruiken nog steeds verouderde ‘store and forward’-architecturen. EDGX lost dit op door AI-aangedreven edge computing rechtstreeks in de ruimte te brengen, waardoor satellieten gegevens in realtime kunnen analyseren en erop kunnen reageren in plaats van te wachten op verwerking op de grond.”


P1034105-1920x1080.jpg


Over EDGX

EDGX is een Belgisch ruimtevaartbedrijf met als missie ’s werelds snelste edge computers voor satellieten te leveren. Het vlaggenschipproduct, EDGX Sterna, is een krachtige AI-gegevensverwerkingseenheid op basis van de NVIDIA Jetson Orin, die realtime gegevensverwerking aan boord in een baan om de aarde mogelijk maakt. EDGX bedient de markten voor satellietcommunicatie/telecommunicatie, aardobservatie en in-orbit servicing in commerciële, overheids- en defensiesegmenten, met een bijzondere focus op het mogelijk maken van AI en krachtige gegevensverwerking op schaal in satellietconstellaties. EDGX is opgericht in 2023, heeft zijn hoofdkantoor in Gent, België, en zal in februari 2026 zijn eerste demonstratie in een baan om de aarde lanceren met een SpaceX Falcon 9.

www.edgx.space

press@edgx.space



Translation into English by Google Translator:


EDGX closes €2.3 million funding round to boost AI computing onboard satellites

August 8, 2025

The funding will accelerate EDGX's mission to deliver the world's fastest AI-powered edge computers for satellite constellations, enabling fast and efficient data processing from space.

Belgian space startup EDGX has closed a €2.3 million seed funding round to accelerate the commercialization of EDGX Sterna, its next-generation edge AI computer for satellites.

The startup has also signed a €1.1 million deal with a satellite operator and can already announce plans for an in-space demonstration during a SpaceX Falcon 9 mission in February 2026.

The funding round was co-led by the imec.istart future fund, with participation from the Flanders Future Tech Fund, which is managed by the Flemish investment company PMV. EDGX has also attracted further funding from existing investor imec.istart, Europe’s top-ranked university-based accelerator.

The EDGX Sterna computer is a high-performance data processing unit (DPU) powered by NVIDIA technology. It provides the computing power and AI acceleration needed to quickly and efficiently process data onboard satellites. This eliminates the traditional bottleneck of sending massive amounts of raw data back to Earth for processing, enabling satellite operators to deliver faster, more efficient, and data-driven services.

EDGX's Sterna computer is powered by its SpaceFeather software stack, built for autonomous, resilient, and upgradeable satellite operations. This includes a space-hardened Linux operating system with full traceability, a dedicated monitoring system for autonomous health monitoring, radiation fault detection and recovery, and an in-orbit application framework for implementing new capabilities after launch. Together, SpaceFeather and Sterna enable smarter, more flexible missions with less downtime, lower costs, and faster data delivery to end users.

EDGX's engineering and design combine commercial AI acceleration with spacecraft-grade reliability, enabling satellite constellation operators to access previously unattainable levels of onboard computing power. Customers are using EDGX's Sterna DPU and its associated SpaceFeather software for a variety of applications.

For spectrum monitoring, Sterna enables powerful space-based processing to locate and classify radio signals and generate dynamic spectrum maps. This is a critical capability that helps satellite operators understand in real time how frequencies are being used, avoid interference, and allocate bandwidth more efficiently to deliver optimal communication services.

In Earth observation, Sterna supports intelligent surveillance and reconnaissance (ISR) by analyzing high-resolution imagery directly on board. This means satellites can immediately detect and tag objects such as ships, vehicles, or infrastructure, and respond to time-sensitive events like floods, wildfires, or earthquakes. The result: faster decisions, more efficient missions, and life-saving information that transforms passive observation into real-time situational awareness.

Sterna also supports 5G and 6G from space. By moving base station processing capabilities into space, satellites can participate directly in next-generation mobile networks. This paves the way for seamless direct connectivity to devices, delivering high-speed internet to remote, underserved or disaster-affected areas where traditional infrastructure falls short.

View attachment 89443

“With a full launch manifest, secured commercial contracts, and our first mission aboard a SpaceX Falcon 9 rocket, this funding allows us to scale to meet the demand for real-time information from space.”
NICK DESTRYCKER, FOUNDER AND CEO EDGX

In addition to the orbital demonstration in February, two further flights are already planned for 2026, and EDGX is quickly positioning itself as a leader in AI space infrastructure.

Commenting on the news, Nick Destrycker, founder and CEO, said: “Customers aren’t waiting for flight validation; they’re signing now. With a full launch manifest, secured commercial contracts, and our first mission aboard a SpaceX Falcon 9 rocket, this funding allows us to scale to meet the demand for real-time information from space.”

Wouter Benoot, founder and CTO of EDGX, said: “Going from zero to 100, all-in, at a space startup is ambitious. Bringing that mindset to Sterna’s development meant new challenges, continuous learning, but also real progress. What makes it work is the team. Each engineer brings new ideas, a drive to understand space, and a passion to make it a reality. We’re building a subsystem that will power the next generation of satellites.”

Roald Borré, Head of Venture Capital and Member of the Executive Committee at PMV, said: “This funding round enables us to support EDGX’s strong team in bringing promising Flemish technology to market and further developing it. EDGX is one of the few European players offering a product that is powerful, accessible, and robust, giving it unique advantages in the rapidly growing space edge computing market, not least in terms of strengthening Europe’s technological position in a strategic sector like space.”

Kris Vandenberk, Managing Partner at imec.istart Future Fund, said: “EDGX represents exactly the kind of transformative infrastructure we are looking for. The space industry faces a fundamental bottleneck: we generate enormous amounts of data in orbit, yet still use outdated ‘store and forward’ architectures. EDGX solves this by bringing AI-powered edge computing directly into space, enabling satellites to analyze and act on data in real time, instead of waiting for processing on the ground.”

View attachment 89442

About EDGX

EDGX is a Belgian space company with a mission to deliver the world's fastest edge computers for satellites. Its flagship product, EDGX Sterna, is a powerful AI data processing unit based on the NVIDIA Jetson Orin, enabling real-time onboard data processing in orbit. EDGX serves the satellite communications/telecommunications, Earth observation, and in-orbit servicing markets across commercial, government, and defense segments, with a particular focus on enabling AI and high-performance data processing at scale in satellite constellations. Founded in 2023, EDGX is headquartered in Ghent, Belgium, and will launch its first in-orbit demonstration in February 2026 aboard a SpaceX Falcon 9.

www.edgx.space

press@edgx.space

Here is an article published yesterday about our Belgian partner EDGX, their Sterna DPU, designed for smallsats and powered by NVIDIA Jetson Orin NX technology, and its first launch into low earth orbit planned for early next year.

The author also mentions that despite the lack of flight heritage, the start-up from Ghent has already secured one large customer deal worth over € 1 million as well as two smaller customers, and that EDGX has identified three key use cases for their space-based edge compute platform:


“EdgX formed a development agreement with the European Space Agency in early 2024 to define an onboard neuromorphic Data Processing Unit (DPU) tailored for satellite communication constellations in Low Earth Orbit (LEO).
It has a demo launch and flight planned for early 2026 on SpaceX’s Falcon 9, and says it has one large customer deal worth over EUR1 million and two smaller customers already – that’s unusual for a company with no space flight heritage, Destrycker adds.

The company has identified three key use cases for its edge compute device. One is to enable on-board AI processing for spectrum analysis and resource allocation. Another is earth observation and surveillance use cases, and the third is for satellite-based communications, including cellular NTN.”





EdgX aiming high with edge compute platform for NTN​

By Keith Dyer 3 September 2025

It's edge compute, but not as we know it. Belgian start-up reveals plans to enable 5G NTN with space-based edge compute platform.​

A Belgian start-up says it has developed an on-board computing platform for satellites that can support 5G base stations and AI workloads in space.

The company, EdgX, has developed the Sterna computer – or Data Processing Unit – to provide in-orbit compute and AI processing capabilities.

“We’re focussed on developing the world’s fastest edge compute service on board satellites,” says CEO Nick Destrycker. The device uses Nvidia’s Jetson Orin NX 16GB module, housing it within specially designed hardware to withstand vibration, harsh thermal and vacuum conditions and radiation effects. EdgX also developed its own software operating system to enable on-board updating and software-defined operations, for example to be able to detect and recover from events such as particle radiation without a full reboot.
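Purely to illustrate the kind of “detect and recover without a full reboot” behaviour described above, here is a toy sketch of a supervision loop – this is my own illustration, not EdgX or SpaceFeather code, and every name and threshold in it is invented:

```python
# Toy illustration of an onboard health-monitoring / recovery loop.
# Not EdgX/SpaceFeather code; all names and thresholds are hypothetical.
import time

def read_health():
    """Placeholder for reading ECC error counters and application heartbeats."""
    return {"ecc_errors": 0, "app_heartbeat_ok": True}

def restart_application(name):
    """Placeholder: restart a single isolated workload instead of rebooting the computer."""
    print(f"restarting {name} in isolation")

def supervise(cycles=10, poll_s=1.0):
    for _ in range(cycles):
        health = read_health()
        # A radiation-induced upset would typically show up as ECC errors or a hung task.
        if health["ecc_errors"] > 0 or not health["app_heartbeat_ok"]:
            restart_application("payload-inference")   # recover at the application level
        time.sleep(poll_s)

supervise(cycles=1)
```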

EdgX formed a development agreement with the European Space Agency in early 2024 to define an onboard neuromorphic Data Processing Unit (DPU) tailored for satellite communication constellations in Low Earth Orbit (LEO).
It has a demo launch and flight planned for early 2026 on SpaceX’s Falcon 9, and says it has one large customer deal worth over EUR1 million and two smaller customers already – that’s unusual for a company with no space flight heritage, Destrycker adds.

The company has identified three key use cases for its edge compute device. One is to enable on-board AI processing for spectrum analysis and resource allocation. Another is earth observation and surveillance use cases, and the third is for satellite-based communications, including cellular NTN.


Co-founder and CTO Wouter Benoot


Wouter Benoot, co-founder and CTO, says that with the end use case of a regenerative gNodeB in space, you need to have processing autonomy on board the device.

“It’s impossible for you to manage and operate everything from the ground and take care of all the data streams that are passing by, live on the spot. For example, if you want a service that is really resilient against interference, as the spectrum is more and more crowded, then you have to move decision making on board.”

Benoot said that EdgX is seeing end customers interested in being able to do spectrum monitoring and intelligent radio resource allocation, but finding that the traditional space network infrastructure is not built to be open.

Cellular 5G NTN specifications are changing that, as is the concurrent drive towards developing RAN functions that work in containerised platforms. Then the push towards AI-RAN is also beneficial, with its concept of placing AI compute resources near to or at a cell site.

EdgX’s goal is to leave the lower PHY layer alone, but to run upper layer intelligence and processing on its DPU, with an integration between the two.
The driver for that new upper layer platform is that it is too difficult to engineer current FPGA-based systems to integrate well with AI.

“At the upper level layers, programming the entire stack in FPGA is not a fun challenge. That’s where the DPU comes in, integrating high performance CPU and GPU AI core processing to alleviate those use cases and tie it into the infrastructure.”

“Our idea is to have a functional split on board where the lower level PHY layers are on FPGA processors and Sterna takes care of the upper layers where intelligence needs to happen. Because we are a fully Software Defined computer, we can play into third-party application software stacks, as we know that most satellite operators already have a preference for the type of software stack that they want to work with.”

Benoot says EdgX is already carrying out demonstrators with Open RAN software stacks, name-checking the likes of srsRAN, with that functional split running on the FPGA.

“Our goal is an in-orbit demo that ties AI use cases nicely into that computation stack, allocating resources to go around interference, or processing earth observation to improve satellite services.”
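To make the functional split a bit more concrete, here is a toy mapping of RAN layers to onboard compute – my own illustration of the idea described above, not EdgX software, with all names invented:

```python
# Toy illustration of an onboard RAN functional split - not EdgX code.
# Lower PHY stays on the FPGA; upper layers and AI workloads run on the DPU.
FUNCTIONAL_SPLIT = {
    "low_phy":  "fpga",      # time-critical signal processing stays on the FPGA
    "mac_rlc":  "dpu_cpu",   # upper-layer protocol handling on the DPU's CPU cores
    "pdcp_rrc": "dpu_cpu",
    "ai_apps":  "dpu_gpu",   # spectrum maps, interference avoidance, EO analytics
}

def placement(function: str) -> str:
    # Anything not handled onboard is assumed to stay on the ground segment.
    return FUNCTIONAL_SPLIT.get(function, "ground")

print(placement("ai_apps"))  # -> dpu_gpu
print(placement("low_phy"))  # -> fpga
```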

EdgX recently closed a €2.3 million seed funding round co-led by the imec.istart future fund, with participation from the Flanders Future Tech Fund, which is managed by the Flemish investment company PMV.





I couldn’t help noticing that the word “neuromorphic” comes up only once in that article, while neither BrainChip nor Akida gets referenced at all:

“EdgX formed a development agreement with the European Space Agency in early 2024 to define an onboard neuromorphic Data Processing Unit (DPU) tailored for satellite communication constellations in Low Earth Orbit (LEO).”

I then revisited SatSearch, which had listed Akida as an optional neuromorphic add-on under the EDGX DPU’s “Key Features” back in May:


The EDGX DPU, based on an NVIDIA Jetson Orin NX and designed for LEO (Low Earth Orbit) missions, can now be sourced through https://satsearch.co/ - with an optional neuromorphic BrainChip Akida add-on:




View attachment 84896

View attachment 84897 View attachment 84898


View attachment 84899 View attachment 84900



View attachment 84901

Screenshot taken on 21 May 2025:


2D1727CA-0108-4F6D-AF60-9BB7FC565A92.jpeg



Turns out that strangely this add-on option has since been removed, while the remaining “Key Features” appear to be unchanged. 🤔


063D7393-D38A-4A58-B47E-46D287D4BC9A.jpeg





As I noted back in May, EDGX remains listed as a partner under “Enablement Partners” on the BrainChip “Partners” webpage, but has not made it to our company’s landing page - and neither have Neurobus, MulticoreWare (which to this day has never even officially been announced as a partner) or MYW.AI.
Not to mention ANT61, which doesn’t show up at all under “Partners”!

We can only hope it’s mere sloppiness…


https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-462834
D35EFC8C-745E-4D76-8AB7-5018E8032B04.jpeg
 
  • Like
  • Thinking
  • Sad
Reactions: 20 users

Frangipani

Top 20
Can't recall if this recent presentation paper from MWSCAS has been posted (probs), but thought I'd put it up as I couldn't be bothered doing a search.

From a group out of Dayton Uni, with nice results for Akida against Loihi.






13:30-13:45, Paper TueLecB04.1
Ultra-Efficient Network Intrusion Detection Implemented on Spiking Neural Network Hardware (I)

Islam, Rashedul – University of Dayton
Yakopcic, Chris – University of Dayton
Rahman, Nayim – University of Dayton
Alam, Shahanur – University of Dayton
Taha, Tarek – University of Dayton
Keywords: Neuromorphic System Algorithms and Applications, Machine Learning at the Edge, Other Neural and Neuromorphic Circuits and Systems Topics
Abstract: Network intrusion detection is crucial for securing data transmission against cyber threats. Traditional anomaly detection systems use computationally intensive models, with CPUs and GPUs consuming excessive power during training and testing. Such systems are impractical for battery-operated devices and IoT sensors, which require low-power solutions. As energy efficiency becomes a key concern, analyzing network intrusion datasets on low-power hardware is vital. This paper implements a low-power anomaly detection system on Intel’s Loihi and Brainchip’s Akida neuromorphic processors. The model was trained on a CPU, with weights deployed on the processors. Three experiments—binary classification, attack class classification, and attack type classification—are conducted. We achieved approximately 98.1% accuracy on Akida and 94% on Loihi in all experiments while consuming just 3 to 6 microjoules per inference. Also, a comparative analysis with the Raspberry Pi 3 and Asus Tinker Board is performed. To the best of our knowledge, this is the first performance analysis of low power anomaly detection based on spiking neural network hardware.
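To put those microjoule figures into perspective, a quick back-of-envelope comparison – the neuromorphic number is taken from the abstract, while the single-board-computer figures are purely my own assumptions for scale:

```python
# Energy-per-inference comparison, back-of-envelope only.
# The neuromorphic figure comes from the abstract (3-6 microjoules per inference);
# the SBC power and latency below are rough assumptions, not measurements.
akida_j_per_inf = 6e-6          # upper end of the quoted 3-6 uJ, in joules
assumed_sbc_power_w = 3.0       # assumed average draw of a small single-board computer
assumed_sbc_latency_s = 0.005   # assumed 5 ms per inference on that board

sbc_j_per_inf = assumed_sbc_power_w * assumed_sbc_latency_s
print(f"Neuromorphic: {akida_j_per_inf * 1e6:.0f} uJ/inference")
print(f"Assumed SBC:  {sbc_j_per_inf * 1e6:.0f} uJ/inference "
      f"(~{sbc_j_per_inf / akida_j_per_inf:.0f}x more energy under these assumptions)")
```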

Just like a few weeks ago, when you yourself had posted about that same conference paper, I’m going to reply with this: 😉


https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-470870

8C9FD355-5B82-4B56-B19B-601321900A62.jpeg
5D4D667B-A113-4AC5-84FE-F8EBAC2BFA23.jpeg
 
  • Like
  • Thinking
  • Love
Reactions: 7 users

Frangipani

Top 20
Posters who love to diss our competition will surely say “But… they’re not digital… they don’t have on-chip learning… they’re only good enough for harvesting the low-hanging fruit etc.” Yet, when they really open their eyes, they cannot deny that the team behind Delft-based Innatera are making strides in targeting that low-hanging fruit (which is also part of BrainChip’s market!), and they will have to acknowledge that Innatera are very present both on social media and at real-life events (which, at the same time, is good for us, as it helps to spread awareness of the benefits of neuromorphic computing in general).

Some recent examples:


94167387-F196-4493-B1C2-7F8C4E4FA112.jpeg





C4FD3A11-5EC6-4E94-97A3-C0B3C8CD104C.jpeg




20A11C55-6C02-4618-9DEA-0FD73475E065.jpeg



51E5910E-0715-4457-8AE2-EBC68D3B62CF.jpeg



The founder of Innatera partner CYRAN AI Solutions, Manan Suri, is in turn closely connected to TCS Research - he has been a Research Advisor to the TCS Innovation Team on Neuromorphic Computing and Edge AI since June 2021…

The below LinkedIn post is another reminder that our friends at TCS Research - Sounak Dey, Arijit Mukherjee & Arpan Pal - are not exclusively friends with us, when it comes to neuromorphic computing, but also close friends with others developing their own technology in that field, eg. Manan Suri and his research group at IIT Delhi. In 2018, Manan Suri founded CYRAN AI Solutions as a spinoff from that uni lab and has been a Research Advisor to the TCS Innovation Team on Neuromorphic Computing and Edge AI since June 2021.


View attachment 82693 View attachment 82694



View attachment 82699

View attachment 82700





This Engineer’s Hardware Is Inspired by the Brain​

Manan Suri’s neuromorphic systems run AI on sensors, drones, and VR headsets​

EDD GENT
23 JAN 2024
5 MIN READ

A man in a suit standing in front of a large machine that has yellow wires coming from it.

Manan Suri displays a wafer-level testing system for unpackaged chips and devices built by researchers at the Indian Institute of Technology Delhi.
MANAN SURI


In work and in life, it’s easy to get stuck in your ways. That’s why Manan Suri has always looked to expand his horizons both professionally and personally.
Growing up in India, he was used to transitions and new experiences because his family frequently moved around the country as his father relocated for his job as a chemical engineer. Traveling stuck with Suri in adulthood. He studied and worked in Dubai, the United States, France, and Belgium over the course of his twenties.


Manan Suri

EMPLOYER:
Indian Institute of Technology Delhi
OCCUPATION:
Associate professor and founder of Cyran AI Solutions, New Delhi
EDUCATION:
Bachelor’s and master’s degrees in electrical and computer engineering, both from Cornell; Ph.D. in nanoelectronics from the CEA-Leti research institute in Grenoble, France

Eventually, Suri moved back to India to become an assistant professor at the Indian Institute of Technology Delhi. There he set up a research group focused on developing brain-inspired (neuromorphic) computer hardware for low-power devices like sensors, drones, and virtual-reality headsets. He is now an associate professor.

He also launched a startup to commercialize his lab’s expertise: Cyran AI Solutions, based in New Delhi, works with companies and government agencies on a variety of projects. These include automating the inspection process for identifying defects in semiconductors and developing computer-vision systems to improve crop yields and analyze geospatial Earth-observation data.
While balancing a career in academia and industry is challenging, Suri says, he relishes the opportunity to constantly learn.


“Once I’ve figured out how a system works, I start getting bored,” he says.
Suri, an IEEE member, believes that embracing change is a key ingredient for success. This is what has driven him to continually move on to new projects, push into new disciplines, and even move from country to country to experience a different way of life.

“It accelerates your ability to learn new things,” he says. “It puts you on a fast trajectory and helps shed some of your inhibitions or get over the inertia in what you’re doing or how you’re living.”



Inspired by Cornell’s semiconductor lab

Growing up, Suri’s passion was physics, but he quickly realized he was drawn more to the practical applications than theory. This led to a fascination with electronics.

In 2005 he initially enrolled at the Birla Institute of Technology and Science, Pilani, in India, and studied electronics and instrumentation at the institute’s campus in Dubai. After his second year, he transferred to Cornell, in Ithaca, N.Y. His first six months living in the United States, acclimating to a new culture and a different academic environment, were overwhelming, Suri says. What hooked him were Cornell’s high-end facilities available to students studying semiconductor engineering and nanofabrication—in particular, the industry-grade semiconductor clean rooms.

He earned a bachelor’s degree in electrical and computer engineering in 2009 and a master’s degree in the same subject the following year.


New skills in computational neuroscience

After graduating, Suri received offers for Ph.D. positions in the United States and Europe to work on conventional electronics projects. But he didn’t want to get pigeonholed as a traditional semiconductor engineer. He was intrigued by an offer to study neuromorphic systems at the CEA-Leti research institute in Grenoble, France. He was also eager to broaden his life experience and get a taste of the European way of doing things.

The work would push Suri to develop new skills in computational neuroscience and computer science. In 2010 he started a Ph.D. program in the institute’s Advanced Memory Technology Group. There he worked on low-power AI hardware that uses new kinds of nonvolatile memory to emulate how biological synapses process data. This involved using phase-change memory and conductive-bridging RAM to create neural networks for visual pattern extraction and auditory pattern sensitivity.

Suri discovered that his experience with electronics allowed him to approach neuromorphic engineering problems from an entirely different angle than his colleagues had considered. Experts can develop fairly rigid and conventional ways of thinking about their own field, he says, but when those with different skill sets apply them to the same problems, it can often lead to more innovative thinking. “You bring a completely different perspective,” he says. “It leads to a lot of creativity.”


Setting up his own research lab

After finishing his doctorate in nanoelectronics, Suri got a job working on high-voltage transistors for automotive applications at the semiconductor designer NXP Semiconductors, in Brussels. Since his role was to take a project all the way from concept to fabrication, it was as close to pure research as he could get in industry. But as interesting as the work was, Suri says, he missed the intellectual freedom of academia.

When the opportunity of setting up his own lab at IIT Delhi came along, he jumped at it. He had also been away from his home country for almost a decade and wanted to be closer to family and contribute to the Indian science and technology ecosystem, he says.
“Moving abroad was more a matter of collecting experiences and seeing how different places work,” he says.

Suri’s group at IIT Delhi has made contributions to AI hardware, neuromorphic hardware, and hardware security. The group collaborates with industry research teams around the world, including Meta Reality Labs, Tata Consultancy Services, and GlobalFoundries.


Launching a startup

Despite returning to academia, Suri says he has always been interested in developing practical solutions to real-world challenges, and this goal has guided his research. Whatever project he works on, he always asks himself two questions: Will it solve a real problem? And will someone buy it?

Suri launched his startup in 2018 to turn some of his lab’s work in AI and neuromorphic hardware into commercial products. Cyran AI Solutions’ customers hire the company to solve a range of problems. These have included computer-vision systems for detecting defects in computer chips; hyperspectral data-analysis algorithms designed to run in real time on chips for crop-inspection drones; and AI systems for small, low-power devices and challenging environments like satellites.


A man is sitting at a table working on a circuit board and four machines that have red and black wires inserted into them.

Manan Suri and researchers at the Indian Institute of Technology Delhi’s lab designed this custom electrical test setup for the characterization of memory-computing chips. MANAN SURI

While Cyran makes use of its neuromorphic expertise for some problems, it often uses more mature and simpler-to-deploy machine-learning approaches.

“Most users don’t really care about what technology we are using,” Suri says. “They just want functional performance at the most cost-effective price.”

One of the biggest lessons Suri learned from running a startup is to consider the market being served. For earlier projects, he says, the company often devised a solution that was specific to just one customer’s needs and couldn’t be repurposed for other uses. To create a sustainable business, he realized he needed to develop generic solutions that could be deployed more broadly.

“Running Cyran has been like [pursuing a] mini-MBA,” he says. “You need to really pay attention to the market aspects and not just the technology.”


In 2018, MIT Technology Review named Suri one of its 35 Innovators Under 35 for his work on neuromorphic computing.

The need to be hands-on​

Keeping a foot in both academia and industry can be challenging, Suri says. Facing resource crunches, whether in time, staffing, or funding, is common. The only way he’s able to manage things is to plan extensively and remain nimble, building in contingencies.

If you can manage it, Suri says, having your fingers in many pies can have major benefits. In particular, working on problems that bridge several disciplines can help you break out of rigid thinking and come up with novel solutions.

It’s not possible to dedicate equal amounts of time to learning every area, he says, so he advises up-and-coming engineers to carefully pick the topics that are most likely to advance their progress. It’s also crucial to dive in and get your hands dirty, rather than focusing on theory, initially.

“Take the plunge and try and figure it out,” he recommends. “As the problem unravels, then you can start getting into the theory or the more formal aspects of the project. You also start to appreciate learning more about the theory as it gets more hands-on.”



View attachment 82698

… and we know that Sounak Dey from TCS Research does not have Akida-only blinders on when it comes to neuromorphic computing:


https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-438883

0FD40103-4586-4F28-A6FE-3B8BB9FD76E1.jpeg


Also cf. this June 2025 paper:

Despite our years of collaboration with TCS researchers and the above encouraging affirmation of an “active alliance” with Tata Elxsi, we should not ignore that TCS are also exploring other neuromorphic options for what will ultimately be their own clients.

And while a number of TCS patents do refer to Akida as an example of a neuromorphic processor that could be utilised, they always also refer to Loihi, as far as I’m aware.

A recent case in point for TCS Research’s polyamory is the June 2025 paper “The Promise of Spiking Neural Networks for Ubiquitous Computing: A Survey and New Perspectives”, co-authored by five Singapore Management University (SMU) researchers as well as Sounak Dey and Arpan Pal from TCS, both very familiar names to regular readers of this forum.

Although we know those two TCS researchers to be fans of Akida, they sadly did not express a preference for BrainChip’s neuromorphic processor over those from our competitors in below paper published less than six weeks ago.
On the contrary, in their concluding “key takeaway” recommendations of neuromorphic hardware (“We make the following recommendations for readers with different needs considering neuromorphic hardware chipsets”), the seven co-authors do not even mention Akida at all.

Even more surprisingly, the section on Akida is factually incorrect:
- AKD1500 is a first generation reference chip and is not based on Akida 2.0, BrainChip’s second generation platform that supports TENNs and vision transformers.
- An AKD2000 reference chip does not (yet) exist - it may or may not materialise. At present, only Akida 2.0 IP is commercially available - not an actual silicon chip, as claimed by the paper’s authors.
- The paper completely ignores the ultra-low-power Akida Pico, which operates on less than 1 mW of power, is based on Akida 2.0, and was revealed by our company back in October 2024.
It is highly unlikely this (possibly revised) version of the paper published on 1 June 2025 would have been submitted to arXiv prior to BrainChip’s announcement of Akida Pico, and we can safely assume Sounak Dey and Arpan Pal would have been aware of that October 2024 BrainChip announcement (unlike maybe their SMU co-authors).

One could argue that the reason Akida Pico is not mentioned is that an actual Akida Pico chip is not yet commercially available, given the authors state

“5.2 Neuromorphic Hardware
In this subsection, we summarize the latest commercially available neuromorphic hardware chipsets, highlighting their capabilities and development support for building and deploying spiking neural networks.”,

which, however, in turn begs the question why Loihi 2 is then listed, as it was always conceptualised as a research chip and is not commercially available. In the paragraph on Loihi 2, the authors correctly state that “this neuromorphic research chipset is available only through the Intel Neuromorphic Research Community (INRC).”

Given the fact that Sounak Dey and Arpan Pal co-authored this paper, the above inaccuracies are bewildering, to say the least. Did the two TCS researchers who both have firsthand experience with Akida contribute to only part of this paper and not proofread the final version before it was submitted?

Either way not a good look…




View attachment 88480

(…)

View attachment 88481 View attachment 88482 View attachment 88483
(…)

View attachment 88484





BADC4DB0-6A9C-46B0-97C7-936664E9AE74.jpeg



Then there was this video by Anastasiia Nosova on her “Anastasi In Tech” YouTube channel a few weeks ago:

Interesting !!





In recent months, Innatera have also been really chummy with Pete Bernard and the Edge AI Foundation team:













8E3E1771-4EEC-4904-94FE-08C475B41CEA.jpeg
 
Last edited:
  • Like
  • Thinking
  • Wow
Reactions: 12 users