BRN Discussion Ongoing

7für7

Top 20
Low volume today…
 
  • Thinking
Reactions: 2 users

Tothemoon24

Top 20
IMG_1712.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 21 users

manny100

Top 20
"Jesse ChapmanJesse Chapman • 3rd+3rd+Verification EngineerVerification Engineer1w • 1 week ago • Visible to anyone on or off LinkedIn
Follow

BrainChip & Parsons Corporation Join Forces to Bring Neuromorphic AI to the Tactical Edge

In a significant step for edge AI and defense technology, BrainChip Holdings Ltd (ASX: BRN) has announced a multi-year strategic partnership with Blue Ridge Envisioneering (BRE), a Parsons company (NYSE: PSN). The collaboration aims to integrate BrainChip’s Akida™ #neuromorphic technology into Parsons’ mission-ready edge-AI platforms, delivering real-time, low-power #intelligence in environments where cloud connectivity is limited or unavailable.

The partnership will leverage Akida’s event-based, energy-efficient architecture and on-device learning capabilities, enabling adaptive AI performance in constrained, dynamic defense scenarios. Notably, Parsons gains access to BrainChip’s AKD1500 processor, which recently commenced tape-out for volume production, positioning the chip for deployment in both defense and broader commercial applications."
The 'secret sauce' for Parsons is the neuromorphic on-device learning capability (see the quote above). That is why we are set for a big future in defense.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 22 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This is also an interesting development!

Airbus, Leonardo and Thales have now signed an MOU. They aren’t just collaborating. They’re actually planning to form a new combined company (a new European space player) with a target of being operational by 2027 (pending approvals).

As we know, BrainChip already has an entry point with Airbus. In an October 2024 quarterly update, BrainChip reported agreements amounting to €190,000 with Frontgrade Gaisler and Airbus Defence & Space, intended for ultra-low-power AI in space applications.

If Airbus adopts neuromorphic edge AI for a particular sensor role, even at small volume in the beginning, then once this new 2027 company exists, that architecture has a pathway not just to Airbus, but to Leonardo and Thales via this new company as well.

It says " The combined entity will employ around 25,000 people across Europe. With an annual turnover of about 6.5bn€ (end of 2024, pro-forma) and an order backlog representing more than three years of projected sales, this new company will form a robust, innovative and competitive entity worldwide.Ownership of the new company will be shared among the parent companies, with Airbus, Leonardo and Thales owning respectively 35%, 32.5% and 32.5% stakes. It will operate under joint control, with a balanced governance structure among shareholders."



Screenshot 2025-11-04 at 12.36.21 pm.png


 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 33 users

Newk R

Regular
I am not fluent in English but… could it be that there are several grammatical errors on that flyer? 👁️👄👁️

„mims nimos“, „ayschonnotus“ … „mimics“, „asynchronous“??

„instany / localy / rel-tiime“ … „instantly / locally / real-time“??

„devices that smart“ … „devices that are smart“.

„do’t“ … „don’t“??


View attachment 92668
Is this real or somebody's BS?
 

7für7

Top 20
Not even 20 🙄 I see the big news coming from tomorrow… any time now
 
  • Love
Reactions: 1 users

manny100

Top 20
We all know that devices can communicate with each other, e.g. I see my phone messages on my laptop. Edge chips in devices, whether neuromorphic or not, can communicate with each other if connected via some type of network, not necessarily the cloud. For edge devices the network would mostly not be cloud connected (why would you bother?).
This is where AKIDA shines with its on-chip learning capabilities, compared to other neuromorphic or traditional 'dumb' chips with no on-chip learning.
Chips in autos, robots etc. need to communicate. Robots in a team may need to send each other signals.
It's easy to see why Parsons sees our on-chip learning adaptability as so important. Great for space and commercial applications. Care would need to be taken with networks in defense, even though they will not be cloud connected.
It's also easy to see how AKIDA is well ahead of the pack.
I asked ChatGPT to summarise. It's common sense, but it saved me typing it.

Comparison Table: With vs. Without On-Chip Learning

Adaptability
  • Akida with on-chip learning: can learn new patterns locally (e.g., new anomalies, new commands) and immediately use them in communication.
  • Without on-chip learning: limited to pre-trained models; cannot adapt on their own and must be retrained in the cloud or offline, then redeployed.

Communication Content
  • Akida: sends newly learned events or adapted patterns to other devices (e.g., “new anomaly detected”).
  • Without: sends only pre-defined outputs from its fixed model (e.g., “class = 2”).

Collaboration
  • Akida: devices can share discoveries; one node learns something new and alerts the others, enabling system-wide adaptation.
  • Without: devices can only share fixed detections; if one encounters a new pattern, the others won't recognize it until all are updated externally.

Autonomy
  • Akida: high; devices can self-improve and coordinate without cloud or human intervention.
  • Without: lower; devices depend on external retraining and updates to handle new situations.

Resilience
  • Akida: ideal for remote or disconnected environments (spacecraft, autonomous vehicles, industrial plants) where cloud updates aren't possible.
  • Without: vulnerable in disconnected environments; performance degrades if new patterns appear that weren't in the original training.

Efficiency of Communication
  • Akida: communicates only meaningful, newly learned events, reducing bandwidth.
  • Without: communicates fixed outputs, which may include irrelevant or incomplete information if the model is outdated.

System Intelligence
  • Akida: creates a distributed, adaptive network, like a “nervous system” that learns and shares in real time.
  • Without: creates a static network; devices can coordinate, but only within the limits of their pre-trained knowledge.
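To make that comparison concrete, here is a minimal, hypothetical Python sketch (my own illustration, not BrainChip's MetaTF API): each adaptive node stores class prototypes it can extend at runtime, as a stand-in for Akida-style one-shot on-chip learning, and shares anything newly learned with its peers over whatever local link exists.

```python
from dataclasses import dataclass, field

def similarity(a, b):
    # Dot product as a toy stand-in for whatever distance the hardware uses.
    return sum(x * y for x, y in zip(a, b))

@dataclass
class AdaptiveNode:
    name: str
    prototypes: dict = field(default_factory=dict)  # label -> feature vector
    peers: list = field(default_factory=list)

    def classify(self, features):
        if not self.prototypes:
            return None
        return max(self.prototypes,
                   key=lambda label: similarity(self.prototypes[label], features))

    def learn(self, label, features):
        """Learn a new pattern locally, then share it: no cloud round-trip."""
        self.prototypes[label] = features
        for peer in self.peers:
            peer.prototypes.setdefault(label, features)  # system-wide adaptation

# Two nodes on the same local (non-cloud) network.
a = AdaptiveNode("drone-A")
b = AdaptiveNode("drone-B")
a.peers, b.peers = [b], [a]

a.learn("known-target", [1.0, 0.0, 0.0])
a.learn("new-anomaly", [0.0, 1.0, 0.0])   # seen only by drone-A...
print(b.classify([0.0, 0.9, 0.1]))        # ...yet drone-B now reports "new-anomaly"
```

A fixed-model node would be the same class with learn() removed: it could still classify and communicate, but only within whatever it shipped with.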
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 12 users

Iseki

Regular
“Sony invested in Brainchip”

Here might be why.

"

Sony IMX501: Vision and Intelligence in One Package​

Parallel DSP Cores for Efficient AI Inference​

The main engine for AI processing is the DSP subsystem core. This core is comprised of a High-Computation-Intensity (CI) DSP core, Standard-CI DSP core, and Tensor DMAs (TDMAs). The High-CI and Standard-CI DSPs work in parallel, executing neural network operations and moving the data directly into TDMA and then into L2 memory. Image and inference data are then transferred via the MIPI interface to the camera’s FPGA."

i.e. Sony cameras also come with an FPGA chip, which could run Akida 2.

While Akida may not yet be in the sensor chip (à la Prophesee), we might be next to it, performing deeper inferencing on the entire image.
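If that guess is right, the division of labour might look something like this purely illustrative Python sketch (the function names and the Akida-on-FPGA second stage are my assumptions, not a documented Sony/BrainChip integration): the on-sensor DSP returns a cheap coarse result, and a deeper model sitting next to the sensor is consulted only when needed.

```python
def on_sensor_inference(frame):
    """Stand-in for the IMX501 DSP subsystem: low-power, fixed on-sensor model."""
    return {"roi": frame["roi"], "coarse": "bird-like", "confidence": 0.62}

def second_stage_inference(metadata):
    """Stand-in for a deeper model (hypothetically Akida 2 on the camera's FPGA)."""
    if metadata["confidence"] < 0.8:          # escalate only uncertain frames
        return "duck" if metadata["coarse"] == "bird-like" else "unknown"
    return metadata["coarse"]

frame = {"pixels": None, "roi": (120, 80, 64, 64)}   # dummy frame
meta = on_sensor_inference(frame)                    # stage 1: on the sensor
print(second_stage_inference(meta))                  # stage 2: "duck"
```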

 
  • Like
  • Fire
Reactions: 7 users

BrainShit

Regular
“Sony invested in Brainchip”

Here might be why.

"

Sony IMX501: Vision and Intelligence in One Package​

Parallel DSP Cores for Efficient AI Inference​

The main engine for AI processing is the DSP subsystem core. This core is comprised of a High-Computation-Intensity (CI) DSP core, Standard-CI DSP core, and Tensor DMAs (TDMAs). The High-CI and Standard-CI DSPs work in parallel, executing neural network operations and moving the data directly into TDMA and then into L2 memory. Image and inference data are then transferred via the MIPI interface to the camera’s FPGA."

i.e. Sony cameras also come with an FPGA chip, which could run Akida 2.

While Akida may not yet be in the sensor chip (à la Prophesee), we might be next to it, performing deeper inferencing on the entire image.


Sony's IMX501 intelligent vision sensor contains its own proprietary parallel DSP cores for efficient AI inference and on-sensor processing, designed in-house, and uses a unique architecture to allow AI tasks to be run at the edge without external processors or cloud dependency.

Source: https://www.provideocoalition.com/sony-reveals-the-worlds-first-intelligent-vision-sensor-with-ai/
 
  • Like
  • Fire
Reactions: 3 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Weebit's post on LinkedIn 7 hours ago.

Screenshot 2025-11-04 at 7.48.50 pm.png
 
  • Like
  • Love
  • Thinking
Reactions: 16 users

Iseki

Regular
Sony's IMX501 intelligent vision sensor contains its own proprietary parallel DSP cores for efficient AI inference and on-sensor processing, designed in-house, and uses a unique architecture to allow AI tasks to be run at the edge without external processors or cloud dependency.

Source: https://www.provideocoalition.com/sony-reveals-the-worlds-first-intelligent-vision-sensor-with-ai/
Sure. There will be inferencing on two levels: on the pixels (can you make the image sharper?) and on the image on the FPGA chip (is it a duck?). My point is that Akida could come in at the second level.

I only say this because the Sony adverts show a lot of image-composition boxes.
 
Last edited:
  • Like
Reactions: 1 users

Diogenese

Top 20
This train is getting up steam:

 
  • Like
Reactions: 4 users

BrainShit

Regular
Sure. There will be inferencing on two levels: on the pixels (can you make the image sharper?) and on the image on the FPGA chip (is it a duck?). My point is that Akida could come in at the second level.

I only say this because the Sony adverts show a lot of image-composition boxes.

Right, Akida could operate well in the second category (classifying objects in an image after the pixel-level processing is done) on an FPGA ... but I really doubt that will happen.

Because the models used are specifically optimized for the memory and processing constraints of Sony IMX500/IMX501-equipped edge sensing devices... and Sony’s Brain Builder tooling (AITRIOS) is used in the workflows to train and deploy the targeted models.

In other words, I really believe they're using their own developments.
 
  • Like
  • Thinking
Reactions: 3 users

manny100

Top 20
Bascom Hunter.
The market is potentially huge.
" THE NAVY BENEFIT
The U.S. Navy can incorporate the technology developed into both modern and legacy weapons systems (via HOST and SOSA alignment), expect to outperform modern GPU systems running cutting edge algorithms; and require exceptionally low power usage. Moreover, this technology is scalable and, as mission needs evolve, super charging applications of interest to the Navy and the Department of Defense."
The point is that Bascom Hunter can integrate its AKIDA card straight into existing equipment once testing is finalised.
Parsons should be able to do the same, so once the chips have been delivered our wait may not be that long if they are initially for drones.
Manned craft will be longer of course.
My bold above.
 
Last edited:
  • Like
  • Fire
Reactions: 6 users

Diogenese

Top 20
" "Developing advanced technologies that can be easily adapted, implemented, and integrated into both existing and future systems is key to delivering reliable and resilient communications capabilities to the warfighter."
McLaina Mazzone, Science & Technology Assistant Program Manager, PEO C4I, PMW/A 170"
The above from the Navy Transition success story.
The point is that Bascom Hunter can integrate its AKIDA card straight into existing equipment once testing is finalised.
Parsons should be able to do the same, so once the chips have been delivered our wait may not be that long if they are initially for drones.
Manned craft will be longer of course.


Bascom Hunter Technologies developed an interference excision system (IES) solution, and its applications have gone far beyond the original intent. Unlike conventional radio frequency and digital interference mitigation or cancellation techniques, this technology provides previously unmatched levels of cancellation by using an electro-optic-based interference cancellation technique with more than 10,000 times the signal power range, all within a small package. This approach allows for the removal of strong interferers in the same channel as the signal of interest and allows for improved communications operations in a degraded or denied operating environment. This revolutionary effort paved the way for other interference excision devices and systems that target different types of interference for SATCOM systems. Additionally, it opens the door for new and advanced technologies and capabilities to improve resiliency to SATCOM systems beyond UHF.


I wonder if there is any overlap between the cancellation of unwanted electric signals and unwanted acoustic (hearing aid) signals.

Is it just a matter of having a model adapted for wireless signals?

Wireless signals are certainly within TENNs' wheelhouse.
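For what it's worth, both problems can be framed as classic adaptive interference cancellation, and the maths is agnostic about whether the samples are RF or audio. Here is a minimal NumPy sketch of a textbook LMS canceller (not Bascom Hunter's electro-optic method, and not a TENNs model): only the reference input and sample rate would differ between the SATCOM and hearing-aid cases.

```python
import numpy as np

def lms_cancel(primary, reference, num_taps=32, mu=0.01):
    """Subtract an adaptively filtered copy of `reference` from `primary`."""
    w = np.zeros(num_taps)            # adaptive FIR weights
    out = np.zeros_like(primary)      # error signal = cleaned output
    for n in range(num_taps, len(primary)):
        x = reference[n - num_taps:n][::-1]  # most recent reference samples first
        y = w @ x                            # current estimate of the interference
        e = primary[n] - y                   # residual after cancellation
        w += 2 * mu * e * x                  # LMS weight update
        out[n] = e
    return out

# Toy demo: a weak 440 Hz signal of interest buried under a strong 1 kHz interferer.
fs = 8000
t = np.arange(fs) / fs
signal = 0.1 * np.sin(2 * np.pi * 440 * t)
interferer = np.sin(2 * np.pi * 1000 * t)
reference = np.roll(interferer, 3)           # correlated reference pick-up
cleaned = lms_cancel(signal + interferer, reference)
print("residual interference power:", np.var(cleaned[1000:] - signal[1000:]))
```

So in principle, yes: the same adaptive structure, just a different model and front end.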
 
  • Like
  • Fire
  • Love
Reactions: 10 users

TECH

Regular
Who amongst us doesn't see us succeeding?

That's the question.

This sector isn't all about us and our technology, I understand that; for us to even think of dictating how the edge AI market will play out over the next three-plus years would be a totally arrogant view to hold, one which our company, I'd suggest, wouldn't even contemplate.

Unless we establish ourselves in this market on our own two feet with real, growing revenue streams, we are definitely a takeover target for a small number of contenders, ones we are all aware of. Keep an open mind, this share market bullshit will come to an abrupt end soon enough. Let's go Sean, it's time to giddy up! 🦄🦄
 
  • Like
  • Love
  • Fire
Reactions: 13 users

BrainShit

Regular
Who amongst us doesn't see us succeeding?

That's the question.

This sector isn't all about us and our technology, I understand that; for us to even think of dictating how the edge AI market will play out over the next three-plus years would be a totally arrogant view to hold, one which our company, I'd suggest, wouldn't even contemplate.

Unless we establish ourselves in this market on our own two feet with real, growing revenue streams, we are definitely a takeover target for a small number of contenders, ones we are all aware of. Keep an open mind, this share market bullshit will come to an abrupt end soon enough. Let's go Sean, it's time to giddy up! 🦄🦄

It is possible for BrainChip to be taken over in general, but such an acquisition would involve specific complexities due to national security concerns and government regulations... because BrainChip holds some U.S. military contracts.

Source 1: https://www.governmentcontractslawb...g-government-contractors-and-their-suppliers/
Source 2: https://www.nationaldefensemagazine...ould-see-another-wave-of-mergers-acquisitions
 
  • Like
Reactions: 2 users

Frangipani

Top 20
A couple of years before I became a BRN shareholder, I came across a company from Austria called g.tec medical engineering GmbH (https://www.gtec.at), founded in 1999 by Christoph Guger and Günter Edlinger as a spin-out of TU Graz, and one of their products, mindBEAGLE, which uses Brain-Computer-Interface (BCI) technology for assessing patients suffering from disorders of consciousness or locked-in syndrome; it can help with outcome prediction and even provides very basic communication with some of them.

Over the past 25+ years, g.tec medical engineering have specialised “in developing high performance brain-computer interfaces and neurotechnologies for both invasive and non-invasive recordings in research and clinical settings” (https://www.gtec.at/about/) and are one of the leading companies, if not THE leading company, in this field world-wide.

In a November 2024 interview (https://www.gtec.at/2024/11/04/leading-the-bci-field/), Co-Founder and Co-CEO Christoph Guger, who has degrees from both Johns Hopkins University and TU Graz, shared the following about g.tec medical engineering’s impressive journey:

“We sell our platform to many Universities like Harvard, Stanford, Yale, and of course, Johns Hopkins and have expanded to 100 countries around the world.

Besides that, the systems are used for technology developments by major industrial players like BMW, Airbus, Meta, Apple, Amazon, and many more. About 10 years ago we started with the development of medical products that we sell to hospitals and rehabilitation clinics.

We established a franchise system that allows businessmen and therapists to use neurotechnology in their centers to treat patients. With our recoveriX system for the neurorehabilitation of stroke patients and patients with Multiple Sclerosis, we are already in many countries, and up to about 50,000 treatments were done.”


They have since also teamed up with Tobii to offer integrated EEG and eye-tracking technology.

Most of you will be familiar with the term Brain-Computer-Interface (BCI) - sometimes also called Brain-Machine-Interface (BMI) or Mind-Machine-Interface (MMI) - but may not be fully aware what it actually means.

In a December 2015 publication, Christoph Guger (that’s, by the way, where the G in g.tec comes from - it stands for Guger Technologies) and two of his co-authors described a BCI as follows:

“A BCI is a device that reads voluntary changes in brain activity, then translates these signals into a message or command in real-time (…) Most BCIs rely on the electroencephalogram (EEG). These signals (also called “brainwaves”) can be detected with electrodes on the surface of the head. Thus, these “noninvasive” sensors can detect brain activity with very little preparation. Some BCIs are “invasive”, meaning that they require neurosurgery to implant sensors. These BCIs can provide a much more detailed picture of brain activity, which can facilitate prosthetic applications or surgery for epilepsy and tumor removal.”


The implants used in clinical trials by Neuralink (founded in 2016 by Elon Musk and a team of eight scientists and engineers) are the most well-known examples of invasive BCIs. And while we BRN shareholders tend to roll our eyes when our company’s silicon gets confused with Musk’s “brain chips”, there is no doubt that BrainChip’s technology is also being evaluated in this field of BCIs.


In 2020, g.tec medical engineering introduced the BCI & Neurotechnology Spring School, a free ten-day virtual event - now held annually - which has become the world’s largest neurotech event, orchestrated from a small town in Austria called Schiedlberg. Participants can access 140 hours of cutting-edge education and even earn 14 ECTS* credits and an official exam certificate at no cost.
*ECTS = European Credit Transfer and Accumulation System

I noticed that one of last year’s 82,000 (!) participants was Temi Mohandespour, who used to work as a research scientist at BrainChip’s now closed Perth office from March 2021 until January 2025. She has since moved to Berlin and now works for Data4life, a non-profit organisation whose mission is to digitalise health data for research (www.data4life.care/en/).

https://www.linkedin.com/posts/temi-mohandespour_here-is-a-big-thank-you-to-gtec-medical-activity-7193097495894208513-9euk?


IMG_3147.jpeg


Several of her colleagues at BrainChip gave her above “thank you” post a 👍🏻, including our CTO.

While I wasn’t able to find out anything concrete about what Temi Mohandespour may have been working on relating to BCIs during her last nine months at BrainChip post-Spring School, I happened to discover the LinkedIn profile of someone else who worked not only on one, but on two BCI projects utilising Akida - although not as an employee of BrainChip:

https://www.linkedin.com/in/hammouamri-ilyass/


IMG_3186.jpeg



Ilyass Hammouamri, who recently defended his PhD thesis at the Université de Toulouse (https://doctorat.univ-toulouse.fr/as/ed/cv.pl?mat=140961&site=EDT) and whose PhD supervisor was Timothée Masquelier (one of the four co-inventors of the JAST patent that BrainChip first licensed and later acquired), was a part-time research engineer at Neurobus between September 2024 and April 2025.

It was during that time - still under Gregor Lenz as CTO - that he “developed a Proof of Concept solution for motor imagery classification from a Dry EEG Headset using a BrainChip Akida neuromorphic chip for robotic arm control”.


“Motor imagery (MI) is a mental process in which a subject vividly imagines performing a movement without any actual physical execution. MI is widely used in BCI systems to enable control of external devices, such as a cursor on a screen or a robotic arm, through brain activity.”

https://docs.medusabci.com/kernel/1.4/tutorials.php (by the Biomedical Engineering Group at the University of Valladolid, Spain)


I wonder whether this project may have been the continuation of the BMI* project that Neurobus’s first employee, Ljubica Cimeša, had developed in collaboration with Airbus, which also used EEG signals for robotic control:

*The terms Brain-Computer-Interface (BCI) and Brain-Machine-Interface (BMI) are often used interchangeably.


https://www.linkedin.com/in/cimesa-ljubica/

463A4A79-29B4-415B-8AB0-C65A4C35219D.jpeg
D0B01094-2253-4C26-8CEA-5233B32603E7.jpeg

But his part-time contract job with Neurobus was not the first time Ilyass Hammouamri had been involved in BCI research using Akida: during his time at CNRS (Centre national de la recherche scientifique) CerCo (Centre de Recherche Cerveau et Cognition) in Toulouse, where he was a PhD candidate in Timothée Masquelier‘s NeuroAI lab from September 2021 to February 2025, he “worked on a joint project between different labs and BrainChip: Decoding speech from ECoG brain signals”.
Which means there must have been at least one more lab involved in that project, possibly more.


ECoG stands for electrocorticography. In contrast to EEG, it involves recording electrical activity directly from the surface of the brain and thus requires a craniotomy.


IMG_3022.jpeg


Here is a good illustration I found online, which happens to be from a video by g.tec medical engineering:

1762248159735.jpeg


I have no idea whether or not any of g.tec medical engineering’s products (such as wearable EEG headsets, biosignal amplifiers) were actually used for either of the two BCI projects that Ilyass Hammouamri was involved in.

What I can tell you, though, is that they list Airbus under “Happy Customers” alongside quite a few other interesting names (https://www.gtec.at/).
 
  • Like
  • Love
Reactions: 8 users