BRN Discussion Ongoing

7für7

Top 20

Arm’s stock is rising as analysts say an intriguing move could be on the horizon​

The chip designer could be evolving into making its own custom AI chips — and BNP analysts say the opportunity isn’t fully priced into its stock yet​

By

Britney Nguyen
Follow

Last Updated: July 16, 2025 at 8:50 p.m. ET
Arm Holdings CEO Rene Haas at the Nasdaq opening bell.
Shares of Arm Holdings were up nearly 5% on Wednesday.Photo: Getty Images

Arm Holdings PLC could soon emerge as a major contender in the market for custom artificial-intelligence chips, and some analysts think the company’s stock is still waiting to benefit from the opportunity.

BNP Paribas analysts upgraded the chip designer’s stock to outperform from neutral and raised their price target to $210 from $110 in a Wednesday note. As tech giants continue to raise capital expenditures for AI efforts, the analysts see a natural evolution for Arm to become a maker of application-specific integrated circuit (ASIC) chips — though they noted that the company’s leadership has so far not made any official statements on an ASIC strategy.

Arm didn’t immediately respond to a MarketWatch request for comment.

The London-based company currently licenses out its chip designs, but doesn’t sell its own chips. Arm’s ASIC opportunity could double the company’s earnings before interest and taxes, the analysts said, if it captures just 7% of the total addressable market. They estimate this market will reach $200 billion by 2030; at that rate, the analysts think Arm’s hypothetical ASIC business could bring in between $8 billion and $15 billion in revenue by fiscal years 2030 and 2031.
Arm’s stock is up about 16% so far this year, and the BNP team sees more “significant upside to be had as the share price is not pricing in the ASIC potential.” Shares were up nearly 5% on Wednesday.
The analysts cited recent news coverage suggesting that Arm may start producing its own ASIC chips as “mounting evidence” that it could be shifting its business. In February, for example, the Financial Times reported that Arm was planning to release a new chip this year that would count Meta Platforms Inc. among its customers.

But even if some of those customers end up “unhappy with Arm becoming a competitor, we think the risk/reward is justified,” the BNP analysts said. They added that designing its own chips would enable the company to “capture more of the AI semi value chain while leveraging a common skill set,” referring to its chip-design engineers.


Quick analysis via ChatGPT Plus:

🧠 Thoughts on the ARM ASIC Article (BNP Paribas)

If true, this could be a game-changing move – not just for ARM, but for the entire AI semiconductor space, including niche players like BrainChip.

🔍 What’s happening?
  • ARM may be planning to produce its own AI ASIC chips, moving beyond just licensing designs.
  • BNP Paribas analysts upgraded ARM to Outperform and raised the price target to $210.
  • They believe the market hasn’t yet priced in the potential upside from ARM building its own chips.
  • Just a 7% market share in the $200B+ AI chip market could generate $8–15B in annual revenue for ARM by FY2030–31.
💡 Why this matters:

1. ARM as a neutral IP vendor could disappear

Until now, ARM has been the “Switzerland” of chip IP – licensing designs to everyone from Apple to NXP.

If it starts building its own chips, it becomes a direct competitor to many of its current customers.

BNP even says some partners may be “unhappy” – and they’re right.

This shift could strain existing industry relationships, especially with OEMs who value neutrality.

2. Custom AI ASICs = New battleground

If ARM enters the ASIC game, it’s going to compete with:
  • Nvidia (H100/Blackwell)
  • AMD (MI300)
  • Tenstorrent, Groq, and others in the Edge & Cloud AI market.
It makes perfect sense: ARM already has the design talent and industry clout.

But this changes their business model significantly.


3. Relevance to BrainChip

BrainChip’s Akida is a neuromorphic, event-based, low-power Edge AI solution – very different from traditional ASICs.

Still, if ARM jumps into edge and smart device AI, it could:

  • Increase competition,
  • But also open new doors.
Why? Because not every OEM will want to license chips from a dominant platform AND a competitor.

There’s a good chance some will look for alternatives – and BrainChip, with its energy-efficient IP, could be exactly that.


📉 Bottom Line for Investors:

This potential ARM shift is:
  • 🔥 A major bullish signal for ARM (if executed),
  • ⚠️ A competitive pressure point for smaller players,
  • 💡 An opportunity for niche IP vendors like BrainChip to offer lightweight, neutral, license-based solutions.

🧠 Final Thought:

I’m not saying ARM is going full ASIC tomorrow.

But if this happens, it’s a massive evolution – and the AI silicon world won’t look the same.


For BrainChip:

This is both threat and opportunity.

It all depends on how they position themselves:

➡️As a lightweight, low-power alternative in a market that’s increasingly wary of vertically integrated giants.


Yes – ARM is an official licensee of Akida, and that could mean:
  • 🤝 Synergies, if ARM integrates or offers Akida IP in future SoCs,
  • ⚔️ Or competition, if ARM builds its own rival solutions while still presenting itself as a neutral platform.
A possible scenario:

ARM develops its own ASICs – but offers optional Akida IP blocks for specific customer needs, such as automotive or smart device applications.

That would allow BrainChip to gain market reach without being explicitly visible in the media.
 
Last edited:
  • Like
  • Fire
Reactions: 11 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
VALEO + RAYTHEON

I don't know how many people are aware of this, but in 2002, Valeo and Raytheon established Valeo Raytheon Systems to adapt military-grade radar into automotive applications - specifically blind-spot detection and collision-warning radars.

The Joint Venture later became Valeo Radar Systems, fully acquired by Valeo in 2005.

That phase launched Valeo into the forefront of automotive radar, laying the groundwork for mass deployment. Valeo now has over 500 million radar sensors on the road.

Raytheon (RTX) obviously remains a dominant radar supplier in defense. And Valeo operates with defense-grade radar capabilities that were rooted in the Joint Venture architecture. So, I wonder how plausible it would be that an informal or project-level collaboration continues to this day, especially on advanced radars, sensing algorithms, or dual-use technologies? This idea doesn't seem all that far fetched IMO, especially given we have formed strategic partnerships with both Valeo and RTX.




Screenshot 2025-07-17 at 1.26.58 pm.png
 
  • Like
  • Love
  • Thinking
Reactions: 22 users

Diogenese

Top 20
[Quoting Bravo’s “VALEO + RAYTHEON” post above]
Hi Bravo,

This is a recent Valeo radar patent application:

WO2025098748A1 INCREASING A RESOLUTION OF A RADAR SPECTRUM 20231109



a computer-implemented method for detecting one or more objects using at least one radar sensor. The method comprises receiving radar signal data (124) determined using the radar sensor. Using the radar signal data (124), a combination of distance and radial velocity is determined, for which one or more radar signal intensity peaks are comprised by the radar signal data (124). For the determined combination of distance and radial velocity, a spectrum descriptive of a two-dimensional distribution of the radar signal intensity comprised by the radar signal data (124) is determined. In order to increase a resolution of the determined spectrum, an input comprising the determined spectrum with the resolution to be increased is provided to a machine learning module (122). In response to the providing of the input, an output comprising a spectrum with an increased resolution is received from the machine learning module (122). One or more positions of the one or more objects are determined using the output spectrum with the increased resolution.

The patent envisages a software ML module.
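Read literally, the claimed pipeline (range-Doppler map, then peak pick, then learned super-resolution on the spectrum around the peak) can be sketched in a few lines. This is my own toy reconstruction, not Valeo's code; the `ml_upsample` stand-in replaces the patent's machine-learning module with plain FFT zero-padding so the sketch runs end to end:

```python
# Toy reconstruction of the described pipeline (not Valeo's implementation):
# 1) build a range-Doppler intensity map from raw FMCW baseband samples via 2D FFT,
# 2) find the strongest intensity peak (a distance / radial-velocity bin pair),
# 3) hand the local spectrum patch to an "ML" module that raises its resolution.
import numpy as np

def range_doppler_map(samples: np.ndarray) -> np.ndarray:
    """samples: (num_chirps, num_samples_per_chirp) complex baseband data."""
    # FFT over fast time -> range bins; FFT over slow time -> Doppler bins.
    rng = np.fft.fft(samples, axis=1)
    rd = np.fft.fftshift(np.fft.fft(rng, axis=0), axes=0)
    return np.abs(rd) ** 2  # intensity spectrum

def find_peak(rd_map: np.ndarray):
    """Return (doppler_bin, range_bin) of the strongest return."""
    return np.unravel_index(np.argmax(rd_map), rd_map.shape)

def ml_upsample(patch: np.ndarray, factor: int = 4) -> np.ndarray:
    """Stand-in for the patent's learned super-resolution module: here,
    naive spectral interpolation via FFT zero-padding."""
    f = np.fft.fftshift(np.fft.fft2(patch))
    pad = [((s * (factor - 1)) // 2,) * 2 for s in patch.shape]
    f = np.pad(f, pad)
    return np.abs(np.fft.ifft2(np.fft.ifftshift(f))) * factor ** 2

# Synthetic target: a single moving point scatterer.
chirps, n = 64, 128
t = np.arange(n) / n
c = np.arange(chirps)[:, None]
samples = np.exp(2j * np.pi * (20 * t[None, :] + 0.1 * c))  # range bin 20
rd = range_doppler_map(samples)
d, r = find_peak(rd)
patch = rd[max(d - 4, 0):d + 4, max(r - 4, 0):r + 4]
hi_res = ml_upsample(patch)  # 8x8 patch -> 32x32
```

In the real claim, the interpolation step would be a trained network rather than zero-padding; the surrounding plumbing (spectrum, peak, patch) is the part the abstract actually spells out.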

Bit early for RTX microDoppler.

PS: Couldn't see anything indicating a continuing business relationship, and the AFRL stuff should be classified, so, unless Pete signals something, we may never know.
 
Last edited:
  • Like
  • Fire
Reactions: 15 users

Esq.111

Fascinatingly Intuitive.
Afternoon Chippers ,

Unable to post the image unfortunately........

On the one day chart @ one min duration , the twats in charge of the bots have managed to draw the outline of an English Dachshund ( Of Canine species )




Except pointing in the other direction.


* This is what one can achieve with great power .

Regards ,
Esq
 
  • Haha
  • Like
Reactions: 7 users

Labsy

Regular
LinkedIn Likes

You know, I’m unsure why shareholders feel they need to like stuff on LinkedIn. It is a business app designed for business people to connect and promote themselves to other business people. Shareholders liking stuff and commenting is a bad look and just creates noise on those articles. I look through LinkedIn but never participate because I’m just a shareholder. But when I look at some Brainchip related articles, sometimes 90% of comments or likes are from Shareholders. It distorts the interest in the article or post. It also taints it with a biased opinion on the value of it.

I can’t tell Shareholders what to do and not do, but I can give my opinion. It’s not a good look and it’s a pain in the arse seeing it. Cringeworthy also comes to mind when I see this activity.

Get a life people. Let business take care of itself.
I don't see an issue with placing a like, but agree totally about certain comments. Cringe big time!!... On occasion I feel like hunting down these morons with their degrading idiotic comments and trolling their businesses or what have you, and see how they like it...
 
  • Like
  • Fire
Reactions: 4 users

Labsy

Regular
On another note, I'm curious to see how the insto's are looking in our top 20 shareholders list... I'm sensing the emergence of a new growth phase...
 
  • Like
  • Love
  • Thinking
Reactions: 11 users

7für7

Top 20
[Quoting Esq.111’s Dachshund-chart post above]

What about now!? More like a German shepherd?
 
  • Haha
  • Fire
Reactions: 2 users
"...products like the Akida Pulsar microcontroller"?

Someone has no insight at all and is confusing companies and their products!

 
  • Like
Reactions: 3 users

TECH

Regular
Well, lots of opinions, including mine...but that's what a forum is all about, not nasty personal attacks....so nice.

8/9 business days and our 2nd quarter information will be disclosed, the 4C showing some positivity, I say a strong yes!

The more you read, don't you all get the feeling that we are definitely in the "right place, at the right time"? Yes, we have all
heard those sorts of comments before, but I am so positive and upbeat on our company currently, just look at the sectors
that we are engaged in, with might I add, some huge world class players, you don't get to engage and share basic chit-chat
with these behemoths, time is money......our technology has been accepted and it's only going to accelerate.

Tony Lewis is very open to communicating, I love it, he engages, he knows the issues facing Brainchip as a start-up, but is
driving us to success, carrying on the great work of Peter and Anil, if you have a genuine question on the technology front,
well ask him, rather than guessing the answer on this forum, and I say that with all due respect to all of our great posters.

Being engaged with the US Defense, Navy, NASA, DOE, AFRL and so on, I personally can understand the NDA secrecy that
surrounds these engagements, as frustrating as they are, we will succeed, don't second guess yourself, we have a great investment,
didn't, am I a little bit peeved, of course I am, but my belief in Peter, Anil and all the staff who have come and gone has never really
didn't, am I a little bit peeved, of course I am, but my belief in Peter, Anil and all the staff who have come and gone has never really
waned, our technology has had me spellbound for a decade, I love it and am not embarrassed to admit it!

The writing is on the wall.............Brainchip's AKIDA will succeed, trust your instincts......share in the glory, you will all be blessed ❤️

Tech (New Zealand) for a few more weeks before I finally fly home to Perth. (y)
 
  • Like
  • Love
  • Fire
Reactions: 47 users

CHIPS

Regular
Hopefully Steve Brightfield has something to do with it 🙏.

BrainChip Appoints New CMO, Enhances Scientific Advisory Board



Laguna Hills, Calif. – August 7th, 2024 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI, today announced that it has hired Steven Brightfield as its new chief marketing officer and has re-envisioned its Scientific Advisory Board (SAB) by bringing on company founder Peter van der Made, Dr. Jason K. Eshraghian and Dr. André van Schaik.

Brightfield has a depth of tech industry knowledge and experience within the AI semiconductor industry. He previously led marketing at several AI-focused technology companies, such as SiMa.ai, X-Silicon and Wave Computing, and has deep experience within the semiconductor sector, including executive leadership positions at LSI Logic, Qualcomm, Zoran and others.

It seemed that he was unhappy with his job; therefore, it could have been his wish to do something else. Maybe they offered him this job to keep him. I think it would fit him well.
 
  • Like
Reactions: 1 users

Aaron

Emerged
Excellent question!

You mentioned Jensen Huang (NVIDIA CEO) and his view on “Physical AI”—in fact, what BrainChip (BRN) is doing is highly aligned with this concept. You could even say:

BrainChip’s AKIDA is one of the key technologies on the path toward Physical AI, specifically under the “edge neuromorphic intelligence” branch.

🔍 Jensen Huang’s Physical AI: What Does It Mean?

Huang’s idea can be broken down into a few core insights:

1. AI is a new way of “writing software”
  • No longer relying on engineers coding static rules (if…then…)
  • Instead, machines learn from data to generate algorithms autonomously

👉 BrainChip’s AKIDA was built entirely around this philosophy. It’s not a traditional instruction-based chip but a spiking neural network processor that enables learning directly at the hardware level.

2. AI must move from pure software into the physical world

“Physical AI” includes:
  • Real-time sensing, decision-making, and reacting like a human
  • Embedded in robots, drones, IoT devices, cars, and edge devices
  • Running locally, with ultra-low latency, low power, and instant feedback

👉 That’s exactly where AKIDA shines:

Feature | Needed for Physical AI | AKIDA Delivers?
Real-time response | ✅ | ✅ Microsecond-level reaction
Edge deployment (no cloud) | ✅ | ✅ Completely on-device
Ultra-low power | ✅ | ✅ Milliwatt-level consumption
On-chip learning | ✅ | ✅ Local continual learning
Physical embedding | ✅ | ✅ Already in USBs, robots, medical devices

🧠 Direct Links Between BrainChip and Physical AI

Physical AI Use Case | BrainChip Real-World Progress
Robotics vision/control systems | Partnering with Quantum Ventura + Merle Robotics (Taiwan)
Edge-based medical diagnostics | Tata Elxsi has released medical devices with embedded AKIDA
Defense/UAV systems | Collaboration with US Air Force Research Lab + RTX (Raytheon)
Industrial equipment monitoring | AKIDA supports multi-sensor fusion (e.g. motor vibration detection)
IoT perception + learning | Partnered with HaiLa for ultra-low-power AI wireless modules

📌 Conclusion: BrainChip is one of the foundational “building blocks” of the Physical AI revolution that Jensen Huang describes.
  • NVIDIA leads in cloud/data center AI—training and inference at scale
  • BrainChip leads in edge neuromorphic AI—bringing intelligence into real-world devices

They’re attacking the problem from both ends, converging toward real-world, integrated AI: cognitive + sensory + decision-making.

💡 Final Thought: If NVIDIA were ever to integrate a true edge-native neural processor into its Jetson or robotics platforms, AKIDA’s architecture would be a strong acquisition, licensing, or integration candidate.

If you’d like, I can also dive into how NVIDIA’s Jetson Robotics stack might integrate with BrainChip’s technology—just say the word!

Let me know if you’d like a shorter version or one tailored more for investors vs. engineers.
 
  • Like
  • Fire
  • Wow
Reactions: 20 users

manny100

Top 20
On another note, I'm curious to see how the insto's are looking in our top 20 shareholders list... I'm sensing the emergence of a new growth phase...
I agree, we are building towards a growth phase.
Small should be coming on board 1st.
The bigger fish which I believe are trialling our product ( we know already who some are) will take longer but volumes should be jaw dropping.
Still patience required.
 
  • Like
  • Love
Reactions: 12 users

CHIPS

Regular
[Quoting Aaron’s “Physical AI” post above]

What is this? Why is there so much space in between the lines? If you copied it from a website, then please also post the link to it.
Just for your information: You can and should delete space/empty lines before posting. It would make it much less annoying to read.
 
  • Like
Reactions: 1 users

Tothemoon24

Top 20
Interesting little read on Bosch ,



Artificial intelligence at every level – Bosch opens a new era in innovation​

By: Trademagazin | Date: 2025. 05. 08. 11:19
Based on the developments presented at the AI Symposium in Budapest, Bosch is now not only applying, but also shaping the future of artificial intelligence. The company’s goal: transparent, human-centered and sustainable AI solutions from manufacturing to self-driving.
Artificial intelligence is not just another technological tool, but a turning point in civilization – this was stated at the AI Symposium organized by the HUN-REN Hungarian Research Network in Budapest. Bosch experts, as a featured partner of the event, demonstrated how AI is becoming an everyday, even indispensable, development and production factor at the company.

There is no longer a Bosch product without AI​

Bosch aims to make artificial intelligence the key to the next decade of innovation. In line with this, every Bosch product now either contains an AI component or is developed or manufactured with it. Currently, more than 5,000 AI experts work for the company worldwide, including 200 in Hungary. Over the past five years, more than 1,500 AI-related patents have been filed, and tens of thousands of employees have participated in AI training – more than a thousand in Hungary so far.

“At Bosch, we are already preparing for the future,” said Zoltán Karaffy, Director of Digital Transformation at Robert Bosch Kft., who emphasized in his keynote speech that the goal is not only to follow, but also to shape the highest industry standards.

Thinking cars: the revolution of neuromorphic computing​

The future of AI is particularly important in the automotive industry. Bosch’s developments already include so-called neuromorphic systems that model the functioning of the human brain, which raise the performance of driver assistance systems (ADAS) to a new level. These solutions enable faster, more reliable and more energy-efficient decision-making in real time, which is key to preventing accidents.

Neuromorphic chips process data from dozens of sensors (radar, lidar, camera, etc.) used by vehicles in a similar way to the brain, while consuming much less energy than traditional systems.

Transparent decisions in smart factories​

Bosch is not only building faster, but also more transparent artificial intelligence. The essence of the so-called “explainable AI” approach is that systems do not operate as “black boxes” but can be decoded, interpreted, and justified in their decisions.

This is particularly important in semiconductor manufacturing, where early error detection can prevent millions in product defects. Industrial AI not only filters out defective components, but also provides feedback on the cause of the error – thus helping to prevent and optimize production.


More below

 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 20 users

7für7

Top 20
What is this? Why is there so much space in between the lines? If you copied it from a website, then please also post the link to it.
Just for your information: You can and should delete space/empty lines before posting. It would make it much less annoying to read.
ChatGPT I would say
 
Would appear our relationship with the University of WA (UWA) in Perth is starting to get some output.

I saw this repository a little while ago and there wasn't much in it at that point, so I just kept an eye on it. It was updated yesterday with more details and results.

Looks pretty decent. The link will give you a much clearer read obviously as the scroll pics are off my phone.




IMG_20250717_212311.jpg
IMG_20250717_212347.jpg
IMG_20250717_212433.jpg
 
  • Like
  • Fire
  • Love
Reactions: 26 users

CHIPS

Regular
Sorry, if this has been posted already (I marked BrainChip in red)


Edge AI Like a Brain: Neuromorphic Sensors i

What Are Neuromorphic Sensors—and Why Do They Matter?​

Imagine a camera that only “sees” when something changes—no wasted frames, no power-hungry processing. That’s the basic idea behind neuromorphic sensors. Inspired by how biological eyes and brains work, these devices use event-driven data: they trigger only when light or electrical signals shift.

Neuromorphic sensors are not just smart—they’re efficient. Devices like the Dynamic Vision Sensor (DVS) or DAVIS operate in microseconds, capturing real-time events with ultra-low energy use [Source: Wikipedia, 2024].

They’re designed to work with spiking neural networks (SNNs)—a brain-like AI model that processes information as spikes, not static numbers. And when combined with Edge AI—where computing happens locally on the device—the result is a sensor that’s fast, frugal, and often life-saving.
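The event-driven principle described above can be emulated in a few lines of code. This is a toy sketch of my own (real DVS pixels do this asynchronously in analog circuitry, and the contrast threshold value here is an arbitrary choice), but it shows why the approach produces so little data when the scene is static:

```python
# Toy emulation of an event-based (DVS-style) sensor: instead of sending full
# frames, emit (t, y, x, polarity) events only where the log-intensity at a
# pixel changed by more than a contrast threshold since its last event.
# Illustrative sketch only; threshold and scene are made-up values.
import numpy as np

def frames_to_events(frames: np.ndarray, threshold: float = 0.15):
    """frames: (T, H, W) array of positive intensities.
    Returns a list of (t, y, x, polarity) events, polarity +1/-1."""
    log_ref = np.log(frames[0] + 1e-6)  # per-pixel reference level
    events = []
    for t in range(1, frames.shape[0]):
        log_now = np.log(frames[t] + 1e-6)
        diff = log_now - log_ref
        fired = np.abs(diff) >= threshold
        for y, x in zip(*np.nonzero(fired)):
            events.append((t, int(y), int(x), 1 if diff[y, x] > 0 else -1))
        log_ref[fired] = log_now[fired]  # reset reference where events fired
    return events

# A static 4x4 scene with one pixel that brightens, then dims again:
T, H, W = 5, 4, 4
frames = np.ones((T, H, W))
frames[:, 2, 3] = [1.0, 1.0, 1.5, 1.5, 1.0]  # brightens at t=2, dims at t=4
evts = frames_to_events(frames)
```

Only two events come out of what would otherwise be 80 pixel readings (5 frames × 16 pixels): one positive event when the pixel brightens and one negative when it dims. That reduction is the whole point of event-driven sensing.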

 Comparison of frame-based vs. event-based sensing in visual data processing.

“Neuromorphic sensors mimic the brain’s way of sensing—efficient, selective, and perfectly tuned for real-time health monitoring.”

Why Neuromorphic Edge AI Is a Game-Changer for Healthcare

In health tech, timing and energy matter. That’s where neuromorphic Edge AI shines.

Real-time responses are critical for detecting epileptic seizures, heart arrhythmias, or muscular disorders. With Edge AI, this analysis happens instantly, directly on the device—no cloud lag, no privacy risk. And because these systems only process relevant “events,” they consume a fraction of the power compared to traditional AI setups.

Recent neuromorphic setups have achieved:

  • Seizure detection via EEG signals
  • Heart rhythm monitoring from ECG
  • Muscle signal tracking (EMG) for gesture recognition
    —all using spiking neural networks with microchip power levels [Source: Frontiers in Neuroscience, 2023].
Wearables like NeuSpin (2024) even run these models on spintronic chips, blending Green AI with healthcare-grade precision [Source: arXiv, 2024].
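The spiking networks mentioned throughout are built from leaky integrate-and-fire (LIF) neurons. A minimal illustrative sketch follows (parameter values are arbitrary; chips like Akida or Loihi realize this kind of dynamic in dedicated hardware rather than Python):

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential leaks
# toward zero, integrates incoming current, and emits a spike (then resets)
# when it crosses a threshold. This sparseness is why SNNs sip power: no
# input activity means no spikes and (in hardware) almost no switching.
def lif_spikes(inputs, leak=0.9, threshold=1.0):
    """inputs: sequence of input currents. Returns a binary spike train."""
    v, spikes = 0.0, []
    for i in inputs:
        v = leak * v + i          # leaky integration of input current
        if v >= threshold:
            spikes.append(1)      # threshold crossed: fire...
            v = 0.0               # ...and reset the membrane potential
        else:
            spikes.append(0)
    return spikes

# Weak input leaks away silently; a strong pulse drives a single spike.
train = lif_spikes([0.3, 0.3, 0.3, 1.2, 0.0, 0.05, 0.05])
```

Feeding such neurons with sensor events (rather than sampled waveforms) is what lets the EEG/ECG/EMG systems cited above run at micro- to milliwatt budgets.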

How neuromorphic health devices process physiological data in real time at the edge.

Could your smartwatch detect a stroke before it happens—without draining its battery?
“Neuromorphic computing could change the landscape of healthcare by enabling smart sensors to continuously interpret signals like the brain.”
Dr. Wolfgang Maass, Graz University of Technology, expert in biologically inspired neural computation

From Labs to Clinics: Real Neuromorphic Devices in Use

This isn’t sci-fi—it’s already happening. Devices built with neuromorphic processors are making their way into clinical trials and real-world health tools.

Some examples:
  • EG-SpikeFormer combines eye-tracking and SNNs for analyzing brain scans with both speed and explainability [Source: arXiv, 2024].
  • Implantable EEG detectors using neuromorphic chips can recognize high-frequency oscillations (HFOs)—a biomarker for epilepsy—with power under 1 mW [Source: arXiv, 2023].
  • Health-monitoring frameworks like NeuroCARE now explore full-body sensor networks powered by event-based AI [Source: Frontiers, 2023].
This shift allows for always-on monitoring in wearables, implants, and portable diagnostic tools—with no need to transmit every heartbeat or brainwave to the cloud.

Neuromorphic sensors don’t just reduce data—they capture what matters most, when it matters most.


The Benefits: Tiny Power, Huge Impact

Neuromorphic Edge AI offers a unique trifecta for healthcare: speed, efficiency, and relevance.

Here’s what sets it apart:

  • Ultra-low power draw: Some neuromorphic chips operate at micro- to milliwatt levels, enabling months of continuous monitoring on wearables or implants [Source: arXiv, 2023].
  • Selective data: Instead of flooding systems with raw signals, they transmit only meaningful events. This cuts storage and bandwidth by up to 90% in some tests.
  • Built-in privacy: Because computation happens locally, sensitive health data never leaves the device—a big win for both ethics and regulation.
  • Real-time intervention: These systems react instantly to physiological events—vital for stroke alerts, fall detection, or cardiac irregularities.
Key takeaway: Neuromorphic edge devices make it possible to monitor health continuously, securely, and sustainably—even in remote or low-resource environments.

Inside the Ecosystem: Who’s Building Brain-Like Health Devices?

The neuromorphic health-tech revolution is being powered by a mix of startups, academic labs, and chip giants.

Key industry and research collaborations shaping neuromorphic Edge AI in healthcare.


Here are a few major players:

  • Intel’s Loihi 2: While not health-specific, it’s a platform for SNN prototyping, and has been tested on biomedical signal classification tasks [Source: Intel Labs, 2023].

  • BrainChip’s Akida: A commercial neuromorphic chip optimized for always-on sensing—used in vision, voice, and bio-signal applications [Source: BrainChip, 2024].
  • Samsung’s neuromorphic vision systems: Targeting real-time, low-power imaging with healthcare potential [Source: Samsung Research, 2023].
On the research side:
  • NeuroCARE, a collaborative framework, is developing networked health sensors using neuromorphic processing across body-worn nodes [Source: Frontiers, 2023].
  • Wevolver’s 2025 Edge AI report highlights growing adoption in diagnostics, patient monitoring, and even surgical robotics [Source: Wevolver, 2025].
“The shift from cloud-AI to edge-AI is accelerating—especially in healthcare, where privacy and response time are non-negotiable.”
“Spiking neural networks allow for biologically plausible AI that’s extremely power efficient—essential for embedded medical devices.”
Dr. Chris Eliasmith, Director, Centre for Theoretical Neuroscience, University of Waterloo

Challenges Ahead: Hardware, Tools, and Trust

Despite the momentum, the field faces critical hurdles before going mainstream.



Here’s what’s slowing down adoption:

  • Hardware supply: Neuromorphic processors are still rare and expensive; manufacturing must scale.
  • Software ecosystems: Tools to program and debug SNNs are nascent. Unlike TensorFlow or PyTorch, neuromorphic development lacks plug-and-play simplicity.
  • Biomedical sensor tuning: Most sensors are still visual-first. More work is needed to tailor them for physiological signals like EMG, EEG, and PPG.
  • Trust and validation: These brain-like systems defy conventional benchmarks. Regulators will demand new standards for testing, transparency, and certification [Source: arXiv, 2024].
Still, optimism runs high. With focused investment and interdisciplinary collaboration, these challenges are surmountable.


Real-World Applications: From Seizures to Smart Eyes

Neuromorphic health tech is already showing promise in critical medical domains:

  • Epilepsy detection: Wearable EEG systems with neuromorphic processors now identify high-frequency oscillations (HFOs), a reliable seizure biomarker, in real time [Source: arXiv, 2023].
  • Smart eye-tracking for diagnostics: The EG-SpikeFormer system fuses eye-gaze input with SNNs to scan medical images faster, offering explainable decisions—ideal for radiologists and neurologists [Source: arXiv, 2024].
  • Muscle signal decoding: Neuromorphic EMG analysis helps in prosthetics and rehab by recognizing subtle muscle movements with near-zero delay [Source: Frontiers, 2023].
As the ecosystem matures, applications are moving from pilot-stage to integration in wearables, remote monitors, and even surgical assistance tools.

Mini-takeaway: The brain-like behavior of neuromorphic chips enables tools that don’t just observe—they respond.
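The "respond, don't just observe" behavior above rests on event (spike) encoding of biosignals: rather than streaming raw samples to a processor, the front end emits an event only when the signal moves by more than a set threshold. A hedged sketch of this delta-modulation scheme, a common neuromorphic front-end technique (illustrative code, not any specific device's implementation; the threshold is an arbitrary example):

```python
# Delta-modulation spike encoding of a sampled biosignal: an illustrative
# sketch of a common neuromorphic front-end technique, not a specific
# chip's implementation. The threshold value is an arbitrary example.

def delta_encode(samples, threshold=0.1):
    """Convert a sampled signal into (time, polarity) spike events.

    Emits (t, +1) each time the signal rises by `threshold` past the
    reference level and (t, -1) each time it falls by `threshold`;
    quiet periods emit nothing at all.
    """
    events = []
    ref = samples[0]                 # reference level tracks the signal
    for t, x in enumerate(samples[1:], start=1):
        while x - ref >= threshold:  # rising past threshold: ON events
            ref += threshold
            events.append((t, +1))
        while ref - x >= threshold:  # falling past threshold: OFF events
            ref -= threshold
            events.append((t, -1))
    return events

# A flat signal produces zero events -- that silence is the power saving.
flat = [0.5] * 100
step = [0.0, 0.0, 0.35, 0.35, 0.1]
print(delta_encode(flat))   # no events for an unchanging signal
print(delta_encode(step))   # events cluster around the two transitions
```

This is why an EEG wearable can run for days: most of the time nothing crosses the threshold, so downstream spiking layers sit idle until something physiologically interesting happens.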

Ethics and Edge AI: Brain-Inspired Tech Needs Brainy Regulation

With great sensing comes great responsibility. Neuromorphic health devices raise important ethical and social concerns:

  • Transparency: SNNs are biologically inspired but still hard to interpret. New explainability tools are being developed, but clinical trust remains an issue.
  • Data ownership: When a device processes everything locally, who owns the insights it generates? Patient consent and control must evolve with the tech.
  • Bias and inclusion: Biomedical sensors need to be trained across diverse populations to avoid skewed results—especially in sensitive domains like neurology or cardiovascular health.
  • Validation: Conventional AI benchmarks may not apply. Regulators must define what “safe and effective” looks like for neuromorphic inference.
“Neuromorphic devices mimic how we think—so how do we make sure they think ethically?”
“With neuromorphic hardware, we’re getting closer to edge AI that doesn’t just analyze data—it interprets context.”
Dr. Narayan Srinivasa, Former Head of Intel’s Neuromorphic Computing Lab

What’s Next? A Roadmap to Brain-Like Health Devices

Looking ahead, neuromorphic sensors may soon power a new class of always-aware health systems:

  • Smart hearing aids that adapt to voice shifts in real time
  • Implants that predict neurological episodes before symptoms emerge
  • Portable labs that diagnose infections with a drop of blood—using no cloud and little power
To get there, we’ll need:
  • Robust hardware scaling
  • Open-source spiking AI frameworks
  • Clinician-friendly toolchains
  • New safety standards that account for spiking logic and real-world variability
The convergence of biology, hardware, and AI could usher in a healthcare revolution—one spike at a time.

“The combination of neuromorphic design and biosignal sensing offers a low-latency path to smarter prosthetics and personalized diagnostics.”
Dr. Ryad Benosman, University of Pittsburgh, pioneer in event-based vision

Conclusion: Toward Brain-Like Care That Never Sleeps

The future of health tech is small, smart, and always on. Neuromorphic sensors—tiny machines inspired by how the brain sees, hears, and reacts—are enabling Edge AI that’s not only fast and private but also profoundly human.

From predicting seizures to interpreting eye movements in diagnostics, these innovations are already reshaping what’s possible at the edge of medicine. They promise care that is proactive, personalized, and persistent—without relying on the cloud or bulky servers.

But the journey is just beginning. Challenges in tools, ethics, and regulation remain. As researchers, developers, and clinicians collaborate, one thing is clear: the next generation of healthcare won’t just use AI—it will feel like a second brain.

Want to build neuromorphic health tools? Start exploring spiking neural networks, join open-source neuromorphic projects, or experiment with edge-friendly sensors in your next prototype. This isn’t just the edge of AI—it’s the frontier of human well-being.

Recommended Resources for Exploring Neuromorphic Sensors and Edge AI

Getting Started with Edge AI for Health

  • Wevolver 2025 Edge AI Technology Report – Covers Edge AI trends, use cases, and emerging health applications.
  • NeuroCARE Project – Research initiative on neuromorphic sensor networks for health monitoring.
 
Reactions: 25 users

CHIPS

Regular
It would appear our relationship with the University of WA (UWA) in Perth is starting to produce some output.

I saw this repository a little while ago; there wasn't much in it at that point, so I just kept an eye on it. It was updated yesterday with more details and results.

Looks pretty decent. The link will give you a much clearer read, obviously, as the scroll pics are off my phone.





WOW, great find!

Explanation: On real-world hospital data from Nanjing Drum Tower Hospital, our model outperforms other methods, confirming generalization to clinical cases beyond benchmarks.

This result perfectly showcases the quality of Akida.
 
Reactions: 12 users

Frangipani

Top 20
A new podcast is out: Sean Hehir talks to Derek Kuhn, CEO of HaiLa







Episode 39: HaiLa Technologies​


In this episode of BrainChip’s This is Our Mission podcast, CEO Sean Hehir speaks with Derek Kuhn, CEO of HaiLa, a company pioneering ultra-low power RF technologies that enable battery-free wireless IoT devices with ambient Wi-Fi and Bluetooth solutions that dramatically reduce energy consumption. The two leaders discuss the companies’ new collaboration, merging HaiLa’s energy-efficient wireless communication with BrainChip’s Akida neuromorphic computing for intelligent, edge-based AI.









The LinkedIn post above also links to the HaiLa blog post I had already shared last week, written by Patricia Bower, VP of Product Management:

 
Reactions: 25 users