BRN Discussion Ongoing

We're getting real, real close old girl 👧 (young girl), keep your chin up where it normally is. Robots and brain chips are like bangers and mash, or toast and Vegemite, 🍓 and cream.
Vegemite? What's that crap?

You should eat the real stuff



Diogenese

Top 20
Re the US Golden Dome program, Raytheon announced a new strategic collaboration with Amazon Web Services (AWS) in December 2025 to significantly advance satellite data processing and mission control operations.

As we all know, Raytheon has partnered with BrainChip to help the US Air Force develop neuromorphic AI radar signal processing. I imagine this could be useful in future missile defence, satellite defence, or drone detection systems, but, to be clear, there is no confirmed link to the Golden Dome program as yet.

I asked ChatGPT whether drones could theoretically play a role in a Golden Dome–type architecture, acting as edge AI processors for sensors inside a system built by Raytheon and feeding data into AWS command networks. Here's what it had to say.

ChatGPT's Response

Yes — drones are actually one of the most plausible platforms where something like BrainChip Holdings' Akida processor could fit into a layered defence architecture involving primes like RTX Corporation (Raytheon) and cloud systems from Amazon Web Services.

But it helps to think of the system as three layers: the drone (edge), the defence platform, and the cloud/network.

1. The drone layer (edge AI)​

Drones often carry sensors such as:
  • small radars
  • RF receivers
  • cameras
  • acoustic sensors
These sensors generate large amounts of data. Sending all of that data back to a command centre can be inefficient or impossible in contested environments.

A low-power AI chip on the drone could:
  • classify radar or RF signals
  • detect enemy drones
  • identify missile launches or radar emitters
  • distinguish birds from drones (as in the micro-Doppler example)
Because drones have limited battery capacity, power-efficient AI hardware is very valuable here.

That’s the niche neuromorphic chips aim for.
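To make the "classify on board" idea concrete, here is a toy micro-Doppler classifier in plain Python. Everything in it is invented for illustration (the sample rate, the single-tone signal model, the 50 Hz blade cutoff): rotor blades modulate the return far faster than wing beats, so even a crude spectral peak picker can separate the two. This is a sketch of the concept only, not anything BrainChip or a neuromorphic chip actually runs.

```python
import cmath
import math

FS = 256   # sample rate of the slow-time radar return, Hz (illustrative)
N  = 256   # one second of samples

def micro_doppler_tone(f_mod):
    """Toy slow-time radar return: a single micro-Doppler line at f_mod Hz."""
    return [math.cos(2 * math.pi * f_mod * n / FS) for n in range(N)]

def dominant_freq(x):
    """Peak bin of a naive DFT magnitude spectrum (positive frequencies only)."""
    mags = [abs(sum(x[n] * cmath.exp(-2j * math.pi * k * n / N)
                    for n in range(N)))
            for k in range(N // 2)]
    return mags.index(max(mags)) * FS / N

def classify(x, blade_cutoff_hz=50.0):
    """Rotor blades flicker much faster than wings flap, so a threshold on
    the dominant micro-Doppler line is enough for this toy example."""
    return "drone" if dominant_freq(x) > blade_cutoff_hz else "bird"
```

A real system would use windowed FFTs over sliding frames and a trained classifier rather than a single threshold, but the point stands: the decision can be made on a few hundred samples, locally, before anything is transmitted.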



2. The defence platform layer (Raytheon-type systems)​

Companies like RTX build systems such as:
  • radar networks
  • missile defence platforms
  • electronic warfare systems
  • drone defence systems
If a drone detects something important (for example an incoming missile or hostile drone swarm), it could transmit its classification results back to a larger defence system.

That defence system might then:
  • track the threat
  • coordinate interceptors
  • jam enemy communications
So the drone becomes part of a distributed sensing network.



3. The cloud / command layer (AWS-type infrastructure)​


Large defence architectures increasingly use cloud-style infrastructure for:
  • aggregating sensor data
  • coordinating responses
  • AI analysis across large datasets
  • battlefield command systems
AWS and similar providers offer secure infrastructure for defence networks.

In this architecture:
  • drones collect and filter data locally
  • key information is transmitted to command networks
  • the cloud aggregates information from many sensors
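The "collect and filter data locally" step can be sketched in a few lines. This is purely illustrative (the Detection fields, the 0.8 threshold, and the "interesting" labels are all made up), but it shows why edge classification saves uplink bandwidth: most detections never leave the drone.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # classifier output, e.g. "uav", "bird", "clutter"
    confidence: float  # score in [0, 1]
    size_bytes: int    # cost of sending this report over the uplink

def uplink_filter(detections, min_conf=0.8, interesting=("uav", "missile")):
    """Edge-side triage: forward only confident, tactically relevant
    detections to the command network; drop (or log locally) the rest."""
    return [d for d in detections
            if d.confidence >= min_conf and d.label in interesting]

raw = [
    Detection("bird",    0.95, 2048),
    Detection("uav",     0.91, 2048),
    Detection("uav",     0.40, 2048),  # too uncertain to escalate
    Detection("clutter", 0.99, 2048),
]
sent = uplink_filter(raw)
saved = sum(d.size_bytes for d in raw) - sum(d.size_bytes for d in sent)
```

In this made-up example only one of four reports is transmitted; the other three (6 KB of 8 KB) never leave the platform.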


4. What this architecture might look like​

Drone sensor
↓
Edge AI chip (Akida-type)
↓
Threat classification
↓
Secure transmission
↓
Raytheon defence platform
↓
AWS cloud / command network

This type of architecture is sometimes called distributed sensing or sensor fusion networks.
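The hop sequence in the diagram above can be written as a chain of stage functions. Every name and payload here is hypothetical; the point is simply that each layer narrows the data before handing it upward.

```python
# Hypothetical staging of the diagram above; each function stands in for a
# whole subsystem, and payloads are simplified to dicts.
def drone_sensor(raw_samples):
    return {"samples": raw_samples}

def edge_ai(frame):
    # An Akida-type chip would classify on board; this stub flags a "threat"
    # whenever the toy signal energy crosses a threshold.
    energy = sum(s * s for s in frame["samples"])
    return {"threat": energy > 1.0, "energy": energy}

def secure_uplink(report):
    # Only positive classifications are transmitted off-platform.
    return report if report["threat"] else None

def defence_platform(report):
    return {"action": "track-and-intercept", **report}

def cloud_command(event):
    # The AWS-style layer aggregates events from many sensing nodes.
    return {"logged": True, **event}

def pipeline(raw_samples):
    report = secure_uplink(edge_ai(drone_sensor(raw_samples)))
    return cloud_command(defence_platform(report)) if report else None
```

Quiet scenes terminate at the uplink stage and cost the network nothing; only flagged events reach the platform and cloud layers.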



5. Why drones are attractive for this​

Drones are increasingly used as mobile sensor platforms.

They can:
  • patrol large areas
  • monitor airspace
  • track electronic signals
  • detect threats early
Instead of relying only on fixed radar stations, militaries deploy many smaller sensing nodes.

Each node might need local AI processing.



6. The important reality check​

Even though this architecture is technically logical:
  • there is no public evidence that BrainChip is currently involved in such systems
  • defence primes typically integrate many candidate technologies during evaluation phases
  • the path from demonstration to deployment can take many years
So this remains a potential application rather than a confirmed program.



✅ Bottom line

Yes — drones could theoretically use an ultra-low-power AI chip like Akida to perform onboard signal classification, threat detection, or sensor fusion, and then pass that information into larger defence systems run by companies like RTX and cloud infrastructure from AWS.

But at present this is a plausible architecture, not a confirmed deployment.
I've always liked the idea of small helium balloons for positive/neutral buoyancy. This would give them indefinite flight time. They could be tethered behind their own front line, a bit like the bird's-eye camera at the cricket, with optical fibre cable comms and mottled camo (blue-grey), or they could be propeller driven, with short-range OF cable comms and longer-range SDR.
 

Newk R

Regular
Can someone post what manny and vald have on crapper re the Elon Musk comment? I don't have access to the following:

Possibility that some of Musk's search team could/might soon catch onto Kevin's work with Akida, as they are always trying to be first out in front of the pack. $$$$$ is getting closer with Akida's rising awareness amongst the world's AI giants; only a matter of time now.
If you research Kev online you will find both Kevin and Elon.
 

manny100

Top 20
Hi Manny,

I think you’ve missed the point I was making. What I objected to was the language in your post presenting something as a fact when it simply isn’t at this point in time.

You wrote: "The fact is we are heading towards AI taking jobs and we own a share (BRN) that will be providing the robotics that will play a part in the transition."

I'm sorry, but that isn’t a fact - it’s an assumption at this point in time. None of us know yet whether BrainChip will ultimately play a meaningful role in robotics or the broader AI transition.

I will be the first person to say that this is what I hope and pray for (when I'm wearing my prayer helmet 😝) and that I am confident, with the right set of circumstances taking place, that we can get there. As you know I’ve been incredibly supportive of the company for years and have posted extensively in favour of the technology.

But the reality today is that BrainChip still has several challenges to meet before what you’re describing becomes a fact. I don't think I need to remind anyone that we are currently sitting around 13 cents, the company still has very limited revenue, and the market is waiting to see genuine commercial scale adoption.

There are also some near-term realities investors have to consider. Index rebalancing can put pressure on the share price, and the ongoing LDA facility means additional shares may come into the market. If meaningful revenue doesn’t arrive in the next year or so, the company may also need to consider raising capital again, which would dilute existing shareholders even further. The biggest risk IMO is commercial adoption and revenue and if those don’t materialise within a reasonable timeframe, the business will struggle.

None of that means the technology won’t ultimately succeed, but it does mean the path from promising technology to large-scale commercial success still has hurdles to overcome. I don't think that pointing that out makes me a “defeated holder” and I don't think it means I should consider selling my shares. It just means I'm being realistic about where the company currently sits.

You can absolutely believe in the long-term potential of AI and robotics (as I do) while still acknowledging that holding BRN shares today does not automatically mean we “own part of the robotics future”. My point was that outcome still needs to be proven through real commercial traction.

As for MegaChips, we still don't have any confirmation that they've renewed their licence or that there are any meaningful production volumes tied to their demonstrations.

So my comment wasn’t meant to be read as negativity. I wasn't trying to be a Debbie Downer, I was simply pushing back on language that suggests something speculative is a certainty. As I've said numerous times, supporting a company and being realistic about its current position are not mutually exclusive.
Hi Bravo,
I believe there will be a growing trend towards robotics and that Brainchip will play a part.
If we do not play a part in the growing robotics industry (defense and civilian/industry) then there is a huge problem with our tech.
Megachips:
" This section describes the construction of the proposed framework shown in Fig. 2. We utilized a desktop PC equipped with a GPU (Nvidia RTX3090) for updating the policies and an Akida Neural Processor SoC as a neurochip [9, 12]. The robot was controlled by the policies implemented in the neurochip. SNNs were implemented to the neurochip by a conversion executed by the MetaTF of Akida that converts the software [9, 12]. Samples were collected by the SNN policies in both the simulation tasks and the real-robot tasks since the target task is neurochip-driven robot control."
The above is from a paper written jointly by the Nara Institute of Science and Technology and Megachips, published on 23rd August 2024. Both Nara and Megachips are domiciled in Japan.
Given that Megachips began demonstrating robots in its showroom in September 2025, it would be impossible for them to reject AKIDA after August 2024, source a new chip, test and trial it, and demonstrate it 13 months later.
The paper showed the success with AKIDA through three types of evidence:
(1) the robot could be controlled entirely by the Akida neurochip,
(2) the SNN policies ran correctly under Akida’s hardware constraints, and
(3) the system worked in both simulation and real‑robot tasks.

Brainchip is also connected to Robotics via Chelpis-Mirle
" The M.2 card is based on a design from BrainChip along with an agreement to purchase a significant number of AKD1000 chips for qualification and deployment. Upon completion of this phase, Chelpis is planning to increase its commitment with additional orders for the AKD1000. "

Arquimea

Brainchip's Episode 40 interview with Arquimea reveals that water safety was just the start. They are using AKIDA for robotics (defense and security as well); it's quite clear from the interview. They were one of the first of our clients to run Gen 2 on AKIDA Cloud. No doubt they will be interested in getting their hands on the AKIDA 2500 chip when available.
A lot of robotics talk especially towards the end.
From the 22.5 minute mark, several milestones for Arquimea and robotics (with neuromorphic):
- combine several different sensors using neuromorphic processing
- improve performance by benchmarking current tech and surpassing it with a new paradigm
- of the above, the most important is to significantly increase the agility and efficiency of the system
- a mobile robot with multimodal solutions integrated with sensors, potentially developing their own sensors
- demonstrate it to prove agility and efficiency

Steve Brightfield interview:
The below is text from an interview with Steve Brightfield in late 2025. He talks about the great results Raytheon/AKIDA/US AFRL are achieving with radar, how it is applicable to robotics, and that they are talking to robotics companies.
"And I would imagine research is ongoing. What's around the corner or over the horizon?
One of the things that was interesting is we got a contract with the Air Force Research Lab to work on radar using these algorithms, right? And the results actually surprised us and they surprised the contracting agency, and now we're expanding that.
And we think that we can, you know, add capabilities to radar that weren't there before.
Like, for example, radar can detect things, right? But it can't tell you what it is. Well, we can classify objects now with radar in addition to detecting them.
We can improve the tracking and the latency of these radars. But we can also make them a lot smaller, right? So it's that size weight and power.
Can I put a radar in a robot? So when its hand has got a radar sensor in it, it can basically navigate; you can paint the scene without a camera. You can use it like a camera to paint the scene and recognize and grasp things. Or a drone: you can fly it inside tunnels or buildings indoors, you can map out where you're going. We see this shrinking of the conventional radar technologies to really go into anything moving, because it all works in the dark. And if it can replicate some of the things in vision, then, you know, you don't have to worry about rain and fog and all the issues that visual control of robots has.

Yeah. And are you working with robotic companies or is this still in the research room?
It's still in the research. We're working with companies that are creating components or solutions that go to the robotics companies. We are in active conversations with robotics companies today. And they're in evaluation of this, right? But what we decided was to create reference platforms that demonstrate these more wholly, rather than having a, you know, 'here's the algorithm, go figure it out'. We'll build a little prototype. So we're doing reference designs in radar. We're also going to do this in these wearables."

My bold above.
What Steve said, "But what we decided was to create reference platforms that demonstrate these more wholly rather than having a, you know, here's the algorithm, go figure it out", is interesting.
It basically means that robotics companies don't want to be handed a raw algorithm and told to "figure it out themselves".
Instead, those companies want complete, working examples (hardware + software + demo applications) that show how Akida can be used in real products. Obviously it's at the evaluation stage, but we are doing this.
He is explaining a strategic shift in how Brainchip supports robotics and likely other industries.
Steve is talking about supplying to the component makers of robots. The supply is the 'works', not just a chip and an algorithm.
 

manny100

Top 20
Episode 40: Arquimea Technologies - YouTube
" In this episode, BrainChip’s CMO Steve Brightfield, speaks with Miguel López, Head of Robotics at Arquimea Research Center in Tenerife, Spain. The discussion explores how Arquimea is using BrainChip’s Akida™ neuromorphic technology to develop intelligent and energy-efficient robotic systems. By combining Akida with PROPHESEE event-based cameras, Arquimea created a fully neuromorphic drone solution that can detect and classify human behavior in coastal areas to enhance water safety. The conversation also highlights how Arquimea is extending these capabilities into robotics, defense, and maritime security, while leveraging Akida 2 Cloud to accelerate global collaboration."
My bold above.
A lot of talk about robotics especially towards the end.
Defense and security get mentions as well.
There is also a transcript button if you prefer to read it.
 

IloveLamp

Top 20


Rach2512

Regular


Diogenese

Top 20
manny100 said:
[full post quoted above]

Hi Manny,

The use of the term "shrinking of the conventional radar technologies" is interesting. The size of electronically steerable radar antennas is related to the wavelength of the radar signal, so "shrinking" may imply shorter wavelengths. This is also consistent with the concept of microDoppler - the shorter the wavelength, the finer the resolution.

https://en.wikipedia.org/wiki/Mmwave_sensing

To process microDoppler signals, latency must be reduced to a minimum in the receiving processor, because you need to detect not merely the radar frequency but modulation of that frequency, hence Akida.

However, if the "target" has no moving parts, there will be no Doppler in the reflected signal, so what Steve says implies that additional information is extracted by the radar mounted in the robot's hand, which is used to "paint" the scene. Doppler is a measure of relative motion between source and target, ie either or both can be in motion as long as there is relative motion between them.

I wonder if this means that movement/oscillation of the hand produces a Doppler effect which can be analysed by Akida somewhat analogously to lidar?

I guess we'll have to wait for the patent.
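The wavelength arithmetic behind the "shorter wavelength, finer resolution" point can be sanity-checked with the standard two-way Doppler formula. The numbers below are illustrative only (a 77 GHz mmWave carrier versus a 10 GHz X-band one, and a 0.5 m/s hand/target closing speed); nothing here comes from Raytheon or BrainChip material.

```python
C = 299_792_458.0  # speed of light, m/s

def wavelength(freq_hz):
    return C / freq_hz

def doppler_shift(radial_velocity_ms, freq_hz):
    """Two-way Doppler shift for a monostatic radar: f_d = 2*v/lambda."""
    return 2.0 * radial_velocity_ms / wavelength(freq_hz)

# Same 0.5 m/s relative motion, two carrier frequencies:
shift_mmwave = doppler_shift(0.5, 77e9)   # roughly 257 Hz at ~3.9 mm wavelength
shift_xband  = doppler_shift(0.5, 10e9)   # roughly 33 Hz at ~30 mm wavelength
```

The same micro-motion produces an almost eightfold larger shift at the shorter wavelength, which is the sense in which "shrinking" the radar also sharpens micro-Doppler resolution.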
 


Rach2512

Regular
Kevin, we love you.