BRN Discussion Ongoing

Gies

Regular
Interesting.
Partners are showing their progress with AKIDA.
That would be the proof of long-standing speculation.

 
  • Like
  • Fire
Reactions: 18 users

manny100

Top 20
Brainchip = right place, right time. " Chips as the new oil"
Interesting read.
Talks about the military's insatiable demand for advanced chips,
Taiwan's dominance, and why it's a ticking time bomb.
 
  • Like
Reactions: 9 users

7für7

Top 20
CES BrainChip Website Ad

 
  • Like
Reactions: 10 users

Frangipani

Top 20
Also, at the 56-minute mark Steve talks about a sponsored university race using drones fitted with Akida, but that has not gone public yet.

Exciting times ahead.

Well, BrainChip may not yet have gone public about said sponsored drone competition, but hey, no one should underestimate the sleuthing skills of us TSE forum members… 🕵️‍♀️ 🕵️‍♂️ 😄
(Unfortunately those investigative skills do not extend to predictions as to when we will finally see meaningful revenue and the BRN share price sustainably heading North…)

Should anyone be keen to find out more about the Raytheon Autonomous Vehicle Competition (AVC) prior to the public reveal by our company (at least I believe that is the contest our CMO was alluding to), let me refer you to two posts I wrote last month:

Care to take your mind off the cap raise for a moment and read about a capstone project instead?

Raytheon has been sponsoring an annual engineering contest between universities - held across 4 different regions since 2023/2024* - called the RTX Autonomous Vehicle Competition (AVC).
* for the first couple of years it was held in the Texas Region only

While the competition rules change every year, the underlying concept remains the same: multi-disciplinary student teams are challenged to construct two autonomous vehicles on a predefined budget (US$5,000 during the last round) - either two aerial vehicles (UAVs), or one aerial (UAV) and one ground vehicle (UGV) - which must communicate with each other and complete tasks modelled on real-world challenges (e.g. a search and rescue mission) without any human intervention. This requires collaboration between students across different university departments such as Mechanical and Aerospace Engineering, Electrical and Computer Engineering, and Computer Science. So sustained team effort over roughly a year is a must, as is project management. For many of those students, participation in the AVC is actually their Senior Capstone Project.

For competition sponsor Raytheon, collaborating with universities around the company’s major hubs is a great way to find potential new employees. Multiple Raytheon staff mentor the competing teams of students (most of whom are in their senior year) throughout the project’s duration. Take Sylvia Traxler, for example, whom you may remember as a visitor to the BrainChip CES 2025 suite alongside two of her colleagues (it was the BrainChip LinkedIn post with their group photo that got deleted shortly after posting). An alumna of the University of South Florida (she is currently also pursuing a Master’s in Artificial Intelligence at The University of Texas at Austin), she assisted her alma mater’s 2024/2025 student team as a mentor and saw them win 1st place at the East Coast AVC in April.



View attachment 92925


California-based James Cooper is another example of a Raytheon AVC university mentor, in his case assisting California State University Long Beach:

https://www.linkedin.com/posts/mobtek_2025-raytheon-west-coast-autonomous-vehicle-activity-7373316123850657792-MV4b

View attachment 92926


The 2023/2024 Raytheon AVC was basically a game of tag, in which the unmanned aerial vehicles had to detect and track a target - namely their rivals’ ground vehicles - and deliver a 20 ml water blast to activate a moisture sensor on the competing universities’ ground vehicles, while avoiding showering their own UGV.

All ground vehicles had ArUco* markers on top, which helped the drones’ computer vision systems to identify them correctly.
* https://www.geeksforgeeks.org/computer-vision/detecting-aruco-markers-with-opencv-and-python-1/


The 2024/25 Raytheon AVC (“Mission Full Send!”) had student teams imagine a scenario of delivering aid to an injured soldier on a battlefield: a Scout UAV first maps an area to search for the wounded person and relays the coordinates of the detected landing zone - denoted by a specific ArUco marker - directly to the second vehicle (UAV or UGV), which then has to deliver a first aid kit to the specified area. All done autonomously, without a human in the loop.
(cf. https://www.gmu.edu/news/2025-05/dr...-raytheon-autonomous-vehicle-competition-2025)

Here is a video about the 2025 West Coast finals that took place in June.



View attachment 92928


It appears another team from California Polytechnic (Cal Poly) State University San Luis Obispo has been participating in the latest Raytheon AVC challenge that seems to have been dubbed "Operation Touchdown", and that one member of the current student team, Computer Engineering Senior Gianni Schiappa (who was an Operations Systems intern with Raytheon this summer) used Akida for the fully autonomous drone system he developed:


View attachment 92927


View attachment 92929


View attachment 92930



View attachment 92931



From what Gianni Schiappa writes on LinkedIn, it almost sounds as if the “Raytheon Autonomous Drone and Rover” project were already done and dusted, although Raytheon’s AVC is conceptualised as a two-semester project, culminating in an intercollegiate competition in (Northern hemisphere) spring. Unfortunately, he did not specify a time frame for the project. All I can say is that he was not part of the Cal Poly team that competed in the 2024/2025 AVC West Coast finals in June of this year (and neither was Akida used in their UAV, see https://digitalcommons.calpoly.edu/cgi/viewcontent.cgi?article=1933&context=mesp).

However, the accompanying photo he posted - a screenshot taken less than a month ago, possibly a captured still image of a video - appears to depict a competition, not just training. Maybe some fall semester qualifying contest?
Or is it a recent screenshot of a much older photo or video?

Anyway, nice to see Akida being used in more and more university student projects these days. After all, those young engineers will be tomorrow’s workforce and will already have gained first-hand experience in implementing neuromorphic technology by the time they enter the job market as graduates.

But it’s now up to our management to sign deals and make meaningful revenue to enable those future researchers to continue to work with our products in the years to come.


Last week, I shared my findings about the annual Raytheon Autonomous Vehicle Competition (AVC) and drew attention to the LinkedIn profile of Gianni Schiappa, a fourth-year undergrad from Cal Poly San Luis Obispo, who has developed a fully autonomous drone system (into which he integrated Akida) that could successfully execute “autonomous landing on a moving platform.”

Further evidence* has now fully convinced me that Gianni Schiappa and the rest of his multi-disciplinary Cal Poly SLO team are indeed competing in the 2025/26 intercollegiate AVC that kicked off a few weeks ago, even though he didn’t specify any time frame for the “Raytheon Autonomous Drone and Rover - Operation Touchdown” project.
*See the first project description below referring to this season’s task of “engineering an autonomous system enabling a drone to detect, track, and land on a moving UGV”.

Turns out the Cal Poly engineering students are not the only ones cooking with the ingredient that is sometimes referred to as our secret sauce - at least one other uni team does, too: the one from UC Santa Barbara (UCSB)!

A UCSB Senior Capstone Project Team actually happened to win the 2024/25 AVC West Coast Region finals earlier this year (without the help of Akida at the time, though**), so the new cohort must surely be very keen on defending their school’s title.

While I don’t know whether all of the university teams competing in the ongoing AVC round were possibly advised by their Raytheon mentors to use Akida or whether it was their own idea to try out neuromorphic computing for computer vision tasks onboard the autonomous drone they have been challenged to design and build, it is definitely encouraging to see more and more students from universities other than those that have joined the BrainChip University AI Accelerator Program getting hands-on experience with Akida.


Meet some of the aspiring engineers that make up the UCSB team participating in the 2025/26 Raytheon AVC: Electrical Engineering Seniors Steven De Jesus, Jiahuan (Henry) Zheng and Aneesh Thakkar.



View attachment 93152


View attachment 93153



View attachment 93154 View attachment 93155



View attachment 93156

View attachment 93157


One of the people mentoring this season’s UCSB Senior Student Capstone Project Team is Adailton Nali Junior, who was a member of the UCSB team that won the AVC West Coast finals in June.

He also posted a link to yet another RTX video about the contest, which uses some scenes from the other videos I posted last week, but also contains different footage and information:


View attachment 93158








**Here’s the evidence that last season’s West Coast AVC winners did not use any neuromorphic technology for their two autonomous vehicles, just in case you were wondering…


View attachment 93159

View attachment 93160


I happened to find further info about the 2025/26 AVC earlier today, this time on the LinkedIn page of the Department of Electrical and Computer Engineering at The University of Texas at Dallas:


E9CBE497-F6A4-4135-820E-1C019F881DF7.jpeg


746B853C-0243-460D-8837-BFD967E2DB1A.jpeg


7FD75A63-8A02-4C5F-889D-9EECD4F2D810.jpeg



The UTD students’ corporate mentors from RTX are Marta Tatu, Data Scientist at Raytheon, and Trey Williams, Technical Fellow, Software Engineering at Raytheon - they can be seen standing on the right in the following picture taken at the ECE UTDesign EXPO on 12 December:


ACEEE18C-5682-4C99-B4CA-118705446FEE.jpeg





The Eye on AI podcast with Steve Brightfield suggests that all competing uni teams were equipped with Akida hardware for the 2025/26 round of the AVC.
According to Aneesh Thakkar in the post tagged above, they were given AKD1000 M.2 boards, which makes sense given that the UTD students’ poster mentions they also used a Raspberry Pi 5 with an M.2 HAT.


At about the 24 min mark the host is talking about wearables and in particular his "aura ring" running out of battery.
Steve says "I can't really disclose" and then goes on to talk about how we are targeting the wearable industry.

FYI: Craig S. Smith was referring to his Oura smart ring, a health tracking wearable by Finnish health technology company Oura Health Oy, founded in 2013.


The US Department of Defense / DOD (or as the Trump Administration now loves to call it, the Department of War / DOW) is actually Oura’s largest enterprise customer.

Earlier this year, an announcement about this partnership (which already commenced in 2019) and rumours about consumer data shared with Palantir (co-founded by tech billionaire Peter Thiel, who is one of Donald Trump’s longtime backers) sparked a backlash on social media - more and more users started voicing privacy concerns regarding their biometric data or expressed a general unhappiness with Oura’s connections to the military and/or the Trump Administration.

Read here, why “Oura’s Partnership with the Pentagon Is Ringing Alarm Bells for Customers” and how the company responded to the accusations:



On-device Edge AI to the rescue!
 
  • Like
  • Fire
  • Love
Reactions: 22 users

stockduck

Regular

Sorry if mentioned before, wasn't aware of this article posted here......

"I’m thinking about technologies like BrainChip’s Akida, which is a digital, event-based, spiking neural network (SNN) neuromorphic processor (see Bodacious Buzz on the Brain-Boggling Neuromorphic Brain Chip Battlefront), and POLYN’s Neuromorphic Analog Signal Processing (NASP) technology, which is an analog non-spiking neuromorphic processor (see Analog Neuromorphic Processors for ASICs/SoCs Offer Microwatt Edge AI)."

......

"With their current evaluation silicon, HaiLa can run a Wi-Fi-connected temperature-and-humidity sensor at ~50µW average power. Their production chip (taping out on GlobalFoundries’ 22FDX node) is expected to achieve 5–10µW average, which is low enough to run a sensor for 15–20 years on a single CR2032 battery cell, or even operate battery-free using harvested RF energy."

😲 ...what?

Is it BrainChip IP?
 
  • Like
  • Love
  • Wow
Reactions: 16 users

manny100

Top 20
It's not talked about very often how Renesas is a supplier of chips for defense purposes in the US and Japan.
Space & Harsh Environment | Renesas
"...... Class-V/Q products for the defense, high-reliability (Hi-Rel), and radiation-hardened (rad-hard) and space marketplaces."
From memory, Renesas taped out an AKIDA chip, or a chip of their own design integrating AKIDA.
Here is an interesting article concerning the new Japanese rail gun designed to knock out missiles at hypersonic speeds.
A quote from the article:
" The projectiles are made of a tungsten alloy and are reportedly “smart” enough to adjust their trajectory in real-time to hit maneuvering hypersonic missiles. It’s hard to verify this, as the onboard electronics must survive the intense acceleration the projectiles experience—about 30,000 G."
My bold above. With a millisecond of delay at Mach 6.5-plus, you can't hit the side of a barn.
Even the author disputes whether it's possible. Obviously he has not heard of BrainChip.
Not saying it's AKIDA, but there would not be many chips that could do the above. Maybe no other chips can?
This new Japanese weapon can neutralize the fastest machine ever created - Futura-Sciences
 
  • Like
  • Fire
Reactions: 7 users
I am thinking, after reading quite a lot of articles on LinkedIn, that we are very close to positive outcomes.
Not sure if it's just because it's year's end or just the way things are now working out, but most companies are very positive about what they have achieved and accomplished.
Many of the companies have ties to BrainChip.

Will 2026 be the beginning of an incredible journey to the moon and beyond or will we have to wait a little longer??

My thinking is that we are on the edge of a revolution, and once we start to move it's going to be hard not to be smiling 😊 knowing that we all just hopped off the roller coaster and onto a rocket.

All the very best to you all
Positive time ahead
 
  • Like
  • Love
  • Fire
Reactions: 16 users

Sorry if mentioned before, wasn't aware of this article posted here......

"I’m thinking about technologies like BrainChip’s Akida, which is a digital, event-based, spiking neural network (SNN) neuromorphic processor (see Bodacious Buzz on the Brain-Boggling Neuromorphic Brain Chip Battlefront), and POLYN’s Neuromorphic Analog Signal Processing (NASP) technology, which is an analog non-spiking neuromorphic processor (see Analog Neuromorphic Processors for ASICs/SoCs Offer Microwatt Edge AI)."

......

"With their current evaluation silicon, HaiLa can run a Wi-Fi-connected temperature-and-humidity sensor at ~50µW average power. Their production chip (taping out on GlobalFoundries’ 22FDX node) is expected to achieve 5–10µW average, which is low enough to run a sensor for 15–20 years on a single CR2032 battery cell, or even operate battery-free using harvested RF energy."

😲 ...what?

Is it BrainChip IP?
AI says BrainChip isn't involved with the above; hopefully that's incorrect.
 
  • Like
Reactions: 1 users
  • Like
  • Haha
  • Fire
Reactions: 8 users

Frangipani

Top 20
While I mostly enjoyed the Eye on AI podcast with Steve Brightfield, I found it quite shocking that our CMO (!) seems to know so little about other neuromorphic processors.

Not only was he unaware of how to correctly pronounce Loihi* (from 12:02 min), he also falsely claimed that IBM’s TrueNorth and Intel’s Loihi were different from Akida because they were analog neuromorphic processors, although in fact both are fully digital.
*Loihi is Hawaiian and pronounced as “Low-ee-hee”

From 13:32 min: “The primary difference between BrainChip and the Intel and the IBM solutions was they were analog, so they truly tried to match the analog wave forms of the brain, whereas the [sic] BrainChip made a digital equivalent of the analog wave form.” This is simply not true:
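For anyone wondering what a “digital equivalent of the analog wave form” actually means: digital SNNs like TrueNorth, Loihi, and Akida update discrete-time leaky integrate-and-fire (LIF) neurons step by step, rather than evolving continuous analog voltages. A minimal illustrative sketch - the leak and threshold values are arbitrary, and this is not any vendor’s actual neuron model:

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Discrete-time leaky integrate-and-fire neuron.

    Each step: decay the membrane potential, add the input current,
    and emit a spike (1) plus reset whenever the threshold is crossed.
    """
    v = 0.0
    spikes = []
    for x in inputs:
        v = leak * v + x          # leaky integration of input current
        if v >= threshold:
            spikes.append(1)      # spike event
            v = 0.0               # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A constant sub-threshold input still produces periodic spikes
# as charge accumulates faster than it leaks away.
out = lif_neuron([0.3] * 20)
print(out)  # -> spike on every 4th step (5 spikes in 20)
```

The whole state update is ordinary digital arithmetic; no analog circuitry is involved, which is exactly why the “analog” characterisation of TrueNorth and Loihi is wrong.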




8C1EB776-705C-429B-A3F7-20015B9F14E8.jpeg






87723B6D-72E8-4DB0-9FEC-215CA21FD902.jpeg
E30F8788-1440-40B5-B62C-EDBF33C600F7.jpeg





87DC7A3C-857B-4F42-9BCA-D4C0AFD1016C.jpeg
8ECCB154-E663-4C3E-A9E5-CC11F25C1F36.jpeg
 
  • Thinking
  • Wow
  • Like
Reactions: 4 users

7für7

Top 20
  • Like
  • Haha
Reactions: 2 users

manny100

Top 20
Event-based security video is certainly gaining some traction.
I recently watched the movie Wake Up Dead Man: A Knives Out Mystery, which was made and released in 2025.
There was a scene the detectives viewed, and straight after that scene another person was seen on the video. The policeman who retrieved the footage explained that the cameras only activate on movement.
Three or four years ago you would never have seen that.
I had to laugh.
 
  • Like
Reactions: 3 users

entretec

Member
Event-based security video is certainly gaining some traction.
I recently watched the movie Wake Up Dead Man: A Knives Out Mystery, which was made and released in 2025.
There was a scene the detectives viewed, and straight after that scene another person was seen on the video. The policeman who retrieved the footage explained that the cameras only activate on movement.
Three or four years ago you would never have seen that.
I had to laugh.
I was part of an Australia-wide entrepreneurial workshop in the mid-1980s. The winning entry was a simple security system with cameras that only activated when movement was detected.
 
  • Like
Reactions: 5 users

manny100

Top 20
I was part of an Australia-wide entrepreneurial workshop in the mid-1980s. The winning entry was a simple security system with cameras that only activated when movement was detected.
Did the winner do well commercially?
 
  • Like
Reactions: 1 users

Wags

Regular
Event-based security video is certainly gaining some traction.
I recently watched the movie Wake Up Dead Man: A Knives Out Mystery, which was made and released in 2025.
There was a scene the detectives viewed, and straight after that scene another person was seen on the video. The policeman who retrieved the footage explained that the cameras only activate on movement.
Three or four years ago you would never have seen that.
I had to laugh.
Sorry to rain on the parade, Manny, but motion-activated CCTV and video has been around a very long time. I worked on putting a number of systems into senior government establishments probably 35 years ago - essentially monitoring the pixels on a screen for changes. Ha, talking about raining on parades: in those days, headlights and rain were the enemy of those systems due to reflections and bright light changes. In a static environment, those systems worked very well in their day. Ten years ago I managed the install of a much more sophisticated but still similar system in a prison. Gosh, now at 67, I'm reminiscing.
My mind boggles as to what would be possible with Akida, once adopted, a very very large market.
cheers
 
  • Like
  • Love
  • Fire
Reactions: 9 users
Recently I posted a GitHub update on Vision using TENNs and also @Frangipani posted some vacancies including a Field Application Engineer role.



In the Field App Engineer role there was a section under experience that could be interesting as putting all these bits together indicates to me a possible strong movement to something building in the video space.

  • Experience with Video Management System (VMS), Video Analytics, APIs from Milestone, Genetec, OnSSI, or others.
  • Understanding of Video Codec Technology (VLC, Intel Quick Sync Video).

The experience section specifically calls out three larger video security players' APIs (or others), which may or may not be telling.

Websites worth a look imo.


OnSSI now part of Hexagon.


 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 6 users
Recently I posted a GitHub update on Vision using TENNs and also @Frangipani posted some vacancies including a Field Application Engineer role.



In the Field App Engineer role there was a section under experience that could be interesting as putting all these bits together indicates to me a possible strong movement to something building in the video space.

  • Experience with Video Management System (VMS), Video Analytics, APIs from Milestone, Genetec, OnSSI, or others.
  • Understanding of Video Codec Technology (VLC, Intel Quick Sync Video).

The experience section specifically calls out three larger video security players' APIs (or others), which may or may not be telling.

Websites worth a look imo.


OnSSI now part of Hexagon.


Will BrainChip one day have another explosion in the share price?
 

manny100

Top 20
Sorry to rain on the parade, Manny, but motion-activated CCTV and video has been around a very long time. I worked on putting a number of systems into senior government establishments probably 35 years ago - essentially monitoring the pixels on a screen for changes. Ha, talking about raining on parades: in those days, headlights and rain were the enemy of those systems due to reflections and bright light changes. In a static environment, those systems worked very well in their day. Ten years ago I managed the install of a much more sophisticated but still similar system in a prison. Gosh, now at 67, I'm reminiscing.
My mind boggles as to what would be possible with Akida, once adopted, a very very large market.
cheers
Ta, I was not aware that motion-based video has been around for years.
I asked ChatGPT for a summary of the differences between motion-activated video and neuromorphic processing.

Key Differences (Motion-Activated Video vs Akida Neuromorphic Processor)

  • Trigger: records when movement is detected vs computes only when inputs change (events)
  • Purpose: save storage/power vs real-time AI analysis with ultra-low power
  • Intelligence: no interpretation, just recording vs can classify, predict, and learn patterns
  • Energy use: moderate savings vs extreme efficiency (micro-joules)
  • Applications: security cameras, surveillance vs Edge AI, robotics, autonomous systems
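To make the "computes only when inputs change" distinction concrete, here is a toy NumPy sketch (illustrative only, not the Akida runtime) counting how many pixel operations a frame-based versus an event-based approach performs on a mostly static scene:

```python
import numpy as np

def frame_based_ops(frames):
    """Conventional approach: process every pixel of every frame."""
    return sum(f.size for f in frames)

def event_based_ops(frames, threshold=10):
    """Event-based approach: process only pixels that changed since the last frame."""
    ops = frames[0].size  # first frame has no reference, so process everything
    for prev, cur in zip(frames, frames[1:]):
        changed = np.abs(cur.astype(int) - prev.astype(int)) > threshold
        ops += int(np.count_nonzero(changed))
    return ops

# Mostly static scene: a small 4x4 blob moves across a fixed background.
rng = np.random.default_rng(0)
background = rng.integers(0, 256, (64, 64), dtype=np.uint8)
frames = []
for t in range(10):
    f = background.copy()
    f[20:24, t:t + 4] = 255  # the moving blob
    frames.append(f)

dense = frame_based_ops(frames)   # 10 frames x 4096 pixels = 40960 ops
sparse = event_based_ops(frames)  # one full frame + a handful of changed pixels per step
print(dense, sparse)
```

On this static scene the event-based count is dominated by the single reference frame; every subsequent step costs only the few pixels the blob touched, which is the efficiency argument in the table above.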
 
  • Like
  • Love
Reactions: 3 users
Will BrainChip one day have another explosion in the share price?
Why would the BOD want that? They are making enough now; a share price rise would only make them have to work and be on the ball.
Come on mate.
Stop dreaming. lol 😝
 
  • Like
Reactions: 1 users

manny100

Top 20
The link below lines up with Sean's military wearables talk.
 