BRN Discussion Ongoing

stockduck

Regular

Sorry if mentioned before, but I wasn't aware this article had been posted here...

"I’m thinking about technologies like BrainChip’s Akida, which is a digital, event-based, spiking neural network (SNN) neuromorphic processor (see Bodacious Buzz on the Brain-Boggling Neuromorphic Brain Chip Battlefront), and POLYN’s Neuromorphic Analog Signal Processing (NASP) technology, which is an analog non-spiking neuromorphic processor (see Analog Neuromorphic Processors for ASICs/SoCs Offer Microwatt Edge AI)."

......

"With their current evaluation silicon, HaiLa can run a Wi-Fi-connected temperature-and-humidity sensor at ~50µW average power. Their production chip (taping out on GlobalFoundries’ 22FDX node) is expected to achieve 5–10µW average, which is low enough to run a sensor for 15–20 years on a single CR2032 battery cell, or even operate battery-free using harvested RF energy."
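The quoted battery-life claim can be sanity-checked with back-of-envelope arithmetic. The ~225 mAh CR2032 capacity and 3 V nominal voltage below are my assumptions, not figures from the article:

```python
# Rough lifetime check for the quoted 5-10 µW average power figures.
# Assumed (not from the article): a CR2032 holds ~225 mAh at a nominal 3 V.
capacity_wh = 0.225 * 3.0            # ~0.675 Wh stored in the cell
energy_j = capacity_wh * 3600        # ~2430 J

seconds_per_year = 365.25 * 24 * 3600

for power_w in (5e-6, 10e-6):
    years = energy_j / power_w / seconds_per_year
    print(f"{power_w * 1e6:.0f} µW -> ~{years:.1f} years")
```

At 5 µW this lands right on the quoted 15-year lower bound; the 10 µW case gives roughly half that, so the 15-20 year figure presumably assumes the low end of the power range (or a larger cell).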

😲 ...what?

Could it be BrainChip IP?
 
  • Like
  • Love
  • Wow
Reactions: 24 users

manny100

Top 20
It's not talked about very often, but Renesas is a supplier of chips for defense purposes in the US and Japan.
Space & Harsh Environment | Renesas
"...... Class-V/Q products for the defense, high-reliability (Hi-Rel), and radiation-hardened (rad-hard) and space marketplaces."
From memory, Renesas taped out an AKIDA chip, or a chip of their own design integrating AKIDA.
Here is an interesting article concerning the new Japanese rail gun designed to knock out missiles at hypersonic speeds.
A quote from the article:
" The projectiles are made of a tungsten alloy and are reportedly “smart” enough to adjust their trajectory in real-time to hit maneuvering hypersonic missiles. It’s hard to verify this, as the onboard electronics must survive the intense acceleration the projectiles experience—about 30,000 G."
My bold above. A millisecond delay at Mach 6.5-plus = can't hit the side of a barn.
Even the author doubts whether it's possible. He obviously has not heard of BrainChip.
Not saying it's AKIDA, but there would not be many chips that could do the above. Maybe no other chip can?
This new Japanese weapon can neutralize the fastest machine ever created - Futura-Sciences
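The latency point above can be put into rough numbers. Assuming sea-level Mach 1 ≈ 343 m/s (at altitude the speed of sound is lower, so treat these as ballpark figures):

```python
# How far a Mach 6.5 target travels during a given processing delay.
MACH_1_M_S = 343                    # assumed sea-level speed of sound
speed_m_s = 6.5 * MACH_1_M_S        # ~2230 m/s

for delay_ms in (1, 10, 100):
    drift_m = speed_m_s * delay_ms / 1000
    print(f"{delay_ms:>3} ms delay -> target moves ~{drift_m:.0f} m")
```

Even a single millisecond of processing delay lets the target move a couple of metres, which is why ultra-low-latency onboard inference matters for an interceptor.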
 
  • Like
  • Fire
Reactions: 11 users
After reading quite a lot of articles on LinkedIn, I am thinking that we are very close to positive outcomes.
Not sure if it's just because it's year's end or just the way things are now working out, but most companies are very positive about what they have achieved and accomplished.
Many of these companies have ties to BrainChip.

Will 2026 be the beginning of an incredible journey to the moon and beyond or will we have to wait a little longer??

My thinking is that we are on the edge of a revolution, and once we start to move it's going to be hard not to be smiling 😊, knowing that we all just hopped off the roller coaster and onto a rocket.

All the very best to you all
Positive time ahead
 
  • Like
  • Love
  • Fire
Reactions: 25 users

Sorry if mentioned before, but I wasn't aware this article had been posted here...

"I’m thinking about technologies like BrainChip’s Akida, which is a digital, event-based, spiking neural network (SNN) neuromorphic processor (see Bodacious Buzz on the Brain-Boggling Neuromorphic Brain Chip Battlefront), and POLYN’s Neuromorphic Analog Signal Processing (NASP) technology, which is an analog non-spiking neuromorphic processor (see Analog Neuromorphic Processors for ASICs/SoCs Offer Microwatt Edge AI)."

......

"With their current evaluation silicon, HaiLa can run a Wi-Fi-connected temperature-and-humidity sensor at ~50µW average power. Their production chip (taping out on GlobalFoundries’ 22FDX node) is expected to achieve 5–10µW average, which is low enough to run a sensor for 15–20 years on a single CR2032 battery cell, or even operate battery-free using harvested RF energy."

😲 ...what?

Could it be BrainChip IP?
AI says BrainChip isn't involved with the above; hopefully that's incorrect.
 
  • Like
Reactions: 1 users
  • Like
  • Haha
  • Fire
Reactions: 9 users

Frangipani

Top 20
While I mostly enjoyed the Eye on AI podcast with Steve Brightfield, I found it quite shocking that our CMO (!) seems to know so little about other neuromorphic processors.

Not only was he unaware of how to correctly pronounce Loihi* (from 12:02 min), he also falsely claimed that IBM’s TrueNorth and Intel’s Loihi were different from Akida because they were analog neuromorphic processors, although in fact both are fully digital.
*Loihi is Hawaiian and pronounced as “Low-ee-hee”

From 13:32 min: “The primary difference between BrainChip and the Intel and the IBM solutions was they were analog, so they truly tried to match the analog wave forms of the brain, whereas the [sic] BrainChip made a digital equivalent of the analog wave form.” This is simply not true:




[attached screenshots]
 
  • Like
  • Thinking
  • Wow
Reactions: 6 users

7für7

Top 20
  • Like
  • Haha
Reactions: 2 users

manny100

Top 20
Event-based security video is certainly gaining some traction.
Recently I watched the movie Wake Up Dead Man: A Knives Out Mystery, which was made and released in 2025.
There is a scene the detectives viewed, and straight after it another person was seen on the video. The policeman who retrieved the footage explained that the cameras only activate on movement.
Three or four years ago you would never have seen that.
I had to laugh.
 
  • Like
Reactions: 5 users

entretec

Member
Event-based security video is certainly gaining some traction.
Recently I watched the movie Wake Up Dead Man: A Knives Out Mystery, which was made and released in 2025.
There is a scene the detectives viewed, and straight after it another person was seen on the video. The policeman who retrieved the footage explained that the cameras only activate on movement.
Three or four years ago you would never have seen that.
I had to laugh.
I was part of an Australia-wide entrepreneurial workshop in the mid-1980s. The winning entry was a simple security system with cameras that activated only when movement was detected.
 
  • Like
Reactions: 6 users

manny100

Top 20
I was part of an Australia-wide entrepreneurial workshop in the mid-1980s. The winning entry was a simple security system with cameras that activated only when movement was detected.
Did the winner do well commercially?
 
  • Like
Reactions: 1 users

Wags

Regular
Event-based security video is certainly gaining some traction.
Recently I watched the movie Wake Up Dead Man: A Knives Out Mystery, which was made and released in 2025.
There is a scene the detectives viewed, and straight after it another person was seen on the video. The policeman who retrieved the footage explained that the cameras only activate on movement.
Three or four years ago you would never have seen that.
I had to laugh.
Sorry to rain on the parade, Manny, but motion-activated CCTV and video has been around a very long time. I worked on putting a number of systems into senior government establishments probably 35 years ago, essentially monitoring the pixels on a screen for changes. Ha, talking about raining on parades: in those days, headlights and rain were the enemy of those systems due to reflections and bright light changes. In a static environment, those systems worked very well in their day. Ten years ago I managed the install of a much more sophisticated but still similar system in a prison. Gosh, now at 67, I'm reminiscing.
My mind boggles as to what would be possible with Akida, once adopted, a very very large market.
cheers
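The pixel-change monitoring described above is essentially frame differencing: compare successive frames and trigger when enough pixels change. A minimal sketch in plain Python (the thresholds are illustrative, not from any real product):

```python
def motion_detected(prev_frame, curr_frame, pixel_delta=25, trigger_fraction=0.01):
    """Return True if enough pixels changed between two grayscale frames.

    Frames are flat lists of 0-255 intensity values of equal length.
    pixel_delta: minimum intensity change to count a pixel as "changed".
    trigger_fraction: fraction of changed pixels needed to trigger recording.
    """
    changed = sum(
        1 for p, c in zip(prev_frame, curr_frame) if abs(c - p) >= pixel_delta
    )
    return changed / len(curr_frame) >= trigger_fraction

# A static scene does not trigger; a bright sweep of light does.
static = [10] * 100
headlights = [10] * 90 + [250] * 10
print(motion_detected(static, static))      # False
print(motion_detected(static, headlights))  # True
```

A headlight sweep or rain reflection changes many pixels at once, which is exactly the false-trigger problem those early systems suffered from.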
 
  • Like
  • Love
  • Fire
Reactions: 13 users
Recently I posted a GitHub update on Vision using TENNs and also @Frangipani posted some vacancies including a Field Application Engineer role.



In the Field Application Engineer role there was a section under experience that could be interesting; putting all these bits together suggests to me a possible strong move toward something building in the video space.

  • Experience with Video Management System (VMS), Video Analytics, APIs from Milestone, Genetec, OnSSI, or others.
  • Understanding of Video Codec Technology (VLC, Intel Quick Sync Video).

The experience section specifically calls out the APIs of three larger video-security players (or others), which may or may not be telling.

Websites worth a look imo.


OnSSI now part of Hexagon.


 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 10 users
Recently I posted a GitHub update on Vision using TENNs and also @Frangipani posted some vacancies including a Field Application Engineer role.



In the Field Application Engineer role there was a section under experience that could be interesting; putting all these bits together suggests to me a possible strong move toward something building in the video space.

  • Experience with Video Management System (VMS), Video Analytics, APIs from Milestone, Genetec, OnSSI, or others.
  • Understanding of Video Codec Technology (VLC, Intel Quick Sync Video).

The experience section specifically calls out the APIs of three larger video-security players (or others), which may or may not be telling.

Websites worth a look imo.


OnSSI now part of Hexagon.


Will BrainChip one day have another explosion in the share price?
 
  • Haha
Reactions: 1 users

manny100

Top 20
Sorry to rain on the parade, Manny, but motion-activated CCTV and video has been around a very long time. I worked on putting a number of systems into senior government establishments probably 35 years ago, essentially monitoring the pixels on a screen for changes. Ha, talking about raining on parades: in those days, headlights and rain were the enemy of those systems due to reflections and bright light changes. In a static environment, those systems worked very well in their day. Ten years ago I managed the install of a much more sophisticated but still similar system in a prison. Gosh, now at 67, I'm reminiscing.
My mind boggles as to what would be possible with Akida, once adopted, a very very large market.
cheers
Ta, I was not aware that motion-based video has been around for years.
I asked ChatGPT for a summary of the differences between motion-activated video and neuromorphic processing.

Key Differences

| Feature | Motion-Activated Video | Akida Neuromorphic Processor |
| --- | --- | --- |
| Trigger | Records when movement is detected | Computes only when inputs change (events) |
| Purpose | Save storage/power | Real-time AI analysis with ultra-low power |
| Intelligence | No interpretation, just recording | Can classify, predict, and learn patterns |
| Energy use | Moderate savings | Extreme efficiency (micro-joules) |
| Applications | Security cameras, surveillance | Edge AI, robotics, autonomous systems |
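The "Trigger" row is the crux: a frame-based pipeline spends compute on every pixel of every frame, while an event-based one only touches pixels that changed. A toy illustration of the operation counts (a hypothetical simplified model, nothing to do with Akida's actual internals):

```python
def frame_based_ops(frames):
    """Frame-based: every pixel of every frame costs one operation."""
    return sum(len(frame) for frame in frames)

def event_based_ops(frames):
    """Event-based: only pixels that changed since the previous frame cost work."""
    ops = 0
    for prev, curr in zip(frames, frames[1:]):
        ops += sum(1 for p, c in zip(prev, curr) if c != p)
    return ops

# Mostly static scene: one bright pixel drifting across a 100-pixel frame.
frames = [[1 if j == i else 0 for j in range(100)] for i in range(10)]
print(frame_based_ops(frames))  # 1000 operations
print(event_based_ops(frames))  # 18 operations
```

The sparser the scene changes, the bigger the gap, which is where the micro-joule efficiency claims come from.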
 
  • Like
  • Love
Reactions: 4 users
Will BrainChip one day have another explosion in the share price?
Why would the BOD want that? They are making enough now; a share price rise would only make them have to work and be on the ball.
Come on mate.
Stop dreaming. lol 😝
 
  • Like
  • Haha
Reactions: 2 users

manny100

Top 20
  • Like
  • Love
Reactions: 3 users

7für7

Top 20
Wishing you all a merry Christmas, happy holidays and a happy new year! All the best! See you next year!
 
  • Like
  • Love
  • Fire
Reactions: 17 users

Frangipani

Top 20

[attached images]

19 November 2025 - Session 2:

“Towards an Energy-Efficient and Sustainable IIoT using Embedded Neuromorphic AI
Behrooz Azadi, Bernhard Anzengruber-Tanase, Georgios Sopidis, Michael Haslgrübler and Alois Ferscha”




[attached images]
I recall somebody posting the below Pro²Future poster the other day, some of whose co-authors are also co-authors of the above paper that is going to be presented at the conference in Vienna next week:



[attached poster image]

The above-mentioned IoT25 conference paper “Towards an Energy-Efficient and Sustainable IIoT using Embedded Neuromorphic AI” by Behrooz Azadi, Bernhard Anzengruber-Tanase, Georgios Sopidis, Michael Haslgrübler (all Pro2Future GmbH, Linz) and Alois Ferscha (Johannes Kepler University Linz) can now be accessed online:

“Neuromorphic hardware offers a promising alternative, with the potential for much lower energy consumption and inference latency compared to GPU-based computing. Throughout this study, we investigate this claim and compare the energy consumption and inference latency of BrainChip Akida and NVIDIA Orin NX. Our results indicate that the quantized model runs faster on the Akida than on the Orin for inference, with an average of 22.54 seconds compared to 181.66 seconds for 10k test samples from the MNIST dataset, respectively. Furthermore, the measured active power and, therefore, energy consumption associated with Akida during idle time are considerably lower than that of NVIDIA Orin. Additionally, our uncontrolled long-term monitoring confirms that the neuromorphic hardware consumes significantly lower energy over 25 days, with an average active power of 4.36 W compared to 9.17 W and energy consumption of 2.6 kWh compared to 5.5 kWh.”



[attached screenshots of the paper, excerpted]
 
  • Like
  • Fire
  • Love
Reactions: 34 users

Frangipani

Top 20
A week ago, I shared my findings about a new ESA-Phi-Lab Sweden project called VAIAS (Validation of an AI Accelerator for Space), which is a collaboration between Frontgrade Gaisler and Rapidity Space that aims to validate the GR801 neuromorphic AI accelerator (-> Akida) for onboard autonomy in radiation-prone environments.

Today, RISE (Research Institutes of Sweden) 🇸🇪 posted on LinkedIn about the official kick-off for VAIAS and two other ESA Phi-Lab Sweden Edge AI space projects:


[attached images of the LinkedIn post]
 
  • Like
  • Fire
Reactions: 14 users

Frangipani

Top 20

[attached image]

Michael Pendleton, Founder and CEO of the AI Cowboys, posted this earlier today:

Enter NeuroEdge: brain-inspired AI that brings enterprise-grade cybersecurity to even the smallest IoT devices, built by The AI Cowboys, at The University of Texas at San Antonio with NVIDIA & BrainChip hardware…”


[attached screenshots of the LinkedIn post]
 
  • Like
  • Love
  • Fire
Reactions: 25 users