BRN Discussion Ongoing

Down 4 cents from its high
What about an enquiry into that?
 
Reactions: 14 users

DK6161

Regular
Oh well. It was fun and exciting to see nonetheless
 
Reactions: 1 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Can someone please pass this on to Tony Lewis ASAP?! *

*I'm referring to the Social Media Marketing Intern. Does anyone know if they've started yet?




The US Navy is looking for solutions involving automatic target recognition and multi-object tracking from maritime helicopters, with special interest in drone swarms / multiple vessels, high-clutter sea backgrounds and bad weather. Responses are due by 31 October.

Sounds to me like it maps perfectly to Akida + Prophesee's value propositions (i.e. ultra-low-power, event-driven vision, fast reaction, and on-device/continuous learning for contested, bandwidth-limited environments).

Event-based cameras only emit changes (not full frames), so Akida processes sparse, motion-centric data (think TENNs) with very low latency and power, which would be ideal where size/weight/power are constrained and the background is constantly shimmering water.
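For anyone wondering what that sparsity actually buys you, here's a toy sketch (my own illustration, not Prophesee or BrainChip code — the frame sizes and threshold are invented) of why an event camera's output is so much smaller than full frames:

```python
# Toy illustration: an event camera emits only per-pixel brightness *changes*,
# so a static background produces no data at all.

THRESHOLD = 10  # minimum brightness change (arbitrary units) to emit an event

def frame_to_events(prev, curr, threshold=THRESHOLD):
    """Compare two frames pixel-by-pixel and emit (x, y, polarity) events."""
    events = []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) > threshold:
                events.append((x, y, 1 if c > p else -1))
    return events

# 8x8 frames: mostly static scene, one small moving object.
prev = [[50] * 8 for _ in range(8)]
curr = [row[:] for row in prev]
curr[3][4] = 200   # object arrives at (4, 3)
curr[3][3] = 20    # and leaves a darker trail at (3, 3)

events = frame_to_events(prev, curr)
print(f"full frame: {8 * 8} pixel values, events emitted: {len(events)}")
# Only the 2 changed pixels generate data; the other 62 are never transmitted.
```

Scale that idea up to megapixel sensors over open water and you see why a downstream processor like Akida only has to touch the moving targets, not the whole scene.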

Akida + Prophesee has already been flown on low-power drones for maritime search-and-rescue style detection, demonstrating detection and tracking from moving platforms over water.






Navy eyes AI to track adversarial drone swarms, vessels from maritime helicopters

Responses to a new request for information about the technology are due to the sea service by the end of October.

October 6, 2025


The Navy is looking for automatic target recognition and tracking capabilities — and particularly, those that can stalk drone swarms or multiple vessels simultaneously — to deploy in operations involving its maritime helicopters.
Companies interested in supplying such products are invited to respond to a new request for information from Naval Surface Warfare Center Crane by Oct. 31.
“The ideal background for these solutions would be from air to ocean as surface or air to air with sea state as background utilizing different sensor types. Of special interest would be Multiple Object Tracking such as multiple vessels and/or Swarms of Unmanned Systems,” officials wrote in the RFI.

“Ideally, the solutions would be capable of detection, identification, and maintaining unique track IDs within high clutter backgrounds and obscured conditions such as fog, rain and wind, rotor wash, variety of sea states, etc.,” they noted.
Respondents are asked to include an overview of their algorithms and approach — as well as the sorts of targets and environments they’ve been designed for or tested with. The Navy also wants details about the types of sensor data currently in use and whether companies can provide suitable labeled training datasets to apply in operations.
“Please include information on if/how targets and tracking are presented to a supervising human,” officials wrote.
The RFI did not include information about the geographic combatant commands or contemporary operations that the Navy envisions adopting these capabilities for.
Its publication comes at a time when the sea service and broader U.S. military face disruptive threats from drone swarms like those already emerging in places like Ukraine and the Middle East region, and the Trump administration wages a new “war” against alleged drug-smuggling vessels in and around the Caribbean.
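The RFI's "maintaining unique track IDs" requirement is, at heart, a data-association problem. Here's a toy greedy nearest-neighbour sketch (my own illustration with invented positions and gate distance, nothing to do with any actual Navy or BrainChip system) of how fresh detections get matched to persistent IDs:

```python
import math
from itertools import count

# Toy multi-object tracker: associate each new detection with the nearest
# existing track (within a gate), otherwise start a new track with a fresh ID.

GATE = 5.0           # max distance for a detection to join an existing track
_next_id = count(1)  # fresh, never-reused track IDs

def update_tracks(tracks, detections):
    """tracks: {id: (x, y)} last known positions; detections: [(x, y), ...]"""
    updated = {}
    for det in detections:
        # Find the closest track not yet claimed in this update cycle.
        best = min(
            (tid for tid in tracks if tid not in updated),
            key=lambda tid: math.dist(tracks[tid], det),
            default=None,
        )
        if best is not None and math.dist(tracks[best], det) <= GATE:
            updated[best] = det            # same object, same ID
        else:
            updated[next(_next_id)] = det  # new object, new unique ID
    return updated

tracks = update_tracks({}, [(0.0, 0.0), (50.0, 50.0)])  # two contacts appear
tracks = update_tracks(tracks, [(1.0, 0.5), (49.0, 51.0), (100.0, 0.0)])
print(tracks)  # IDs 1 and 2 persist; the new contact gets ID 3
```

Real systems replace the greedy matching with Hungarian assignment and motion prediction, but the "unique track IDs within high clutter" wording in the RFI boils down to doing this association reliably when the background is full of false detections.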

 
Reactions: 18 users

Sotherbys01

Regular
Apologies if posted already

View attachment 91772
Look at that tablecloth... So proud of where we have evolved from!!!
 
Reactions: 11 users

HopalongPetrovski

I'm Spartacus!
Interesting to see where we close...


Edit. (This was at 4.07pm. Indicated close was 22.5 cents.)

View attachment 91774
Yeah, that convenient 3 million shares @22 cents tried putting the genie back in the bottle.
Which of you guys or gals had a lazy $660,000 just lying around under your sofa cushions?
I saw a theory that the whole thing was a p&d (pump and dump) to let shorters out and maybe reset.
The way that supply came on was very much like turning on a tap, wasn't it. 🤣
Thank goodness the regulators are on the ball, asking the difficult questions and diligently applying their trading halts. 🤣
 
Reactions: 13 users
Reactions: 3 users

Frangipani

Top 20
EDHPC25, the European Data Handling & Processing Conference for space applications organised by ESA, is less than three weeks away.

Turns out BrainChip will not only be exhibiting, but that Gilles Bézard and Alf Kuchenbuch will also be presenting a two-part tutorial on 13 October:



View attachment 91405


Adam Taylor from Adiuvo Engineering, who recently posted about ORA, “a comprehensive sandbox platform for edge AI deployment and benchmarking” (whose edge devices available for evaluation also include Akida), will be presenting concurrently.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-474283


Three days later, on 16 October, ESA’s Laurent Hili will be chairing a session on AI Engines & Neuromorphic, which will include a presentation on Frontgrade Gaisler’s GRAIN product line, whose first device, the GR801 SoC - as we know - will combine Frontgrade Gaisler’s NOEL-V RISC-V processor and Akida.

View attachment 91406



View attachment 91408



The session’s last speaker will be AI vision expert Roland Brochard from Airbus Defense & Space Toulouse, who has been involved in the NEURAVIS proposal with us for the past 18 months or so:


https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-429615

View attachment 91409


In April, Kenneth Östberg’s poster on GRAIN had revealed what NEURAVIS actually stands for: Neuromorphic Evaluation of Ultra-low-power Rad-hard Acceleration for Vision Inferences in Space.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-459275

View attachment 91410


It appears highly likely that Roland Brochard will be giving a presentation on said NEURAVIS proposal in his session, given that a paper co-authored by 15 researchers from all consortium partners involved (Airbus Toulouse, Airbus Ottobrunn, Frontgrade Gaisler, BrainChip, Neurobus, plus ESA) was uploaded to the conference website.

This paper has the exact same title as Roland Brochard’s EDHPC25 presentation, namely “Evaluation of Neuromorphic computing technologies for very low power AI/ML applications” (which is also almost identical to the title of the original ESA ITT, except that the words “…in space” are missing, cf. my July 2024 post above):


View attachment 91411


Full paper in next post due to the upload limitation of 10 files per post…

Here is more info on the two-part tutorial Gilles Bézard and Alf Kuchenbuch will be presenting on 13 October at EDHPC25 in Elche, Spain:




Neuromorphic AI in Space Tutorials (by BrainChip)

Introduction

In space applications, every milliwatt matters. Satellites rely on ultra-low-power chips to process data on-board, where sending everything back to Earth is often impossible or inefficient. This makes efficient machine learning deployment on embedded hardware not just useful, but essential.

The goal of the tutorials is to show how advanced AI capabilities can be packed into tiny, power-constrained devices, enabling smarter, faster, and more autonomous satellite systems.

Audience

Project managers (particularly part 1) and engineers working on embedded low-power AI applications.
Level: from beginner to expert.


What will you learn?

In part 1: Akida in Space – Bringing Autonomy, Robustness and Efficient Data Transmission to Space Vehicles, you will learn:

  • Why AI in Space?
  • Why neuromorphic AI in Space?
  • BrainChip Akida IP is the only Event-based AI on the commercial market – IP and silicon
  • Use cases for Neuromorphic Processing in space: Lunar landing, Docking in space, Earth observation, Space Situational Awareness, Satellite detection (use case OHB GIASaaS)

In part 2: Bringing AI to the Edge – End-to-End Machine Learning Deployment on an Embedded Low-Power AI Neuromorphic HW, you will learn:

  • Fetch and prepare a dataset suitable for your target application.
  • Design and train a machine learning model using TF/Keras.
  • Apply quantization techniques to reduce model size and optimize it for embedded hardware.
  • Convert the trained model into a format compatible with the Akida hardware toolchain.
  • Export the model as a C source file that can be included and compiled together with your C main application.
  • Integrate the Akida model binary into a baremetal C application running on an STM32 microcontroller.
  • Run inference on the Akida ultra-low-power hardware device, demonstrating efficient on-board processing for space-constrained environments such as satellites.

Outline of hands-on tutorial
In the first tutorial, we explain why there is a paradigm shift in Space from purely deterministic classical programming to the use of low-power AI in specific use cases. We show the vast improvement in capabilities that comes with the use of AI in Space.

In the second tutorial, we demonstrate how to take an ML model from dataset preparation all the way to baremetal deployment on a microcontroller with a hardware accelerator. By the end, you will know how to take an ML project from concept to fully optimized embedded deployment, step by step.

Speakers:
  • Gilles Bézard (BrainChip) – Part 1 & 2
  • Alf Kuchenbuch (BrainChip) – Part 1
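A side note on the quantization step listed in part 2, since that's where most of the size and power savings come from: here's a from-scratch sketch of generic symmetric int8 post-training quantization (the actual Akida/MetaTF toolchain API differs; this only shows the underlying arithmetic, with made-up weight values):

```python
# Symmetric int8 post-training quantization: the generic idea behind the
# "reduce model size" step above. Not the Akida/MetaTF toolchain itself.

def quantize_int8(weights):
    """Map float weights to int8 codes plus one float scale (~4x smaller than float32)."""
    scale = max(abs(w) for w in weights) / 127.0  # largest magnitude maps to 127
    q = [round(w / scale) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [qi * scale for qi in q]

weights = [0.82, -1.27, 0.003, 0.5]
q, scale = quantize_int8(weights)
print(q, scale)  # int8 codes and the per-tensor scale factor

recovered = dequantize(q, scale)
max_err = max(abs(a - b) for a, b in zip(weights, recovered))
print(f"worst-case rounding error: {max_err:.4f} (bounded by scale/2 = {scale / 2:.4f})")
```

The trade-off is exactly what the tutorial blurb implies: a small, bounded rounding error per weight in exchange for a model that fits in the memory and power budget of a baremetal microcontroller target.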




As you can see, Gilles Bézard and Alf Kuchenbuch will also be referring to the GIASAAS (Global Instant Satellite as a Service) concept by OHB Hellas, which I first posted about a year ago yesterday…

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-438109

… and gave an update on in May:

The revised GIASaaS (Global Instant Satellite as a Service, formerly Greek Infrastructure for Satellite as a Service) concept website by OHB Hellas is finally back online - and Akida now no longer shows up as UAV1 (see my post above) but as one of three SATs! 🛰

AKIDA, the envisioned SAT with the AKD1000 PCIe Card on board, is slated to handle both Object Detection and Satellite Detection, while the planned KRIA and CORAL satellites (equipped with a Xilinx KRIA KV260 Vision AI Starter Kit and a Google Coral TPU Dev Board, respectively) are tasked with Vessel Detection, Cloud Detection and Fire Detection (for some reason OHB Hellas removed Flood Detection from the list of applications).

Please note that this is currently a concept website only.

The idea behind GIASaaS as well as mock 3D-printed versions of the yet-to-be-launched satellites were presented at DEFEA 2025 - Defense Exhibition Athens earlier this month, as @Stable Genius had spotted on LinkedIn:







View attachment 84868 View attachment 84869 View attachment 84870

View attachment 84867

To the best of my knowledge, this is the first time BrainChip will be making a reference to GIASAAS.
 
Reactions: 24 users

FJ-215

Regular
Yeah, that convenient 3 million shares @22 cents tried putting the genie back in the bottle.
…
Yep, 3 million shares and the owners all had a change of heart right on the bell and decided not to sell.
 
Reactions: 5 users

Esq.111

Fascinatingly Intuitive.
Reactions: 16 users

Esq.111

Fascinatingly Intuitive.
Chippers,

The German Market is off to a flying start .


Regards,
Esq.
 
Reactions: 11 users

7für7

Top 20
Reactions: 3 users

Esq.111

Fascinatingly Intuitive.
Evening 7fur7 ,

Bugger if I know, though competition normally makes the old incumbent smarten up their game.

* Of note, presently around one third to half the volume traded in our stock, and others, is already traded through CBOE.

Cboe Global Markets https://share.google/RYbHP4ob0dsslRc1J


* If nothing else ... shareholders will be able to simultaneously trade pork bellies, Swiss francs and shares in their respective companies, all on one platform.

😗

Regards,
Esq.
 
Reactions: 7 users

Frangipani

Top 20









We have a connection to UC Irvine via their Professor of Cognitive Sciences Jeff Krichmar, a former member of the BrainChip Scientific Advisory Board, and through recent work done at his Cognitive Anteater Robotics Laboratory (CARL), as well as through our Manager of Applied Research Kristofor Carlson. Carlson was one of Jeff Krichmar’s postdocs a decade ago and gave a talk about BrainChip at his former workplace during the 17th International Conference on the Simulation of Adaptive Behavior (“From animals to animats”) in September 2024, titled “Hardware-Accelerated Perceptions at the Edge with TENNs: from Denoising to LLMs”.

About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with AKD1000 mounted on an E-Puck2 robot.

The April 2024 paper I linked to at the time (“An Integrated Toolbox for Creating Neuromorphic Edge Applications”), co-authored by Lars Niedermeier (Niedermeier Consulting, Zurich) and Jeff Krichmar (UC Irvine), did not yet contain a reference to Akida, but has recently been updated to a newer version (Accepted Manuscript online 22 January 2025). It now has heaps of references to AKD1000 and describes how it was used for visual object detection and classification.

Nikil Dutt, one of Jeff Krichmar’s colleagues at UC Irvine and also member of the CARL team, contributed to this Accepted Manuscript version as an additional co-author.



What caught my eye was that the researchers, who had used an AKD1000 PCIe Board (with an engineering sample chip) as part of their hardware stack, had already gotten their hands on an Akida M.2 form factor as well, even though BrainChip’s latest offering wasn’t officially revealed until January 8th at CES 2025:

“For productive deployments, the Raspberry Pi 5 Compute Module and Akida M.2 form factor were used.” (page 9)


Maybe thanks to Kristofor Carlson?




Here are some pages from the Accepted Manuscript version:


View attachment 76552


View attachment 76553



View attachment 76554


View attachment 76558


View attachment 76556
View attachment 76557


We already knew from the April 2024 version of that paper that…



And finally, here’s a close-up of the photo on page 9:

View attachment 76555


Just an afterthought…

Academic research utilising Akida shouldn’t generally be underestimated or dismissed as mere playtime in an ivory tower.

Some of these researchers have excellent connections to big players in the industry and/or to government agencies and sometimes even prior work experience in relevant sectors themselves - hence their recommendations would likely be given quite a bit of weight.

Take Jeff Krichmar👆🏻for example, whose 27 page (!) CV can be found on his LinkedIn profile.


Jeff Krichmar is well-connected to the defense industry sector:

Krichmar’s first job after graduating with a Bachelor’s in Computer Science (and before going to grad school to pursue his Master’s) was as a software engineer at Raytheon Corporation (now RTX), working on the PATRIOT surface-to-air missile system - a position which also saw him become a consultant to the Japanese Self-Defense Forces from 1988-1989, while deployed to Mitsubishi Heavy Industries in Nagoya (which to this day manufactures PATRIOT missiles for domestic use under license from RTX and Lockheed Martin).



View attachment 76748


Over the years, he has received quite a bit of funding from the defence-related sector, mostly from the US government, but also from Northrop Grumman.

View attachment 76751

In 2015 he gave an invited talk at Northrop Grumman…

View attachment 76752

… and he was co-author of a paper published in November 2016, whose first author, his then graduate student Tiffany Hwu, was a Basic Research Systems Engineer Intern with Northrop Grumman at the time. (“This work was supported by the National Science Foundation Award number 1302125 and Northrop Grumman Aerospace Systems.”)

The neuromorphic hardware used for the self-driving robot was unsurprisingly IBM’s TrueNorth, as this was then the only neuromorphic chip around - Loihi wasn’t announced until September 2017.

View attachment 76756

One of the paper’s other co-authors was a former postdoctoral student of Krichmar’s, Nicolas Oros, who had started working for BrainChip in December 2014 - on his LinkedIn profile it says he was in fact our company’s first employee! He is also listed as co-inventor of the Low power neuromorphic voice activation system and method patent alongside Peter van der Made and Mouna Elkhatib.

Nicolas Oros left BrainChip in February 2021 and is presently a Senior Product Manager at Aicadium, “leading the development of a computer vision SaaS product for visual inspection”. I don’t think we’ve ever looked into them? 🤔

View attachment 76754


View attachment 76755


By the time of said paper’s publication, Jeff Krichmar had become a member of BrainChip’s Scientific Advisory Board - see this link of an April 2016 BRN presentation, courtesy of @uiux:


View attachment 76753

As mentioned before, Kristofor Carlson is another of Jeff Krichmar’s former postdoctoral students (from 2011-2015), who co-authored a number of research papers with Jeff Krichmar and Nikil Dutt (both UC Irvine) over the years - the last one published in 2019.

In September, Kris Carlson gave a presentation on TENNs at UC Irvine, as an invited speaker at SAB 2024: From Animals to Animats - 17th International Conference on the Simulation of Adaptive Behavior.

View attachment 76815

Kris Carlson’s September 2024 conference talk on TENNs, the CARL lab’s recent video and paper featuring an E-Puck2 robot with an Akida PCIe Board mounted on top, and the additional info in the 22 January 2025 paper that CARL researchers had already experimented with the brand-new AKD1000 M.2 form factor are ample evidence of continued interest in what BrainChip is doing from Jeff Krichmar’s side.

Academic researchers like him could very well be door openers to people in charge of other entities’ research that will result in meaningful revenue one day…
 

Reactions: 21 users

7für7

Top 20
Evening 7fur7 ,
…

But do you think the games regarding manipulations and f…in speeding tickets would be less? I mean manipulators and Brainchip haters are everywhere… let’s see…

Have a nice evening
 
Reactions: 1 users
Can someone please pass this on to Tony Lewis ASAP?! *
…

View attachment 91776

I'll send it over now to IR.... DUN
 
Reactions: 4 users

Frangipani

Top 20


View attachment 91473

This video posted by the Global Semiconductor Alliance shows Sean Hehir and James Shields in conversation with two gentlemen at the recent GSA Executive Forum:






The gentleman facing the camera is Prith Banerjee, who became Senior Vice President of the Simulation and Analysis Incubation Group at Synopsys in July, after Synopsys had merged with ANSYS, a leader in engineering simulation, where Prith Banerjee had been CTO.

He moderated the event’s panel discussion on Quantum Computing:







Prith Banerjee used to work for Hewlett-Packard around the same time Sean Hehir did (both left HP in April 2012), so they possibly know each other, and it may have been more of a chat between former colleagues than one about present business or future opportunities to collaborate. But who knows.


 
Reactions: 11 users

Tony Coles

Regular
And what does it mean for Brainchip for example?

Clean up all these low-life shorts off the ASX, which works hand in hand with the shorters. Scum bags!
 
Reactions: 7 users

TECH

Regular
Issuing shares to pay bills may not be a good look, but it's a good sign that the contractor was prepared to accept shares.

... and obviously the contractor has friends.
We have seen this playbook before, and quite frankly it's a very positive sign when a contractor accepts securities in lieu of cash. From my reckoning it's the same contractor, namely one who can spit us out as a by-product. And why? Because they know too well how we snuggle up to each other to make exceptional music, I mean technology, together. Both parties bring something special to the table, and that's why a little Aussie battler was the chosen one. Standby: June 2026 is my evaluation month, and we are right on point, in my humble opinion of course.

Love AKD Tech ❤️
 
Reactions: 8 users

7für7

Top 20
Clean up all these low life shorts of the asx, which the asx work simultaneously with shorting. Scum bags!

Would be a relief!
 
Reactions: 1 users