Food4 thought
Regular
Down 4 cents from its high
What about an enquiry into that?
Look at that tablecloth..... So proud of where we have evolved from!!! Apologies if posted already.
View attachment 91772
#brainchip #edgeai #neuromorphiccomputing #lowpowerai #embeddedai #aihardware #fomoad #eventrecap #imagine2025 #edgeimpulse #raspberrypi #akida | BrainChip
BrainChip made an impact at Edge Impulse Imagine 2025 in Mountain View last week, where we showcased real-time anomaly detection at ultra-low power. We demonstrated Akida 2.0 FPGA running Edge Impulse’s FOMO-AD model alongside a Raspberry Pi paired with the AKD1500, bringing edge intelligence...
www.linkedin.com
Yeah, that convenient 3 million shares @ 22 cents tried putting the genie back in the bottle. Interesting to see where we close...
Edit: (This was at 4.07pm. Indicated close was 22.5 cents.)
View attachment 91774
Make some money today, did you? Oh well. It was fun and exciting to see nonetheless.
EDHPC25, the European Data Handling & Processing Conference for space applications organised by ESA, is less than three weeks away.
Turns out BrainChip will not only be exhibiting, but that Gilles Bézard and Alf Kuchenbuch will also be presenting a two-part tutorial on 13 October:
EDHPC 2025 - 2nd European Data Handling & Data Processing Conference
EDHPC 2025 The second European Data Handling & Data Processing Conference – EDHPC 2025 – will be held from the 13th to the 17th of October 2025 in Elche, Spain. It is organised by the European Space Agency (ESA), with the support of the local tourist office. Find the latest information on the...
indico.esa.int
View attachment 91405
Adam Taylor from Adiuvo Engineering, who recently posted about ORA, “a comprehensive sandbox platform for edge AI deployment and benchmarking” (whose edge devices available for evaluation also include Akida), will be presenting concurrently.
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-474283
Three days later, on 16 October, ESA’s Laurent Hili will be chairing a session on AI Engines & Neuromorphic, which will include a presentation on Frontgrade Gaisler’s GRAIN product line, whose first device, the GR801 SoC - as we know - will combine Frontgrade Gaisler’s NOEL-V RISC-V processor and Akida.
View attachment 91406
View attachment 91408
The session’s last speaker will be AI vision expert Roland Brochard from Airbus Defense & Space Toulouse, who has been involved in the NEURAVIS proposal with us for the past 18 months or so:
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-429615
View attachment 91409
In April, Kenneth Östberg’s poster on GRAIN had revealed what NEURAVIS actually stands for: Neuromorphic Evaluation of Ultra-low-power Rad-hard Acceleration for Vision Inferences in Space.
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-459275
View attachment 91410
It appears highly likely that Roland Brochard will be giving a presentation on said NEURAVIS proposal in his session, given that a paper co-authored by 15 researchers from all consortium partners involved (Airbus Toulouse, Airbus Ottobrunn, Frontgrade Gaisler, BrainChip, Neurobus, plus ESA) was uploaded to the conference website.
This paper has the exact same title as Roland Brochard’s EDHPC25 presentation, namely “Evaluation of Neuromorphic computing technologies for very low power AI/ML applications” (which is also almost identical to the title of the original ESA ITT, except that the words “…in space” are missing, cf. my July 2024 post above):
View attachment 91411
Full paper in next post due to the upload limitation of 10 files per post…
The revised GIASaaS (Global Instant Satellite as a Service, formerly Greek Infrastructure for Satellite as a Service) concept website by OHB Hellas is finally back online - and Akida now no longer shows up as UAV1 (see my post above) but as one of three SATs! 🛰
AKIDA, the envisioned SAT with the AKD1000 PCIe Card onboard, is slated to handle both Object Detection and Satellite Detection, while the planned KRIA and CORAL satellites (equipped with a Xilinx KRIA KV260 Vision AI Starter Kit and a Google Coral TPU Dev Board, respectively) are tasked with Vessel Detection, Cloud Detection and Fire Detection (for some reason OHB Hellas removed Flood Detection from the list of applications).
Please note that this is currently a concept website only.
The idea behind GIASaaS as well as mock 3D-printed versions of the yet-to-be-launched satellites were presented at DEFEA 2025 - Defense Exhibition Athens earlier this month, as @Stable Genius had spotted on LinkedIn:
GIASaaS by OHB Hellas
giasaas.eu
View attachment 84868 View attachment 84869 View attachment 84870
View attachment 84867
Yep, 3 million shares and the owners all had a change of heart right on the bell and decided not to sell.
Which of you guys or gals had a lazy $660,000 just lying around under your sofa cushions?
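For anyone sanity-checking that number (a trivial sketch; the 3 million shares and the 22-cent price are from the posts above):

```python
# Back-of-the-envelope check of the parcel's notional value.
shares = 3_000_000
price_aud = 0.22              # 22 cents per share
value = shares * price_aud
print(f"${value:,.0f}")       # -> $660,000
```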
I saw a theory that the whole thing was a p&d to let the shorters out and maybe reset.
The way that supply came on was very much like turning on a tap, wasn't it?
Thank goodness the regulators are on the ball, asking the difficult questions and diligently applying their trading halts.
And what does it mean for Brainchip for example?

Afternoon Chippers,
Riveting news.... ASX will start to lose its power over listed companies,
THANK F£€K FOR THAT.
Regards,
Esq.
About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with AKD1000 mounted on an E-Puck2 robot.
The April 2024 paper I linked to at the time (“An Integrated Toolbox for Creating Neuromorphic Edge Applications”), co-authored by Lars Niedermeier (Niedermeier Consulting, Zurich) and Jeff Krichmar (UC Irvine), did not yet contain a reference to Akida, but has recently been updated to a newer version (Accepted Manuscript online 22 January 2025). It now has heaps of references to AKD1000 and describes how it was used for visual object detection and classification.
Nikil Dutt, one of Jeff Krichmar’s colleagues at UC Irvine and also member of the CARL team, contributed to this Accepted Manuscript version as an additional co-author.
What caught my eye was that the researchers, who had used an AKD1000 PCIe Board (with an engineering sample chip) as part of their hardware stack, had already gotten their hands on an Akida M.2 form factor as well, even though BrainChip’s latest offering wasn’t officially revealed until January 8th at CES 2025:
“For productive deployments, the Raspberry Pi 5 Compute Module and Akida.M2 form factor were used.” (page 9)
Maybe thanks to Kristofor Carlson?
Here are some pages from the Accepted Manuscript version:
View attachment 76552
View attachment 76553
View attachment 76554
View attachment 76558
View attachment 76556
View attachment 76557
We already knew from the April 2024 version of that paper that…
And finally, here’s a close-up of the photo on page 9:
View attachment 76555
Just an afterthought…
Academic research utilising Akida shouldn’t generally be underestimated or dismissed as mere playtime in an ivory tower.
Some of these researchers have excellent connections to big players in the industry and/or to government agencies and sometimes even prior work experience in relevant sectors themselves - hence their recommendations would likely be given quite a bit of weight.
Take Jeff Krichmar, for example, whose 27-page (!) CV can be found on his LinkedIn profile.
Krichmar’s first job after graduating with a Bachelor in Computer Science (and before going to grad school to pursue his Master’s) was that of a software engineer at Raytheon Corporation (now RTX), working on the PATRIOT surface-to-air missile system - a position, which also saw him become a consultant to the Japanese Self-Defense Forces from 1988-1989, while deployed to Mitsubishi Heavy Industries in Nagoya (which to this day is manufacturing PATRIOT missiles for domestic use under license from RTX and Lockheed Martin).
View attachment 76748
Over the years, he has received quite a bit of funding from the defence-related sector, mostly from the US government, but also from Northrop Grumman.
View attachment 76751
In 2015 he gave an invited talk at Northrop Grumman…
View attachment 76752
… and he was co-author of a paper published in November 2016, whose first author, his then graduate student Tiffany Hwu, was a Basic Research Systems Engineer Intern with Northrop Grumman at the time. (“This work was supported by the National Science Foundation Award number 1302125 and Northrop Grumman Aerospace Systems.”)
The neuromorphic hardware used for the self-driving robot was unsurprisingly IBM’s TrueNorth, as this was then the only neuromorphic chip around - Loihi wasn’t announced until September 2017.
View attachment 76756
One of the paper’s other co-authors was a former postdoctoral student of Krichmar’s, Nicolas Oros, who had started working for BrainChip in December 2014 - on his LinkedIn profile it says he was in fact our company’s first employee! He is also listed as co-inventor of the Low power neuromorphic voice activation system and method patent alongside Peter van der Made and Mouna Elkhatib.
Nicolas Oros left BrainChip in February 2021 and is presently a Senior Product Manager at Aicadium, “leading the development of a computer vision SaaS product for visual inspection”. I don’t think we’ve ever looked into them?
View attachment 76754
View attachment 76755
By the time of said paper’s publication, Jeff Krichmar had become a member of BrainChip’s Scientific Advisory Board - see this link of an April 2016 BRN presentation, courtesy of @uiux:
View attachment 76753
As mentioned before, Kristofor Carlson is another of Jeff Krichmar’s former postdoctoral students (from 2011-2015), who co-authored a number of research papers with Jeff Krichmar and Nikil Dutt (both UC Irvine) over the years - the last one published in 2019.
In September, Kris Carlson gave a presentation on TENNs at UC Irvine, as an invited speaker at SAB 2024: From Animals to Animats - 17th International Conference on the Simulation of Adaptive Behavior.
View attachment 76815
Kris Carlson’s September 2024 conference talk on TENNs, the CARL lab’s recent video and paper featuring an E-Puck2 robot with an Akida PCIe Board mounted on top, and the additional info in the 22 January 2025 paper that CARL researchers had already experimented with the brand-new AKD1000 M.2 form factor are ample evidence of continued interest in what BrainChip is doing from Jeff Krichmar’s side.
Academic researchers like him could very well be door openers to people in charge of other entities’ research that will result in meaningful revenue one day…
Evening 7fur7,
Bugger if I know , though competition normally makes the old incumbent smarten up their game.
* Of note, presently around one third to half the volume traded in our stock, and others, is already traded through CBOE.
Cboe Global Markets https://share.google/RYbHP4ob0dsslRc1J
* If nothing else..... shareholders will be able to simultaneously trade Pork Bellies, Swiss Francs & Shares in their respective companies all on one platform.
Regards,
Esq.
I'll send it over now to IR.... DUN

Can someone please pass this on to Tony Lewis ASAP?! *
*I'm referring to the Social Media Marketing Intern. Does anyone know if they've started yet?
The US Navy is looking for solutions involving automatic target recognition and multi-object tracking from maritime helicopters, with special interest in drone swarms / multiple vessels, high-clutter sea backgrounds and bad weather. Responses are due by 31 October.
Sounds to me like it maps perfectly to Akida + Prophesee's value propositions (i.e. ultra-low-power, event-driven vision, fast reaction, on-device/continuous learning for contested, bandwidth-limited environments).
Event-based cameras only emit changes (not full frames), so Akida processes sparse, motion-centric data (think TENNs) with very low latency and power which would be ideal where size/weight/power are constrained and the background is constantly shimmering water.
Akida + Prophesee has already been flown on low-power drones for maritime search-and-rescue style detection, demonstrating detection and tracking from moving platforms over water.
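To illustrate the sparsity point above with a toy example (plain Python, nothing to do with the actual Akida or Prophesee stack; the brightness values and threshold are made up): a frame camera re-sends every pixel each frame, while an event camera emits only the pixels whose brightness changed beyond a threshold.

```python
def events_between(frame_a, frame_b, threshold=10):
    """Return (row, col, polarity) for each pixel whose brightness changed enough."""
    events = []
    for r, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(b - a) >= threshold:
                events.append((r, c, 1 if b > a else -1))
    return events

# 8x8 "sea" background at constant brightness; a 2-pixel "target" moves one column.
W = H = 8
frame1 = [[100] * W for _ in range(H)]
frame2 = [[100] * W for _ in range(H)]
frame1[3][2] = frame1[3][3] = 200   # target at columns 2-3
frame2[3][3] = frame2[3][4] = 200   # target shifted to columns 3-4

evts = events_between(frame1, frame2)
print(f"{len(evts)} events vs {W * H} pixels in a dense frame")  # -> 2 events vs 64 pixels
```

Only the target's leading and trailing edges fire; the static background produces nothing, which is why the downstream network has so little to chew on per timestep.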
View attachment 91776
Navy eyes AI to track adversarial drone swarms, vessels from maritime helicopters
Responses to a new request for information about the technology are due to the sea service by the end of October.
October 6, 2025
The Navy is looking for automatic target recognition and tracking capabilities — and particularly, those that can stalk drone swarms or multiple vessels simultaneously — to deploy in operations involving its maritime helicopters.
Companies interested in supplying such products are invited to respond to a new request for information from Naval Surface Warfare Center Crane by Oct. 31.
“The ideal background for these solutions would be from air to ocean as surface or air to air with sea state as background utilizing different sensor types. Of special interest would be Multiple Object Tracking such as multiple vessels and/or Swarms of Unmanned Systems,” officials wrote in the RFI.
“Ideally, the solutions would be capable of detection, identification, and maintaining unique track IDs within high clutter backgrounds and obscured conditions such as fog, rain and wind, rotor wash, variety of sea states, etc.,” they noted.
Respondents are asked to include an overview of their algorithms and approach — as well as the sorts of targets and environments they’ve been designed for or tested with. The Navy also wants details about the types of sensor data currently in use and whether companies can provide suitable labeled training datasets to apply in operations.
“Please include information on if/how targets and tracking are presented to a supervising human,” officials wrote.
The RFI did not include information about the geographic combatant commands or contemporary operations that the Navy envisions adopting these capabilities for.
Its publication comes at a time when the sea service and broader U.S. military face disruptive threats from drone swarms like those already emerging in places like Ukraine and the Middle East region, and the Trump administration wages a new “war” against alleged drug-smuggling vessels in and around the Caribbean.
defensescoop.com
#gsa #whereleadersmeet #semiconductor #technology #ai #globalcollaboration | Global Semiconductor Alliance
That’s officially a wrap on the 2025 GSA U.S. Executive Forum!💥 This afternoon brought together some of the most powerful voices shaping the semiconductor industry — visionaries, innovators, and executives who are leading the charge in technology, growth, and global impact. A special thank you...
www.linkedin.com
View attachment 91473
And what does it mean for Brainchip for example?
We have seen this playbook before, and quite frankly it's a very positive sign when a contractor accepts securities in lieu of cash. By my reckoning it's the same contractor, namely one who can spit us out as a by-product, and why... because they know too well how we snuggle up to each other to make exceptional music, I mean technology, together. Both parties bring something special to the table, and that's why a little Aussie battler was the chosen one. Standby, June 2026 is my evaluation month and we are right on point, in my humble opinion of course.

Issuing shares to pay bills may not be a good look, but it's a good sign that the contractor was prepared to accept shares.
... and obviously the contractor has friends.
Clean up all these low-life shorters off the ASX, which works hand in glove with the shorting. Scumbags!