> Hey Dio,
> Or just suspecting that is the case, or was it confirmed somewhere that Akida 2 tapeout was pulled due to somebody else undertaking it?

There was a statement to that effect at the time - then it all went dark ...
> Much of the model building was probably done in conjunction with Edge Impulse, who have developed a method of automatically classifying items added to a model - as I understand it (I may be wrong, there's always a first time), basically using one-shot learning from pre-existing models. This may relate to "based on our own TENNs family of models".

Didn't we drop that idea so that BRN would not be seen as a competitor to any future buyers of the IP?
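For anyone curious what "one-shot learning from pre-existing models" might look like mechanically, here's a toy sketch of the general idea - my own guess, not Edge Impulse's actual pipeline. The `embed` function and class names are invented for illustration: a frozen pretrained backbone acts as a feature extractor, and a new class is added from a single example by storing its embedding.

```python
# Toy sketch of one-shot classification from a pre-existing model.
# Not Edge Impulse's pipeline - just the general idea: reuse a frozen,
# pretrained backbone as a feature extractor and add new classes from
# a single example each, with no retraining.
import numpy as np

def embed(x: np.ndarray) -> np.ndarray:
    """Stand-in for a frozen pretrained backbone (e.g. a CNN or TENNs
    feature extractor). Here it's just a fixed random projection."""
    rng = np.random.default_rng(0)            # fixed "weights"
    W = rng.standard_normal((x.size, 32))
    v = x.ravel() @ W
    return v / np.linalg.norm(v)              # unit-length embedding

class OneShotClassifier:
    def __init__(self):
        self.prototypes: dict[str, np.ndarray] = {}

    def add_class(self, label: str, example: np.ndarray) -> None:
        """'One-shot' enrolment: store a single embedded example."""
        self.prototypes[label] = embed(example)

    def classify(self, x: np.ndarray) -> str:
        """Nearest stored prototype by cosine similarity."""
        q = embed(x)
        return max(self.prototypes, key=lambda k: float(self.prototypes[k] @ q))

clf = OneShotClassifier()
clf.add_class("cat", np.random.rand(8, 8))
clf.add_class("dog", np.random.rand(8, 8))
print(clf.classify(np.random.rand(8, 8)))     # -> "cat" or "dog"
```

The point is that adding a class is just storing one embedding - no retraining - which is how items could be "automatically classified" as they're added to a model.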
@TECH, you referred to Akida 2 silicon - remember when Anil announced the tapeout, and then this was quietly shelved in favour of some mysterious "other" entity doing the job?
Drum roll ... !!!
The curtains part to reveal ... ???
... any day now ...
> No podcast, and things seem extremely quiet on any real news from BRN (which is normal anyway), so are we in a blackout period, about to get our surprise after 3 years of promises? Are we in for a real price-sensitive announcement any time soon? And then a release of the podcast to go with the announcement, maybe.
> How lucky can we be?
> [Attachment 73761]

Maybe I was right
> Didn't we drop that idea so that BRN would not be seen as a competitor to any future buyers of the IP?

Tapeout does not relate to FPGAs. This was the real McCoy SoC. FPGAs have all the logic circuits pre-installed, and the interconnections can be adjusted in the field. Why waste money on the FPGA version if there is someone else on the job? So to speak.
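For readers new to the distinction Dio is drawing: an FPGA's gates are generic lookup tables whose truth tables and routing get loaded in the field from a bitstream, whereas a tapeout bakes the logic into the mask set permanently. A crude software analogy (mine, nothing vendor-specific):

```python
# Crude analogy for why an FPGA is "adjustable in the field": the chip is a
# grid of generic lookup tables (LUTs); a bitstream just fills in their
# truth tables and wiring. Re-"programming" = loading new table contents.
class LUT2:
    """A 2-input lookup table: four stored bits define any 2-input gate."""
    def __init__(self, truth_table: list[int]):
        assert len(truth_table) == 4
        self.bits = truth_table

    def __call__(self, a: int, b: int) -> int:
        return self.bits[(a << 1) | b]

AND = LUT2([0, 0, 0, 1])   # "configure" the same hardware as an AND gate...
XOR = LUT2([0, 1, 1, 0])   # ...or reload it as an XOR - no new silicon needed

# Wiring two LUTs into a half adder (sum, carry) - routing is also just config.
def half_adder(a: int, b: int) -> tuple[int, int]:
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))    # -> (0, 1)
```

An SoC tapeout, by contrast, freezes those gates and wires into the masks for good - hence "the real McCoy".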
> "An impressive number of beyond SOTA models for edge inference ... based on our own TENNs family of models."
>
> I read this as "We have a whole lot of customers lined up and working with us to adapt their, probably CNN, models to run on Akida2/TENNs."
>
> "I think we might be able to schedule a funeral for CNN models for temporal processing."
>
> Ubiquitous!

Yeah, I like the line about a funeral for CNN too. So he (or BrainChip) is pumped. What does that funeral for CNN mean now, @Diogenese?
> Maybe I was right

Why wasn't this price-sensitive, I wonder?

BrainChip on LinkedIn: BrainChip Awarded Air Force Research Laboratory Radar Development Contract
BrainChip, the world's first commercial producer of ultra-low power, fully digital, event-based, neuromorphic AI, today announced that it was awarded a… (www.linkedin.com)
ONE More time for clarity and celebration … ! One step at a time !!!
[Attachment 74021]
> Yeah, I like the line about a funeral for CNN too. So he (or BrainChip) is pumped. What does that funeral for CNN mean now, @Diogenese?

He specifically mentions "CNN models".
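On why the funeral is specifically for CNNs in temporal processing: a frame-based temporal CNN has to buffer and re-read a whole window of input at every time step, while a recurrent/state-space style model - which, as I understand it, is roughly the family TENNs sits in - carries a small state forward and touches each new sample once. A toy comparison, purely illustrative and not BrainChip's actual architecture:

```python
# Toy comparison: temporal CNN vs a recurrent/state-space style update.
# Purely illustrative - not BrainChip's actual TENNs architecture.
import numpy as np

KERNEL = np.array([0.5, 0.3, 0.2])   # 3-tap temporal filter

def cnn_step(buffer: list[float]) -> float:
    """Frame-buffer style: re-reads the whole window every time step."""
    return float(np.dot(KERNEL, buffer[-3:]))     # O(window) work per step

def recurrent_step(state: float, x: float, decay: float = 0.6) -> float:
    """Streaming style: O(1) work and O(1) memory per incoming sample."""
    return decay * state + (1.0 - decay) * x

stream, buffer, state = [0.1, 0.9, 0.4, 0.7], [0.0, 0.0, 0.0], 0.0
for x in stream:
    buffer.append(x)
    y_cnn = cnn_step(buffer)          # needs the stored window
    state = recurrent_step(state, x)  # needs only the previous state
    print(f"x={x:.1f}  cnn={y_cnn:.3f}  recurrent={state:.3f}")
```

Per-sample state updates are also a natural fit for event-driven hardware like Akida: when nothing changes, nothing needs recomputing.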
> This announcement is, in my view, even more significant than when we mentioned NASA or when Mercedes referenced us a few years ago. As usual, the market seems slow to fully appreciate today's surprise announcement.
>
> 1. This announcement represents a tangible contract backed by funding, signaling a direct entry point with immediate financial impact.
> 2. The collaboration with AFRL has the potential to lead to broader engagements with the DoD. However, realizing this opportunity depends on executing the partnership effectively.
> 3. Lastly, the first move has been made; we should now (fingers crossed) see more follow.
>
> This is a pivotal step forward for BrainChip, both financially and strategically. Bring on 2025!

Normally takes 24 hours to hit everyone.
> [Attachment 74035]
> HAW research cooperation with Mercedes-Benz AG is revolutionizing autonomous driving: Karlsruhe University of Applied Sciences (HKA) is further developing complex camera technologies in neuromorphic computing. New "event cameras" are at the heart of the HKA's #EVSC project, and they provide a dramatic improvement in temporal resolution. Unlike conventional cameras, event cameras dynamically perceive changes in their field of vision instead of taking an image at fixed intervals. This allows them to supply the autonomous driving system with new information within a few microseconds, while conventional image sensors are temporarily "blind".
>
> With the project cooperation, Karlsruhe University of Applied Sciences is consistently pursuing another step in helping to shape the future of autonomous driving, because Prof. Dr. Jan Bauer is sure that the use of event cameras will significantly expand the ability of autonomous vehicles to capture the environment faster and more accurately. "And with the EVSC project, we can significantly improve the integration capability of this new camera technology into the vehicle," says the HKA researcher.

Event cameras are great when the camera is stationary, but there's a whole lot of complexity when the camera is moving. In-cabin use may be ok except for the windows, but for the exterior you'd need some clever algorithms to account for camera motion.
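For anyone wondering where the "few microseconds" figure comes from: each pixel in an event camera independently fires an event whenever its log-brightness changes by a threshold, with no global frame clock to wait for. A minimal single-pixel simulation of the principle (toy numbers, no particular sensor):

```python
# Minimal simulation of the event-camera principle: a pixel emits an event
# only when its (log) brightness changes by a threshold, instead of being
# sampled every frame. Toy values, not any particular sensor.
import math

THRESHOLD = 0.2   # log-intensity change needed to trigger an event

def pixel_events(samples, threshold=THRESHOLD):
    """Yield (time, polarity) events for one pixel's intensity over time."""
    ref = math.log(samples[0])
    for t, intensity in enumerate(samples[1:], start=1):
        level = math.log(intensity)
        while abs(level - ref) >= threshold:
            polarity = 1 if level > ref else -1   # brighter / darker
            ref += polarity * threshold           # update reference level
            yield (t, polarity)

# Steady light produces *no* output at all; only the changes do.
intensity = [1.0, 1.0, 1.0, 2.5, 2.5, 1.0]
print(list(pixel_events(intensity)))   # events cluster at t=3 and t=5
```

Note that the steady stretches produce no events at all - that's the same sparsity an event-based processor like Akida is built to exploit.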
> Event cameras are great when the camera is stationary, but there's a whole lot of complexity when the camera is moving. [...]

You mean clever by akidalicous?
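To unpack the "clever algorithms" bit: when the camera itself moves, every pixel sees brightness change, so genuine scene motion drowns in ego-motion events. One family of fixes warps each event back by the camera's own measured motion (e.g. from an IMU) before accumulating. A rough sketch, assuming the only motion is a known horizontal pan and with every constant invented for illustration - real methods such as contrast maximisation are far more involved:

```python
# Rough sketch of ego-motion compensation for event cameras, assuming the
# only camera motion is a known horizontal pan (e.g. measured by an IMU).
import numpy as np

WIDTH, HEIGHT = 8, 8
PAN_PX_PER_SEC = 100.0     # camera pan speed, pixels/second (assumed known)

def accumulate(events, compensate: bool) -> np.ndarray:
    """Sum events into an image, optionally undoing camera motion first."""
    img = np.zeros((HEIGHT, WIDTH))
    for t, x, y, polarity in events:
        if compensate:
            x = int(round(x - PAN_PX_PER_SEC * t))   # warp back to t = 0
        if 0 <= x < WIDTH:
            img[y, x] += polarity
    return img

# A stationary object seen from a panning camera: its events drift across x.
events = [(t, round(2 + PAN_PX_PER_SEC * t), 4, +1)
          for t in (0.00, 0.01, 0.02, 0.03)]

# Uncompensated, the object smears over four columns; compensated, it stacks
# in one column, and anything left over would be genuine scene motion.
print(accumulate(events, compensate=False)[4])   # smeared: 1s in cols 2-5
print(accumulate(events, compensate=True)[4])    # sharp: a 4 in col 2
```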
> Maybe I was right

Finally someone with a working ...

All the work the BRN team have been engaged in to this point has been for this purpose....
The switch from CNN to TENNs + Akida is in our future.
A turning point in our company's future, I suspect.
> Let's hope so. I was meant to be filthy rich by now!

Me too!!