KyrieIrving
Emerged
Is the Chinese DeepSeek a threat to Akida??
Good to see Mercedes talking about this again. Hopefully no one bothers them on LinkedIn.
The intelligent vehicle functionalities of the future call for pioneering new algorithms and hardware.
That’s why Mercedes-Benz is researching artificial neural networks that could radically increase the speed and energy efficiency of complex AI computations.
This could revolutionise future driving assistance and safety systems by overcoming the limits of today’s computing hardware.
For instance, conventional computing requires up to 3000 watts for advanced automated driving functions.
In future, neuromorphic computing could do this with just 300 watts, by mimicking the human brain with so-called “event-based processing”.
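To make the 300 W vs. 3,000 W claim a bit more concrete, here's a tiny Python sketch of the idea behind event-based processing: only the inputs that actually change (the "events") trigger any computation. The sparsity figure and layer sizes below are made-up assumptions for illustration, not numbers from Mercedes or BrainChip.

```python
import numpy as np

# Illustration of "event-based processing": a frame-based pipeline processes
# every input element every cycle, while an event-based pipeline only does work
# for non-zero inputs (events). The 95% sparsity below is an assumed value.
rng = np.random.default_rng(0)
n_inputs, n_neurons = 1024, 256
weights = rng.standard_normal((n_inputs, n_neurons))

frame = rng.standard_normal(n_inputs)
frame[rng.random(n_inputs) < 0.95] = 0.0   # assume ~95% of inputs unchanged -> no event

dense_ops = n_inputs * n_neurons            # every input contributes, event or not
events = np.flatnonzero(frame)              # indices that actually fired
event_ops = events.size * n_neurons         # only events propagate to the next layer

# Both paths give the same result; the event-based one simply skips the zeros.
dense_out = frame @ weights
event_out = frame[events] @ weights[events]
assert np.allclose(dense_out, event_out)

print(f"dense ops: {dense_ops}, event-driven ops: {event_ops}, "
      f"saving: {dense_ops / event_ops:.1f}x")
```

Obviously real power draw depends on far more than multiply-accumulate counts, but skipping the work for everything that didn't change is where the claimed order-of-magnitude saving comes from.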
The darkest hour is before the sunrise
So… “by mimicking the human brain with so-called ‘event-based processing’.”
“AI stocks getting hammered today. Get ready for a red day 🫡”
NVIDIA is already having a deep red day.
Maybe so, but who is going to be the savior in the end? BrainChip, and whoever uses our technology, will only see their share price rise; the proof is in the pudding after today.
Hoping to close out your short position today?
This DeepSeek thing is a huge development.
US tech stocks steady as Nvidia shares pick up after DeepSeek shock - live updates (www.bbc.com)
Shares for leading US chip firm Nvidia dropped by almost 17% on Monday after the emergence of DeepSeek stunned Silicon Valley.
Sean seems very confident – I think for good reason. Chat with Sean in the link below:
ipXchange on LinkedIn: #ces2025 #brainchip #ai #edgecomputing #neuromorphicai #innovation…
Breaking Boundaries in AI at CES 2025! 🌟 At CES 2025, we sat down with BrainChip's CEO, Sean Hehir, to dive into the future of AI and edge computing. From… (www.linkedin.com)
About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with an AKD1000 mounted on an E-Puck2 robot.
The April 2024 paper I linked to at the time (“An Integrated Toolbox for Creating Neuromorphic Edge Applications”), co-authored by Lars Niedermeier (Niedermeier Consulting, Zurich) and Jeff Krichmar (UC Irvine), did not yet contain a reference to Akida, but has recently been updated to a newer version (Accepted Manuscript online 22 January 2025). It now has heaps of references to AKD1000 and describes how it was used for visual object detection and classification.
Nikil Dutt, one of Jeff Krichmar’s colleagues at UC Irvine and also a member of the CARL team, contributed to this Accepted Manuscript version as an additional co-author.
What caught my eye was that the researchers, who had used an AKD1000 PCIe Board (with an engineering sample chip) as part of their hardware stack, had already gotten their hands on an Akida M.2 form factor as well, even though BrainChip’s latest offering wasn’t officially revealed until January 8th at CES 2025:
“For productive deployments, the Raspberry Pi 5 Compute Module and Akida.M2 form factor were used.” (page 9)
Maybe thanks to Kristofor Carlson?
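For anyone who hasn't handled the hardware: getting a converted model running on an AKD1000, whether on the PCIe card or the new M.2 module, boils down to a few lines with BrainChip's akida Python package. The sketch below is my own illustration, not code from the CARL paper or their toolbox; the model file name and input shape are placeholders, and the exact calls may differ between akida/MetaTF versions.

```python
import numpy as np
import akida  # BrainChip's runtime, installable via "pip install akida"

# Illustrative sketch only. "model.fbz" is a hypothetical placeholder for a
# model previously converted with BrainChip's CNN2SNN/MetaTF tooling, and the
# 224x224x3 input shape is an assumption, not something from the CARL paper.

devices = akida.devices()            # enumerate attached Akida devices (PCIe card or M.2 module)
print("Available Akida devices:", devices)

model = akida.Model("model.fbz")     # load the converted network
if devices:
    model.map(devices[0])            # map it onto the hardware; otherwise it runs in software

# Dummy uint8 "camera frame" standing in for real sensor input (batch of 1).
frame = np.random.randint(0, 256, size=(1, 224, 224, 3), dtype=np.uint8)
outputs = model.forward(frame)       # run inference; output shape depends on the model
print("Output shape:", outputs.shape)
```

As far as I understand, the same script falls back to software simulation when no Akida device is present, which would make swapping from the PCIe engineering board to the M.2 module fairly painless.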
Here are some pages from the Accepted Manuscript version:
We already knew from the April 2024 version of that paper that…
And finally, here’s a close-up of the photo on page 9:
I'm 113k shares deep, my first buy was back in 2020, and I haven't sold since – so nah, but bad try though.