Frangipani
Researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeffrey Krichmar, have been experimenting with AKD1000:
CARL: Home Page (sites.socsci.uci.edu)
CARLsim: GPU-accelerated Spiking Neural Network (SNN) simulator (sites.socsci.uci.edu)
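For background on what CARLsim actually simulates: its networks are built from Izhikevich point neurons. Below is a minimal, self-contained Python sketch of a single Izhikevich regular-spiking neuron, using the standard parameters and integration scheme from Izhikevich's 2003 paper. This is just the textbook model for illustration, not CARLsim's C++ API:

```python
# Minimal Izhikevich "regular spiking" neuron (the neuron model CARLsim networks
# are built from), integrated in 1 ms steps. Parameters follow Izhikevich (2003).
def izhikevich_spikes(I=10.0, t_ms=1000):
    a, b, c, d = 0.02, 0.2, -65.0, 8.0   # regular-spiking cortical neuron
    v = -65.0                            # membrane potential (mV)
    u = b * v                            # recovery variable
    spikes = []
    for t in range(t_ms):
        # two 0.5 ms half-steps for v, as in the original reference code
        for _ in range(2):
            v += 0.5 * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
        u += a * (b * v - u)
        if v >= 30.0:                    # spike: record time and reset
            spikes.append(t)
            v, u = c, u + d
    return spikes
```

With a constant input current of I = 10, this neuron fires tonically, which is the kind of spiking activity CARLsim simulates at scale on GPUs.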
GitHub - UCI-CARL/CARLsimPP (github.com)
This is the paper I linked in my previous post, co-authored by Lars Niedermeier, a Zurich-based IT consultant, and the above-mentioned Jeff Krichmar from UC Irvine.
The two of them have co-authored three papers in recent years, including one in 2022 with another UC Irvine professor and CARL team member, Nikil Dutt (https://ics.uci.edu/~dutt/), as well as with Anup Das from Drexel University, whose endorsement of Akida is quoted on the BrainChip website:
Lars Niedermeier’s and Jeff Krichmar’s April 2024 publication on CARLsim++ (which does not mention Akida) ends with the following conclusion and an acknowledgement that their work was supported by the Air Force Office of Scientific Research (funding that has been ongoing since at least 2022) and by a UCI Beall Applied Innovation Proof of Product Award (https://innovation.uci.edu/pop/); they also thank the regional NSF I-Corps (Innovation Corps) for valuable insights.
Their use of an E-Puck robot (https://en.m.wikipedia.org/wiki/E-puck_mobile_robot) for their work reminded me of our CTO’s address at the AGM in May, during which he envisioned the following object (from the 22:44 min mark):
“Imagine a compact device similar in size to a hockey puck that combines speech recognition, LLMs and an intelligent agent capable of controlling your home’s lighting, assisting with home repairs and much more. All without needing constant connectivity or having to worry about privacy and security concerns, a major barrier to adaptation, particularly in industrial settings.”
Possibly something in the works here?
The version the two authors were envisioning in their April 2024 paper is, however, conceptualised as being available as a cloud service:
“We plan a hybrid approach to large language models available as cloud service for processing of voice and text to speech.”
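In the simplest case, the hybrid approach the authors describe (local processing on the device, a cloud LLM for open-ended requests) could come down to a routing policy along these lines. Everything here, the function name, the intent labels, the heuristic, is my own illustrative sketch, not anything from the paper:

```python
# Hypothetical sketch of a hybrid edge/cloud routing policy: simple device-level
# intents stay on-device (low latency, private), open-ended requests go to a
# cloud LLM when connectivity allows, and everything stays local when offline.
LOCAL_INTENTS = {"lights_on", "lights_off", "set_timer"}  # illustrative only

def route(intent: str, cloud_available: bool) -> str:
    """Return 'local' or 'cloud' for a recognized intent."""
    if intent in LOCAL_INTENTS:
        return "local"   # privacy- and latency-sensitive: never leaves the device
    # open-ended request: use the cloud LLM if reachable, degrade gracefully if not
    return "cloud" if cloud_available else "local"
```

A design like this is what would distinguish the authors' cloud-assisted version from the fully offline "hockey puck" our CTO described: the routing decision, not the model, determines what leaves the device.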
The authors gave a tutorial on CARLsim++ at NICE 2024, where our CTO Tony Lewis was also presenting. Maybe they had a fruitful discussion at that conference in La Jolla, which resulted in UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL) team experimenting with AKD1000, as evidenced in the video uploaded a couple of hours ago that I shared in my previous post?
NICE 2024 - Agenda (flagship.kip.uni-heidelberg.de)
GCtronic (www.gctronic.com)
About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with AKD1000 mounted on an E-Puck2 robot.
The April 2024 paper I linked to at the time (“An Integrated Toolbox for Creating Neuromorphic Edge Applications”), co-authored by Lars Niedermeier (Niedermeier Consulting, Zurich) and Jeff Krichmar (UC Irvine), did not yet contain a reference to Akida, but has recently been updated to a newer version (Accepted Manuscript online 22 January 2025). It now has heaps of references to AKD1000 and describes how it was used for visual object detection and classification.
Nikil Dutt, one of Jeff Krichmar’s colleagues at UC Irvine and also a member of the CARL team, contributed to this Accepted Manuscript version as an additional co-author.
iopscience.iop.org
What caught my eye was that the researchers, who had used an AKD1000 PCIe Board (with an engineering sample chip) as part of their hardware stack, had already gotten their hands on an Akida M.2 form factor as well, even though BrainChip’s latest offering wasn’t officially revealed until January 8th at CES 2025:
“For productive deployments, the Raspberry Pi 5 Compute Module and Akida M.2 form factor were used.” (page 9)
Maybe thanks to Kristofor Carlson?
Kristofor Carlson was a postdoc at Jeff Krichmar’s Cognitive Robotics Lab a decade ago and co-authored a number of research papers with both Jeff Krichmar and Nikil Dutt over the years, the last one published in 2019:
Here are some pages from the Accepted Manuscript version:
We already knew from the April 2024 version of that paper that…
their work was supported by the Air Force Office of Scientific Research (funding ongoing since at least 2022) and by a UCI Beall Applied Innovation Proof of Product Award (https://innovation.uci.edu/pop/), and that they also thank the regional NSF I-Corps (Innovation Corps) for valuable insights.
And finally, here’s a close-up of the photo on page 9: