Yes please, Getvivekca. Rajiv Ranjan is a friend of mine; I will ask him what his involvement with BRN is.
Any feedback would be much appreciated
Cheers
Doesn’t look like neuromorphic hardware is involved; looks like their software AI running on the sim.
Yes, it is 7 months old. Has anyone seen this?
I couldn’t send a link, so this is just a screenshot from YouTube.
Hard to say really, but I thought BrainChip tech was now "a part" of their A.I. emotion detection models (which are also available at a software level), and the mention of "edge processing" for "privacy" at least hints at us?
Definitely one of those pennant breakout thingys, in my professional opinion. Chartists: is that a pennant breakout on the daily and, pending tomorrow's price action, the weekly as well?
From a beginner-level chartist's perspective, it looks possible.
View attachment 76550 View attachment 76551
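For anyone new to the pattern: a pennant shows up as successive lower highs converging with successive higher lows before a breakout. A toy sketch of that convergence check, using made-up prices (not actual BRN data):

```python
# Illustrative only: hypothetical daily highs/lows, not real BRN prices.
highs = [1.00, 0.96, 0.93, 0.91]  # each high lower than the last
lows = [0.80, 0.83, 0.85, 0.87]   # each low higher than the last

# A pennant candidate: the trading range narrows from both sides.
lower_highs = all(b < a for a, b in zip(highs, highs[1:]))
higher_lows = all(b > a for a, b in zip(lows, lows[1:]))
converging = lower_highs and higher_lows

print(converging)  # True -> the range is narrowing like a pennant
```

Real chartists would also want rising-then-falling volume and a preceding sharp move (the "flagpole"); this only checks the converging range.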
True, one of the links embedded in the LinkedIn post does mention edge processing. In with a chance, perhaps.
I guess that confirms it.
Researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeffrey Krichmar, have been experimenting with AKD1000:
View attachment 67694
View attachment 67690
View attachment 67691
View attachment 67692
View attachment 67693
GitHub - UCI-CARL/CARLsimPP
github.com
View attachment 67695
View attachment 67696
This is the paper I linked in my previous post, co-authored by Lars Niedermeier, a Zurich-based IT consultant, and the above-mentioned Jeff Krichmar from UC Irvine.
View attachment 67703
The two of them have co-authored three papers in recent years, including one in 2022 with another UC Irvine professor and CARL team member, Nikil Dutt (https://ics.uci.edu/~dutt/), as well as Anup Das from Drexel University, whose endorsement of Akida is quoted on the BrainChip website:
View attachment 67702
View attachment 67700
View attachment 67701
Lars Niedermeier’s and Jeff Krichmar’s April 2024 publication on CARLsim++ (which does not mention Akida) ends with the following conclusion and the acknowledgement that their work was supported by the Air Force Office of Scientific Research (funding that has been going on since at least 2022) and a UCI Beall Applied Innovation Proof of Product Award (https://innovation.uci.edu/pop/); they also thank the regional NSF I-Corps (Innovation Corps) program for valuable insights.
View attachment 67699
View attachment 67704
Their use of an E-Puck robot (https://en.m.wikipedia.org/wiki/E-puck_mobile_robot) for their work reminded me of our CTO’s address at the AGM in May, during which he envisioned the following object (from 22:44 min):
“Imagine a compact device similar in size to a hockey puck that combines speech recognition, LLMs and an intelligent agent capable of controlling your home’s lighting, assisting with home repairs and much more. All without needing constant connectivity or having to worry about privacy and security concerns, a major barrier to adaptation, particularly in industrial settings.”
Possibly something in the works here?
The version the two authors were envisioning in their April 2024 paper is, however, conceptualised as being available as a cloud service:
“We plan a hybrid approach to large language models available as cloud service for processing of voice and text to speech.”
The authors gave a tutorial on CARLsim++ at NICE 2024, where our CTO Tony Lewis was also presenting. Maybe they had a fruitful discussion at that conference in La Jolla, which resulted in UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL) team experimenting with AKD1000, as evidenced in the video uploaded a couple of hours ago that I shared in my previous post?
View attachment 67705
GCtronic
www.gctronic.com
View attachment 67716
Kristofor Carlson was a postdoc at Jeff Krichmar’s Cognitive Robotics Lab a decade ago and co-authored a number of research papers with both Jeff Krichmar and Nikil Dutt over the years, the last one published in 2019:
View attachment 67717
View attachment 67718
Without BrainChip Akida. Beemotion releasing a product. $
Upside down, Miss Jane.
About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with AKD1000 mounted on an E-Puck2 robot.
The April 2024 paper I linked to at the time (“An Integrated Toolbox for Creating Neuromorphic Edge Applications”), co-authored by Lars Niedermeier (Niedermeier Consulting, Zurich) and Jeff Krichmar (UC Irvine), did not yet contain a reference to Akida, but has recently been updated to a newer version (Accepted Manuscript online 22 January 2025). It now has heaps of references to AKD1000 and describes how it was used for visual object detection and classification.
Nikil Dutt, one of Jeff Krichmar’s colleagues at UC Irvine and also member of the CARL team, contributed to this Accepted Manuscript version as an additional co-author.
What caught my eye was that the researchers, who had used an AKD1000 PCIe Board (with an engineering sample chip) as part of their hardware stack, had already gotten their hands on an Akida M.2 form factor as well, even though BrainChip’s latest offering wasn’t officially revealed until January 8th at CES 2025:
“For productive deployments, the Raspberry Pi 5 Compute Module and Akida M.2 form factor were used.” (page 9)
Maybe thanks to Kristofor Carlson?
Here are some pages from the Accepted Manuscript version:
View attachment 76552
View attachment 76553
View attachment 76554
View attachment 76558
View attachment 76556
View attachment 76557
We already knew from the April 2024 version of that paper that…
And finally, here’s a close-up of the photo on page 9:
View attachment 76555
Evening Boab, pardon my ignorance, but does that mean there was a reduction in the number of shares held by shorters?
Many thanks, Esq. Business as usual then, eh.
Sorry for the delayed reply; basically, this is the GROSS for the day.
The pheasant phuketrs could have taken... shorted X stock... returned some (closed out) of their positions, or carried forward... or anything in between. BULLSHITE SPORT FOR THE DEPRAVED.
* Note: apparently these numbers may vary... depending on whether they wish (or remember) to lodge their position for the day, week, or month.
The ASX is fucking useless, as was displayed to all when their management was grilled before a government enquiry not long ago.
Accenture PLC & Tata Controle Systems, both partners of BrainChip, have the contract to redo the ASX platform. As we have all witnessed, they are fucking useless at keeping up with the times, or simply complicit; I leave it to all to get a bearing on that.
Back to the question at hand: this figure is the gross shorts for the day. The pheasants may have sold then bought back... etc., etc.
It is the gross for the day, not the NET (the outstanding open position... the bent-over position, as it were), which isn’t stated.
The more confusion they can insert into the system, the smoother it all runs, apparently.
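To put some made-up numbers on the gross-vs-net distinction (purely illustrative, not actual BRN short data): the daily figure counts every short sold during the day, while the change in the outstanding open position also subtracts whatever was bought back.

```python
# Illustrative only: hypothetical intraday trades by one shorter, not real data.
trades = [
    ("short", 100_000),  # new short sold during the day
    ("cover", 60_000),   # part of the position bought back (closed out)
    ("short", 20_000),   # another short later in the day
]

# Gross shorts for the day: every short sale, ignoring any covering.
gross_short = sum(qty for side, qty in trades if side == "short")

# Net change in the open short position: shorts minus covers.
net_change = gross_short - sum(qty for side, qty in trades if side == "cover")

print(gross_short)  # 120000 -> the figure reported as daily gross
print(net_change)   # 60000  -> what is actually carried forward
```

So a big daily gross number on its own says nothing about whether the outstanding short position grew or shrank, which is Esq's point.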
Hope this helps .
Regards,
Esq
Wow, that is an awesome find, Frangipani.
Hi Boab, Anil liked this.
Pat Gelsinger on LinkedIn: 5 years ago, the world of computing was a boring. Add some cores, speed up… | 52 comments
5 years ago, the world of computing was boring. Add some cores, speed up IO, increment memory performance... then the AI/LLM explosion, and the world is… | 52 comments on LinkedIn (www.linkedin.com)