BRN Discussion Ongoing

Kozikan

Regular
Rajiv Ranjan is a friend of mine. I will ask him what his involvement with BRN is.
Yes please, Getvivekca,
Any feedback would be much appreciated
Cheers
 
Beemotion releasing a product. $
 

Attachments

  • IMG_4807.png

Dougie54

Regular
Doesn’t look like neuromorphic hardware is involved; looks like their AI software running on the sim.
 

Dougie54

Regular
Has anyone seen this?
I couldn’t send a link, so this is just a screenshot from YouTube.
 

Attachments

  • IMG_1810.png

Rskiff

Regular
Doesn’t look like neuromorphic hardware is involved; looks like their AI software running on the sim.
Hard to say really, but I thought BrainChip tech was now "a part" of their A.I. emotion detection models (which are also available at a software level), and the mention of "Edge processing" for "privacy" at least hints at us?..
 

manny100

Regular
Chartists, is that a pennant breakout on the daily and, pending tomorrow's price action, on the weekly as well?
From a beginner-level chartist's perspective it looks possible.
BRN 23 JAN 25.png
BRN WK 23RD JAN 25.png
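For anyone curious what the pennant question means mechanically: a pennant is a consolidation where swing highs trend down while swing lows trend up, so the two trendlines converge. A minimal sketch of that test in plain Python (the price series below is invented purely for illustration, not BRN data):

```python
# Toy check for a "pennant": highs trending down while lows trend up,
# i.e. the two trendlines converge. Prices are synthetic, not BRN data.

def slope(ys):
    """Least-squares slope of ys against 0..n-1 (no numpy needed)."""
    n = len(ys)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def looks_like_pennant(highs, lows):
    """Converging range: falling swing highs, rising swing lows."""
    return slope(highs) < 0 < slope(lows)

highs = [0.32, 0.31, 0.30, 0.295, 0.29]   # lower highs
lows  = [0.25, 0.26, 0.265, 0.27, 0.275]  # higher lows
print(looks_like_pennant(highs, lows))    # converging -> True
```

A real chartist would also want shrinking volume and a preceding "flagpole" move before calling it a pennant; this sketch only checks the converging-trendline condition.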
 

Mt09

Regular
Hard to say really, but I thought BrainChip tech was now "a part" of their A.I. emotion detection models (which are also available at a software level), and the mention of "Edge processing" for "privacy" at least hints at us?..
True, one of the links embedded in the LinkedIn post does mention edge processing; in with a chance, perhaps..

They can run their software on a customer's GPU or other device, but it will be less efficient, etc.


1737622489222.jpeg
 


Frangipani

Regular
Researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeffrey Krichmar, have been experimenting with AKD1000:

View attachment 67694


View attachment 67690


View attachment 67691

View attachment 67692


View attachment 67693



View attachment 67695





View attachment 67696

This is the paper I linked in my previous post, co-authored by Lars Niedermeier, a Zurich-based IT consultant, and the above-mentioned Jeff Krichmar from UC Irvine.


View attachment 67703

The two of them co-authored three papers in recent years, including one in 2022 with another UC Irvine professor and member of the CARL team, Nikil Dutt (https://ics.uci.edu/~dutt/) as well as Anup Das from Drexel University, whose endorsement of Akida is quoted on the BrainChip website:

View attachment 67702


View attachment 67700




View attachment 67701

Lars Niedermeier’s and Jeff Krichmar’s April 2024 publication on CARLsim++ (which does not mention Akida) ends with the following conclusion and an acknowledgement that their work was supported by the Air Force Office of Scientific Research (the funding has been going on at least since 2022) and by a UCI Beall Applied Innovation Proof of Product Award (https://innovation.uci.edu/pop/); they also thank the regional NSF I-Corps (= Innovation Corps) for valuable insights.

View attachment 67699



View attachment 67704


Their use of an E-Puck robot (https://en.m.wikipedia.org/wiki/E-puck_mobile_robot) for their work reminded me of our CTO’s address at the AGM in May, during which he envisioned the following object (from 22:44 min):

“Imagine a compact device similar in size to a hockey puck that combines speech recognition, LLMs and an intelligent agent capable of controlling your home’s lighting, assisting with home repairs and much more. All without needing constant connectivity or having to worry about privacy and security concerns, a major barrier to adaptation, particularly in industrial settings.”

Possibly something in the works here?

The version the two authors were envisioning in their April 2024 paper is, however, conceptualised as being available as a cloud service:

“We plan a hybrid approach to large language models available as cloud service for processing of voice and text to speech.”


The authors gave a tutorial on CARLsim++ at NICE 2024, where our CTO Tony Lewis was also presenting. Maybe they had a fruitful discussion at that conference in La Jolla, which resulted in UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL) team experimenting with AKD1000, as evidenced in the video uploaded a couple of hours ago that I shared in my previous post?





View attachment 67705



View attachment 67716

About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with AKD1000 mounted on an E-Puck2 robot.

The April 2024 paper I linked to at the time (“An Integrated Toolbox for Creating Neuromorphic Edge Applications”), co-authored by Lars Niedermeier (Niedermeier Consulting, Zurich) and Jeff Krichmar (UC Irvine), did not yet contain a reference to Akida, but has recently been updated to a newer version (Accepted Manuscript online 22 January 2025). It now has heaps of references to AKD1000 and describes how it was used for visual object detection and classification.

Nikil Dutt, one of Jeff Krichmar’s colleagues at UC Irvine and also member of the CARL team, contributed to this Accepted Manuscript version as an additional co-author.



What caught my eye was that the researchers, who had used an AKD1000 PCIe Board (with an engineering sample chip) as part of their hardware stack, had already gotten their hands on an Akida M.2 form factor as well, even though BrainChip’s latest offering wasn’t officially revealed until January 8th at CES 2025:

“For productive deployments, the Raspberry Pi 5 19 Compute Module and Akida.M2 form factor were used.” (page 9)


Maybe thanks to Kristofor Carlson?

Kristofor Carlson was a postdoc at Jeff Krichmar’s Cognitive Robotics Lab a decade ago and co-authored a number of research papers with both Jeff Krichmar and Nikil Dutt over the years, the last one published in 2019:

View attachment 67717

View attachment 67718


Here are some pages from the Accepted Manuscript version:


DD62965A-C876-4048-9163-79D0B2745044.jpeg



76973013-2D69-4EF3-8A47-B061F3F20C8F.jpeg




D973E46F-D416-466F-A2B3-885344B9BBD6.jpeg



CAB240B9-E8E0-451F-ADD8-0E7238E2DE51.jpeg



FCAF924D-B99B-42DA-A04F-3BD48AD956F7.jpeg

72A73673-F8D9-4C56-B4C3-7E6755DC2F4A.jpeg



We already knew from the April 2024 version of that paper that…
their work was supported by the Air Force Office of Scientific Research (the funding has been going on at least since 2022) and by a UCI Beall Applied Innovation Proof of Product Award (https://innovation.uci.edu/pop/), and that they also thank the regional NSF I-Corps (= Innovation Corps) for valuable insights.

View attachment 67699



View attachment 67704


And finally, here’s a close-up of the photo on page 9:

5735DD4E-B9B7-4348-8328-B160FABAC4E1.jpeg
 
About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with AKD1000 mounted on an E-Puck2 robot. …

And finally, here’s a close-up of the photo on page 9:

View attachment 76555
Upside down Miss Jane.

SC
 

Esq.111

Fascinatingly Intuitive.
Pardon my ignorance, but does that mean there was a reduction in the number of shares held by shorters?
Evening Boab,

Sorry for the delayed reply. Basically this is the GROSS for the day.
The pheasant phuketrs could have shorted X stock, returned some (closed out) of their positions, or carried forward... or anything in between. BULLSHITE SPORT FOR THE DEPRAVED.

* Note: apparently these numbers may vary, depending on whether they wish/remember to lodge their position for the day, week or month.

The ASX is fucking useless, as was displayed to all when their management was grilled not long ago before a government enquiry.

Accenture PLC & Tata Consultancy Services, both partners of BrainChip's, have the contract to redo the ASX platform. As we have all witnessed, they are fucking useless at keeping up with the times, or simply complicit; I leave that to all to get a bearing on.

Back to the question at hand: this figure is the gross shorts for the day. The pheasants may have sold then bought back... etc., etc.

It's the gross for the day; the NET (the outstanding open position... bent-over position, as it were) isn't stated.

The more confusion they can insert into the system, the smoother it all runs, apparently.

Hope this helps.

Regards,
Esq
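Esq's gross-vs-net distinction can be shown with a toy tally: every short sale that day adds to the gross figure, while buy-backs (covers) net off against the open position but are not subtracted from the reported gross. The quantities below are invented for illustration:

```python
# Gross daily shorts vs net change in the open short position.
# Covers reduce the net position but never the reported gross figure.

def gross_and_net(trades):
    """trades: list of (side, qty), side is 'short' or 'cover'."""
    gross = sum(q for side, q in trades if side == "short")
    covered = sum(q for side, q in trades if side == "cover")
    return gross, gross - covered  # (gross shorted, net position change)

day = [("short", 500_000), ("cover", 300_000), ("short", 200_000)]
gross, net = gross_and_net(day)
print(gross, net)  # 700000 gross, but the open position only grew 400000
```

So a big daily gross number alone doesn't tell you whether shorters actually increased or reduced their outstanding position, which is Esq's point.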
 

Boab

I wish I could paint like Vincent
Evening Boab,

Sorry for the delayed reply. Basically this is the GROSS for the day…

Hope this helps.

Regards,
Esq
Many thanks Esq. Business as usual then eh.😩😩
 

manny100

Regular
About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with AKD1000 mounted on an E-Puck2 robot. …
Wow, that is an awesome find Frangipani.
 

Diogenese

Top 20
Anil liked this.
Hi Boab,

'sfunny - that first bit about exciting times was my reaction when I stumbled across BRN 6 years ago.

From Walter Goodwin, I traced a patent application to a company called Neu Edge, presumably Fractile's predecessor. Then I found this patent application in Neu Edge's name:

GB2625821A Analog neural network 20221229

1737636341495.png

An analogue neural network comprises layers connected to form an electrical circuit having an input and an output. The input receives an electrical signal corresponding to an input example, and the output corresponds to an output of the neural network. Each layer comprises at least one programmable electronic element 1, e.g. a memristor, representing a weight of the neuron. At least one non-linear element 4, e.g. a diode, implements a non-linear transfer function. At least one amplifier block 2 amplifies an output signal to prevent signal diminishment. An error element 3, e.g. a resistor or capacitor, allows measurement of the effect the whole electrical circuit has on the element. Each layer also comprises one measurement element for measuring an electrical signal across the error element, and a second measurement element for measuring a weight input of the programmable electronic element. The neural network may be trained by clamping an input signal to a signal corresponding to an input example, clamping an output to an electrical value representing a ground truth label of the input example, and once an equilibrium state is reached, using the values from the measurement elements to update the weight.
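The clamp-the-output, settle-to-equilibrium, update-from-local-measurements training loop in that abstract is in the same family as contrastive-Hebbian / equilibrium-propagation schemes. Here's a deliberately tiny numerical sketch of the idea on a single linear unit (my own toy, not the patent's analogue circuit):

```python
# Toy sketch of clamped-equilibrium training (contrastive-Hebbian flavour):
# run a "free" phase with the input clamped, then a "clamped" phase with the
# output forced to the label, and update each weight from local quantities.
# One linear unit only -- illustrative, not the patent's memristor circuit.

def train(samples, lr=0.1, epochs=50):
    w = 0.0
    for _ in range(epochs):
        for x, target in samples:
            y_free = w * x          # free phase: settle with input clamped
            y_clamped = target      # clamped phase: output forced to label
            w += lr * x * (y_clamped - y_free)  # local contrastive update
    return w

# Learn y = 2x from two consistent examples.
w = train([(1.0, 2.0), (2.0, 4.0)])
print(round(w, 3))  # -> 2.0
```

In the patent's circuit the two settling phases would happen physically across the memristors and error elements, with the weight update derived from the measured signals; here both phases are just evaluated algebraically, which for a single linear unit collapses to the delta rule.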

I'm going to suggest Gelsinger have a saver on BRCHF.
 