BRN Discussion Ongoing

7für7

Top 20
Here's hoping, mate.
It's a couple of years past my personal timeline's prime and preference, but I'm past complaining about it now.
Will ride this baby either to glory or down the toilet. 🤣
From what I can see, there has never been more evidence of uptake, and I think it's only a matter of time till multiple use cases are revealed and the money starts flowing.
The world is catching up to PVDM's vision; energy efficiency, critical edge applications and the requirements of autonomous platforms are making our solutions more attractive every day.
We've just gotta make it through the night.
The darkest hour is just before the dawn.
 
  • Like
Reactions: 9 users

Tothemoon24

Top 20


In terms of logistics, a camera on the roof makes it possible to identify the location of a package automatically using these markers, saving time and money. Until now, the system's weakness was lighting conditions: classic machine vision techniques that accurately locate and decode markers fail in low light.

To address this problem, researchers Rafael Berral, Rafael Muñoz, Rafael Medina and Manuel J. Marín, with the Machine Vision Applications research group at the University of Cordoba, have developed a system that is able, for the first time, to detect and decode fiducial markers under difficult lighting conditions, using neural networks. The paper is published in the journal Image and Vision Computing.

"The use of neural networks in the model allows us to detect this type of marker in a more flexible way, solving the problem of lighting for all phases of the detection and decoding process," explained researcher Berral. The entire process comprises three steps: marker detection, corner refinement, and marker decoding, each based on a different neural network.
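
The three-step flow described above can be sketched roughly as follows. Everything here is a hypothetical stand-in for the paper's three networks (none of these function names come from the paper); it only shows how the stages chain together:

```python
# Hypothetical sketch of a three-stage fiducial-marker pipeline:
# detection -> corner refinement -> decoding. In the real system each
# stage is a separate neural network; here they are plain stubs that
# illustrate the data flow only.

def detect_markers(image):
    # Stage 1: propose coarse bounding boxes for candidate markers.
    return [{"bbox": (10, 10, 50, 50)}]

def refine_corners(image, candidate):
    # Stage 2: refine the four corner positions of the marker
    # inside the coarse bounding box.
    x, y, w, h = candidate["bbox"]
    return [(x, y), (x + w, y), (x + w, y + h), (x, y + h)]

def decode_marker(image, corners):
    # Stage 3: rectify the marker patch using its corners and
    # decode its identifier bits.
    return {"id": 7, "corners": corners}

def run_pipeline(image):
    results = []
    for candidate in detect_markers(image):
        corners = refine_corners(image, candidate)
        results.append(decode_marker(image, corners))
    return results

print(run_pipeline(image=None))
```

Splitting the task this way lets each network be trained (and replaced) independently, which is presumably part of what makes the approach flexible under bad lighting.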

This is the first time that a complete solution has been given to this problem since, as Manuel J. Marín points out, "there have been many attempts to increase speed under optimal lighting, for example, but the problem of low lighting, or many shadows, had not been completely addressed."

How to train your machine vision model

To train this model, which presents an end-to-end solution, the team created a synthetic dataset that reliably reflects the lighting conditions encountered when working with a marker system outside ideal conditions. Once trained, "the model was tested with real-world data, some produced here internally and others taken as references from previous works," the researchers indicate.
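
One common way to build such a synthetic low-light dataset is to degrade well-lit renders with randomized gamma and brightness; a minimal sketch of that idea (an assumed technique for illustration, not the authors' actual generation code):

```python
import random

def darken(pixels, gamma=2.5, brightness=0.3):
    # Simulate low light on pixels in [0, 1]: a gamma curve crushes the
    # mid-tones, then a global brightness factor scales everything down.
    return [min(1.0, (p ** gamma) * brightness) for p in pixels]

def synthesize_sample(pixels, rng):
    # Randomize the degradation so the trained network sees a wide
    # range of lighting conditions.
    g = rng.uniform(1.5, 3.5)
    b = rng.uniform(0.1, 0.5)
    return darken(pixels, gamma=g, brightness=b)

rng = random.Random(0)
print(synthesize_sample([0.0, 0.5, 1.0], rng))
```

Pairing each darkened image with the known marker corners and ID from the clean render gives free ground-truth labels, which is the usual appeal of synthetic data.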

Both the artificially generated training data and the real-world low-light data are openly available. The system could thus be applied today, "since the code has been released and it is possible to test it with any image in which fiducial markers appear," notes Rafael Muñoz.

Thanks to this work, machine vision applications have overcome a new obstacle: moving in the dark.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 17 users
Is the Chinese DeepSeek a threat to Akida?
 

Tothemoon24

Top 20


The intelligent vehicle functionalities of the future call for pioneering new algorithms and hardware.

That’s why Mercedes-Benz is researching artificial neural networks that could radically increase the speed and energy efficiency of complex AI computations.

This could revolutionise future driving assistance and safety systems by overcoming the limits of today’s computing hardware.

For instance, conventional computing requires up to 3000 watts for advanced automated driving functions.

In future, neuromorphic computing could do this with just 300 watts, by mimicking the human brain with so-called "event-based processing".
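
In rough terms, "event-based processing" means computing only on the inputs that changed since the last time step, instead of reprocessing every frame in full. A toy illustration of why a mostly static scene costs far less this way (the numbers are purely illustrative, not Mercedes-Benz figures):

```python
def dense_ops(frames):
    # Conventional frame-based approach: process every pixel of every frame.
    return sum(len(f) for f in frames)

def event_ops(frames):
    # Event-based approach: process the first frame in full, then only
    # the pixels that changed relative to the previous frame.
    ops, prev = len(frames[0]), frames[0]
    for f in frames[1:]:
        ops += sum(1 for a, b in zip(prev, f) if a != b)
        prev = f
    return ops

frames = [[0, 0, 0, 0], [0, 1, 0, 0], [0, 1, 0, 0]]  # mostly static scene
print(dense_ops(frames), event_ops(frames))
```

On this toy input the dense approach touches 12 pixels while the event-based one touches 5; the gap widens as scenes get larger and more static, which is where the claimed 10x power saving would come from.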

 
  • Like
  • Fire
  • Love
Reactions: 95 users

ndefries

Regular
(quoting Tothemoon24's Mercedes-Benz post above)
Good to see Mercedes talking about this again. Hopefully no one bothers them on LinkedIn.
 
  • Like
  • Haha
  • Fire
Reactions: 24 users

Cirat

Regular
  • Like
  • Fire
  • Love
Reactions: 38 users

7für7

Top 20
(quoting Tothemoon24's Mercedes-Benz post above)
So… "by mimicking the human brain with so-called 'event-based processing'."

Nothing to see here.
 
  • Like
  • Thinking
  • Haha
Reactions: 7 users

RobjHunt

Regular
(quoting Tothemoon24's Mercedes-Benz post above)
In future, neuromorphic computing could do this with just 300 watts, by mimicking the human brain with so-called "event-based processing".

There's no "so-called" about it, Mr Merc!

Our little nipper does just that, reeeeally well. Go get 'em, tiger 🤭
 
  • Like
  • Fire
  • Love
Reactions: 29 users

genyl

Member
AI stocks getting hammered today. Get ready for a red day 🫡
 

itsol4605

Regular
AI stocks getting hammered today. Get ready for a red day 🫡
NVIDIA is already having a deep red day.

I don't think it will be red for BrainChip; quite the opposite.
 
  • Like
  • Sad
  • Haha
Reactions: 7 users
AI stocks getting hammered today. Get ready for a red day 🫡
Maybe so, but who is going to be the savior in the end? BrainChip, and whoever uses our technology, their share price will only rise; the proof will be in the pudding after today.
 
  • Like
  • Fire
Reactions: 5 users
AI stocks getting hammered today. Get ready for a red day 🫡
Hoping to close out your short position today?
 
  • Haha
  • Like
Reactions: 7 users

cosors

👀
Off topic, but maybe of general interest:

"Infineon and the BSI pave the way for a quantum-resilient future: World's first Common Criteria Certification for post-quantum cryptography algorithm on a security controller
...
The world's first certification is a milestone on the way to a quantum-safe future in our daily lives."
 
Last edited:
  • Fire
  • Love
Reactions: 4 users

Tothemoon24

Top 20
Chat with Sean in the link below.

IMG_0568.jpeg

 

  • Like
  • Fire
  • Love
Reactions: 44 users

itsol4605

Regular
Chat with Sean in Link

Sean seems very confident – I think for good reason
 
  • Like
Reactions: 11 users

AARONASX

Holding onto what I've got
We all know China has stretched the truth about its tech in the past, so I'm unsure about DeepSeek. And how private is it, anyway?

Assuming some of the major players in AI that took a hit today know of BrainChip and are under NDAs, they may want to start mentioning Akida to leverage themselves back up; the advantage this time is that they'll have private, low-power AI at the edge. IMO.
 
  • Like
  • Fire
  • Love
Reactions: 24 users

Frangipani

Top 20
About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with AKD1000 mounted on an E-Puck2 robot.

The April 2024 paper I linked to at the time (“An Integrated Toolbox for Creating Neuromorphic Edge Applications”), co-authored by Lars Niedermeier (Niedermeier Consulting, Zurich) and Jeff Krichmar (UC Irvine), did not yet contain a reference to Akida, but has recently been updated to a newer version (Accepted Manuscript online 22 January 2025). It now has heaps of references to AKD1000 and describes how it was used for visual object detection and classification.

Nikil Dutt, one of Jeff Krichmar’s colleagues at UC Irvine and also member of the CARL team, contributed to this Accepted Manuscript version as an additional co-author.



What caught my eye was that the researchers, who had used an AKD1000 PCIe Board (with an engineering sample chip) as part of their hardware stack, had already gotten their hands on an Akida M.2 form factor as well, even though BrainChip’s latest offering wasn’t officially revealed until January 8th at CES 2025:

“For productive deployments, the Raspberry Pi 5 Compute Module and Akida M.2 form factor were used.” (page 9)


Maybe thanks to Kristofor Carlson?




Here are some pages from the Accepted Manuscript version:


View attachment 76552


View attachment 76553



View attachment 76554


View attachment 76558


View attachment 76556
View attachment 76557


We already knew from the April 2024 version of that paper that…



And finally, here’s a close-up of the photo on page 9:

View attachment 76555

Just an afterthought…

Academic research utilising Akida shouldn’t generally be underestimated or dismissed as mere playtime in an ivory tower.

Some of these researchers have excellent connections to big players in the industry and/or to government agencies and sometimes even prior work experience in relevant sectors themselves - hence their recommendations would likely be given quite a bit of weight.

Take Jeff Krichmar👆🏻for example, whose 27 page (!) CV can be found on his LinkedIn profile.


Krichmar’s first job after graduating with a Bachelor in Computer Science (and before going to grad school for his Master’s) was as a software engineer at Raytheon Corporation (now RTX), working on the PATRIOT surface-to-air missile system. That position also saw him become a consultant to the Japanese Self-Defense Forces from 1988-1989, while deployed to Mitsubishi Heavy Industries in Nagoya (which to this day manufactures PATRIOT missiles for domestic use under license from RTX and Lockheed Martin).


96C4AADA-86A0-45C6-8487-C7115C5A506B.jpeg



Over the years, he has received quite a bit of funding from the defence-related sector, mostly from the US government, but also from Northrop Grumman.

FC7CCF70-D825-42B5-9738-92EA8432414D.jpeg


In 2015 he gave an invited talk at Northrop Grumman…

9584CEF5-3DDF-4B0A-B1DE-1B5C696BC27B.jpeg


… and he was co-author of a paper published in November 2016, whose first author, his then graduate student Tiffany Hwu, was a Basic Research Systems Engineer Intern with Northrop Grumman at the time. (“This work was supported by the National Science Foundation Award number 1302125 and Northrop Grumman Aerospace Systems.”)

The neuromorphic hardware used for the self-driving robot was unsurprisingly IBM’s TrueNorth, which was then the only neuromorphic chip around; Loihi wasn’t announced until September 2017.

3DA49E8F-ECAE-4722-AEC9-5AF89BF60DC4.jpeg


One of the paper’s other co-authors was a former postdoctoral student of Krichmar’s, Nicolas Oros, who had started working for BrainChip in December 2014; his LinkedIn profile says he was in fact our company’s first employee! He is also listed as co-inventor of the “Low power neuromorphic voice activation system and method” patent alongside Peter van der Made and Mouna Elkhatib.

Nicolas Oros left BrainChip in February 2021 and is presently a Senior Product Manager at Aicadium, “leading the development of a computer vision SaaS product for visual inspection”. I don’t think we’ve ever looked into them? 🤔

D694F852-57A2-4DB1-97A3-5192EA0B16F1.jpeg



BCB8B936-AE8C-4281-99FB-381A777CB7A8.jpeg



By the time of said paper’s publication, Jeff Krichmar had become a member of BrainChip’s Scientific Advisory Board - see this slide from an April 2016 BRN presentation, courtesy of @uiux:


17112505-6D5B-421F-946B-215CCC3636F7.jpeg


As mentioned before, Kristofor Carlson is another of Jeff Krichmar’s former postdoctoral students (from 2011-2015), who co-authored a number of research papers with Jeff Krichmar and Nikil Dutt (both UC Irvine) over the years - the last one published in 2019.

In September, Kris Carlson gave a presentation on TENNs at UC Irvine, as an invited speaker at SAB 2024: From Animals to Animats - 17th International Conference on the Simulation of Adaptive Behavior.

AD3E51C7-F3EC-4F50-B076-EED4A86EC6D3.jpeg


Kris Carlson’s September 2024 conference talk on TENNs, the CARL lab’s recent video and paper featuring an E-Puck2 robot with an Akida PCIe Board mounted on top, and the disclosure in the 22 January 2025 paper that CARL researchers had already experimented with the brand-new AKD1000 M.2 form factor are ample evidence of continued interest in what BrainChip is doing from Jeff Krichmar’s side.

Academic researchers like him could very well be door openers to people in charge of other entities’ research that will result in meaningful revenue one day…
 

Last edited:
  • Like
  • Love
  • Fire
Reactions: 35 users

genyl

Member
Hoping to close out your short position today?
I'm 113k shares deep, first buy was back in 2020 and I haven't sold since, so nah. Bad try though.
 
  • Like
Reactions: 8 users