BRN Discussion Ongoing

Are we not an IP company? Those companies produce chips. Maybe that’s why we’re not on the list 🤔
2. BrainChip — BrainChip is a neuromorphic computing company that has developed the Akida chip, which supports both spiking neural networks (SNNs) and convolutional neural networks (CNNs). The Akida chip is designed for edge AI applications and features a highly efficient architecture that allows it to perform tasks like embedded vision and audio processing while consuming minimal power.
 

Diogenese

Top 20
Couldn't find this 2024 conference paper posted here before, but maybe my search didn't capture it.

Anyway, it was a positive presentation at an IEEE conference, and below are a couple of snips. As they advise, "This is an accepted manuscript version of a paper before final publisher editing and formatting. Archived with thanks to IEEE."

HERE

As the authors acknowledge:

"This work has been partially supported by King Abdullah University of Science and Technology CRG program under grant number: URF/1/4704-01-01.

We would like also to thank Edge Impulse and Brainchip companies for providing us with the software tools and hardware platform used during this work."

One of the authors, M. E. Fouda, caught my eye given his affiliation "3", maybe his employer... hmmmm.

D. A. Silva¹, A. Shymyrbay¹, K. Smagulova¹, A. Elsheikh², M. E. Fouda³٬† and A. M. Eltawil¹
¹ Department of ECE, CEMSE Division, King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
² Department of Mathematics and Engineering Physics, Faculty of Engineering, Cairo University, Giza 12613, Egypt
³ Rain Neuromorphics, Inc., San Francisco, CA, 94110, USA
† Email: foudam@uci.edu


End-to-End Edge Neuromorphic Object Detection System

Abstract—Neuromorphic accelerators are emerging as a potential solution to the growing power demands of Artificial Intelligence (AI) applications. Spiking Neural Networks (SNNs), which are bio-inspired architectures, are being considered as a way to address this issue. Neuromorphic cameras, which operate on a similar principle, have also been developed, offering low power consumption, microsecond latency, and robustness in various lighting conditions.

This work presents a full neuromorphic system for Computer Vision, from the camera to the processing hardware, with a focus on object detection. The system was evaluated on a compiled real-world dataset and a new synthetic dataset generated from existing videos, and it demonstrated good performance in both cases. The system was able to make accurate predictions while consuming 66 mW, with a sparsity of 83%, and a time response of 138 ms.
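As an aside on the 83% sparsity figure: in SNN accelerators, sparsity usually means the fraction of activations that are zero (no spike), which is what lets event-driven hardware like Akida skip computation. A toy illustration of how such a number is computed, using a hypothetical NumPy tensor (not the authors' code or data):

```python
import numpy as np

# Hypothetical spike-activation tensor: 1 = spike, 0 = silent.
# Thresholding random values at 0.83 gives ~17% active neurons.
rng = np.random.default_rng(0)
acts = (rng.random((4, 32, 32)) > 0.83).astype(np.uint8)

# Sparsity = fraction of zero entries; lands near 0.83 by construction.
sparsity = 1.0 - acts.mean()
print(f"sparsity ≈ {sparsity:.2f}")
```

With roughly 83% of activations silent, only about 17% of neurons trigger any downstream computation, which is broadly how event-driven hardware keeps power so low.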



VI. CONCLUSION AND FUTURE WORK
This work showed a low-power and real-time latency full spiking neuromorphic system for object detection based on IniVation’s DVXplorer Lite event-based camera and Brainchip’s Akida AKD1000 spiking platform. The system was evaluated on three different datasets, comprising real-world and synthetic samples. The final mapped model achieved mAPs of 28.58 for the GEN1 dataset, equivalent to 54% of a more complex state-of-the-art model and 89% of the performance detection from the best-reported result for the single-class dataset PEDRo, having 17x less parameters. A power consumption of 66mW and a latency of 138.88ms were reported, being suitable for real-time edge applications.
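For context, the reported power and latency together imply an energy budget of roughly 9.2 mJ per detection; a quick back-of-envelope check (my arithmetic, not a figure from the paper):

```python
# Back-of-envelope energy per inference from the paper's reported figures.
power_w = 0.066        # 66 mW reported power consumption
latency_s = 0.13888    # 138.88 ms reported latency

# Energy = power x time, converted to millijoules.
energy_mj = power_w * latency_s * 1e3
print(f"{energy_mj:.2f} mJ per inference")  # ≈ 9.17 mJ
```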

For future works, different models are expected to be adapted to the Akida platform, from which more recent
releases of the YOLO family can be implemented. Moreover, it is expected to evaluate those models in real-world scenarios instead of recordings, as well as the acquisition of more data to evaluate this setup under different challenging situations.

Super sleuthing, FMF!

M. Fouda's LinkedIn confirms the Rain AI connection:

https://www.linkedin.com/in/mefouda/recent-activity/all/

There is also a repost originally from Prof Salama about the article.

A great thing about the diagram is that it illustrates the synergistic relationship between Akida and Edge Impulse in creating the model.

Another thing the article shows is that Akida's agnosticism extends beyond processors to DVS cameras, including Prophesee's arch-rival iniVation.
 


JB49

Regular
Our friend at Raytheon, celebrated by our new radar friend who was on the last podcast. I wonder what innovative ideas they are referring to... There seems to be one that Tom is particularly interested in of late...



Raytheon friend also reposting the podcast!

 

jtardif999

Regular
A fellow forum user who in recent months repeatedly referred to his brief LinkedIn exchange with Mercedes-Benz Chief Software Officer Magnus Östberg (and thereby willingly revealed his identity to all of us here on TSE, which in turn means I’m not guilty of spilling a secret with this post that should have been kept private), asked Mercedes-Benz a question in the comment section underneath the company’s latest LinkedIn post on neuromorphic computing. This time, however, he decided not to share the carmaker’s reply with all of us here on TSE. You gotta wonder why.

Could it possibly have to do with the fact that MB’s reply refutes the hypothesis he had been advancing for months? Namely, that Mercedes-Benz, who have been heavily promoting their future SDV (software-defined vehicle) approach that gives them the option of OTA (over-the-air) updates, would “more than likely” have used Akida 2.0/TENNs simulation software in the upcoming MB.OS release as an interim solution during ongoing development, until the not-yet-existing Akida 2.0 silicon became available at a later stage, the underlying reason being competitive pressure to be first to market…

The way I see it, the January 29 reply by MB clearly puts this speculation to bed:



[Screenshot of Mercedes-Benz’s January 29 LinkedIn reply]


Does that sound as if an MB.OS “Akida inside” reveal at the upcoming world premiere of the CLA were on the cards?


Setting aside the questions

a) about any requirements for testing and certification of car parts making up the infotainment system (being used to German/EU bureaucracy, I find it hard to believe there wouldn’t be any at all - maybe someone who is knowledgeable about automotive regulations within Germany and the EU could comment on this) and

b) whether any new MB model containing our tech could roll off the production line despite no prior IP license deal having been signed (or at least an Akida 1.0 chip sales deal; there has never been a joint development announcement either, which could possibly somehow circumvent the necessity of an upfront payment showing up in our financials)…

… various MB statements in recent months have diminished the likelihood of neuromorphic tech making its debut in any soon-to-be-released Mercedes-Benz models (cf. Dominik Blum’s presentation at HKA Karlsruhe, the university of applied sciences they have since confirmed to be cooperating with on neuromorphic camera research, which I shared in October: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352; journalists quoting MB engineers after visiting their Future Technologies Lab; as well as relevant posts and comments on LinkedIn).

If NC were to enhance voice control and infotainment functions in their production vehicles much sooner than safety-critical ones (ADAS), MB would surely have clarified this in their reply to the above LinkedIn question, which specifically referred to the soon-to-be-released CLA, the first model to come with the next-generation MB.OS that also boasts the new AI-powered MBUX Virtual Assistant (developed in collaboration with Google).

Instead, they literally wrote:

“(…) To fully leverage the potential of neuromorphic processes, specialised hardware architectures that efficiently mimic biologically inspired systems are required (…) we’re currently looking into Neuromorphic Computing as part of a research project. Depending on the further development progress, integration could become possible within a timeframe of 5 to 10 years.”

They are evidently exploring full scale integration to maximise the benefits of energy efficiency, latency and privacy. The voice control implementation of Akida in the Vision EQXX was their initial proof-of-concept to demonstrate feasibility of NC in general (cf. the podcast with Steven Peters, MB’s former Head of AI Research from 2016-2022: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-407798). Whether they’ll eventually partner with us or a competitor (provided they are happy with their research project’s results) remains to be seen.

So I certainly do not expect the soon-to-be-revealed CLA 2025 with the all-new MB.OS to have “Akida inside”, although I’d be more than happy to be proven wrong, as we’d all love to see the BRN share price soar on this news…
Time - and the financials - will ultimately tell.
I fail to see why MB would choose Akida for a successful PoC with the EQXX and then move on to test other NPs afterwards. We know they had Loihi in their hands before Akida was chosen for the EQXX anyway. It would just be illogical for them to take backward steps, particularly as the space industry and others have independently pointed out in their testing that Akida is the first choice, simply because it’s the only truly commercially available and fully supported NP currently on the market. AIMO.
 

Frangipani

Regular
Hopefully, researchers in Jörg Conradt’s Neuro Computing Systems lab, which moved from Munich (TUM) to Stockholm (KTH), will give Akida another chance one of these days, after the not overly glorious assessment by two KTH Master’s students in their degree project Neuromorphic Medical Image Analysis at the Edge, which was shared here before: https://www.diva-portal.org/smash/get/diva2:1779206/FULLTEXT01.pdf. They might trust the positive feedback of two more advanced researchers Jörg Conradt knows well, who have (or soon will have) first-hand experience with AKD1000:

When he was still at TUM, Jörg Conradt was the PhD supervisor of Cristian Axenie (now head of the SPICES lab at TH Nürnberg, whose team came runner-up in the 2023 tinyML Pedestrian Detection Hackathon utilising Akida) and co-authored a number of papers with him. Now in Stockholm, he is the PhD supervisor of Jens Egholm Pedersen, one of the co-organisers of the topic area Neuromorphic systems for space applications at the upcoming Telluride Neuromorphic Workshop, which will provide participants with neuromorphic hardware, including Akida. (I’d venture a guess that the name Jens on the slide refers to him.)

On 28 January, Jörg Conradt, who leads the Neuro Computing Systems Lab at KTH Royal Institute of Technology in Stockholm, gave a presentation titled "Energy-smart Neuromorphic Sensing and Computation for Future Space Applications” at the AI for Space Applications Workshop that took place at KTH’s Digital Futures hub.
The workshop was held as part of the ASAP project:



While the first presentation slide didn’t look too exciting from a BRN shareholder’s perspective…



… we actually did get a mention later on, when Jörg Conradt touched upon a number of neuromorphic startups…



…although he didn’t get it quite right. Not only did he mix up the names of our company and its neuromorphic processor - he also misspelled and pronounced the latter as “Aikida”. And since when does BrainChip sound like a place name?

To be fair, though, he actually gave us an “out of this world” mention by pointing to what he mistook to be our company’s name and telling his audience (from 14:50 min):

“In fact, this company develops IP, and a very big Sweden-based space company has recently teamed up with Aikida [sic] to develop hardware - computing hardware - that they can send into space, that is potentially very robust against radiation and other effects.
So we have a common project starting on what to do with that hardware, and that’s my point of contact into space.”

It sounds as if Jörg Conradt’s lab at KTH will somehow be involved with whatever Frontgrade Gaisler has planned for Akida…
All the more reason he ought to familiarise himself with A & B ASAP. 😉

Encouragingly, Akida made another appearance shortly after (this time with impeccable orthography) - if only for a split second, before Jörg Conradt moved on to the next slide. It was in the context of a research project titled Neuromorphic Edge Computing for Urban Traffic Monitoring in the city of Stockholm, funded by Digital Futures, a cross-disciplinary research centre established by KTH Royal Institute of Technology, Stockholm University and RISE Research Institutes of Sweden. Akida was listed alongside SpiNNaker 2 and Loihi 2 under “neuromorphic chips” and even had the honour of providing an exemplary image for that category.



Jörg Conradt didn’t specify whether or not the different neuromorphic processors were going to get benchmarked against each other, but I assume that’s the plan.


Here is some further information I found about said project that runs from January 2024 to December 2025 and “estimates a 100x reduction in power and a 20x reduction in installation cost.”




I wonder whether this urban traffic monitoring project with the help of event-based cameras was somehow inspired by the 2023 tinyML Pedestrian Detection Hackathon submission (utilising Akida) from Cristian Axenie’s SPICES lab at TH Nürnberg. It doesn’t seem that far-fetched to make a connection; after all, they know each other well: Jörg Conradt was Cristian Axenie’s PhD supervisor at TUM (Technical University of Munich).
 

Frangipani

Regular
Speaking of Cristian Axenie:
He just launched a new website that I am sure @Sirod69 will love:




Akida is hiding behind the upper left research project (footage from the 2023 tinyML Pedestrian Detection Hackathon submission).
A cursory glance at the other projects under “Academics” or “Research” didn’t reveal our hardware anywhere else.

It might, however, be worthwhile to check out this new project page from time to time, especially once other scientists and engineers have come on board.


 
Are we not an IP company? Those companies produce chips. Maybe that’s why we’re not on the list 🤔
The headline says “top 10 edge AI chips”. The article continues…

“Looking for an AI processor for a computer vision system, a robot or a home appliance? These top 10 edge AI chips are designed to accelerate AI workloads without being power-hungry.”

Doesn’t say chip producers nor chip IP companies. Just chips. Now wouldn’t BrainChip’s Akida fit the headline and the opening editorial paragraph?


My guess is it’s a paid editorial and BrainChip chose not to pay.
 