BRN Discussion Ongoing

7für7

Top 20
Why don't some people share the link to the content they post? Is it more difficult than taking screenshots, cropping them and posting those?
 
  • Like
Reactions: 2 users
IMG_0210.jpeg


I think we know the first order of business for Yuya's first day on deck!!!
 
  • Like
  • Haha
  • Fire
Reactions: 5 users
Morning fellow brners, happy Friday!
We should be receiving the Top 20 shareholders notice today. I checked back on previous notices and quite a number of the Top 20 increased their shareholdings, so it will be interesting to compare when this one comes out. Oh to be a Top 20...
Whatever our individual investments are, we are part of this growing company, and soon, imo, BrainChip will be thriving just like Arm, Nvidia, Amazon and the rest of them that grew into giants.
Feeling positive on the last day of January. Looking forward to seeing the Top 20 chart. Bring it on, boomidty boom boom


Has the list come out today?
 
  • Haha
  • Like
Reactions: 4 users

Frangipani

Top 20
About six months ago, I posted a video which showed that researchers at UC Irvine’s Cognitive Anteater Robotics Laboratory (CARL), led by Jeff Krichmar, had been experimenting with AKD1000 mounted on an E-Puck2 robot.

The April 2024 paper I linked to at the time (“An Integrated Toolbox for Creating Neuromorphic Edge Applications”), co-authored by Lars Niedermeier (Niedermeier Consulting, Zurich) and Jeff Krichmar (UC Irvine), did not yet contain a reference to Akida, but has recently been updated to a newer version (Accepted Manuscript online 22 January 2025). It now has heaps of references to AKD1000 and describes how it was used for visual object detection and classification.

Nikil Dutt, one of Jeff Krichmar’s colleagues at UC Irvine and also a member of the CARL team, contributed to this Accepted Manuscript version as an additional co-author.



What caught my eye was that the researchers, who had used an AKD1000 PCIe Board (with an engineering sample chip) as part of their hardware stack, had already gotten their hands on an Akida M.2 form factor as well, even though BrainChip’s latest offering wasn’t officially revealed until January 8th at CES 2025:

“For productive deployments, the Raspberry Pi 5 Compute Module and Akida M.2 form factor were used.” (page 9)


Maybe thanks to Kristofor Carlson?




Here are some pages from the Accepted Manuscript version:


View attachment 76552


View attachment 76553



View attachment 76554


View attachment 76558


View attachment 76556
View attachment 76557


We already knew from the April 2024 version of that paper that…



And finally, here’s a close-up of the photo on page 9:

View attachment 76555

Just an afterthought…

Academic research utilising Akida shouldn’t generally be underestimated or dismissed as mere playtime in an ivory tower.

Some of these researchers have excellent connections to big players in the industry and/or to government agencies, and some even have prior work experience in relevant sectors themselves - hence their recommendations would likely carry quite a bit of weight.

Take Jeff Krichmar 👆🏻 for example, whose 27-page (!) CV can be found on his LinkedIn profile.


Krichmar’s first job after graduating with a Bachelor in Computer Science (and before going to grad school to pursue his Master’s) was as a software engineer at Raytheon Corporation (now RTX), working on the PATRIOT surface-to-air missile system. That position also saw him become a consultant to the Japanese Self-Defense Forces from 1988-1989, while deployed to Mitsubishi Heavy Industries in Nagoya (which to this day manufactures PATRIOT missiles for domestic use under license from RTX and Lockheed Martin).


View attachment 76748


Over the years, he has received quite a bit of funding from the defence-related sector, mostly from the US government, but also from Northrop Grumman.

View attachment 76751

In 2015 he gave an invited talk at Northrop Grumman…

View attachment 76752

… and he was co-author of a paper published in November 2016, whose first author, his then graduate student Tiffany Hwu, was a Basic Research Systems Engineer Intern with Northrop Grumman at the time. (“This work was supported by the National Science Foundation Award number 1302125 and Northrop Grumman Aerospace Systems.”)

The neuromorphic hardware used for the self-driving robot was unsurprisingly IBM’s TrueNorth, as this was then the only neuromorphic chip around - Loihi wasn’t announced until September 2017.

View attachment 76756

One of the paper’s other co-authors was a former postdoctoral student of Krichmar’s, Nicolas Oros, who had started working for BrainChip in December 2014 - on his LinkedIn profile it says he was in fact our company’s first employee! He is also listed as co-inventor of the Low power neuromorphic voice activation system and method patent alongside Peter van der Made and Mouna Elkhatib.

Nicolas Oros left BrainChip in February 2021 and is presently a Senior Product Manager at Aicadium, “leading the development of a computer vision SaaS product for visual inspection”. I don’t think we’ve ever looked into them? 🤔

View attachment 76754


View attachment 76755


By the time of said paper’s publication, Jeff Krichmar had become a member of BrainChip’s Scientific Advisory Board - see this link to an April 2016 BRN presentation, courtesy of @uiux:


View attachment 76753

As mentioned before, Kristofor Carlson is another former postdoctoral student of Jeff Krichmar’s (from 2011-2015), who co-authored a number of research papers with Krichmar and Nikil Dutt (both UC Irvine) over the years - the last one published in 2019.

In September, Kris Carlson gave a presentation on TENNs at UC Irvine, as an invited speaker at SAB 2024: From Animals to Animats - 17th International Conference on the Simulation of Adaptive Behavior.

View attachment 76815

Kris Carlson’s September 2024 conference talk on TENNs, the CARL lab’s recent video and paper featuring an E-Puck2 robot with an Akida PCIe Board mounted on top, and the additional info in the 22 January 2025 paper that CARL researchers had already experimented with the brand-new AKD1000 M.2 form factor are ample evidence of continued interest in what BrainChip is doing from Jeff Krichmar’s side.

Academic researchers like him could very well be door openers to people in charge of other entities’ research that will result in meaningful revenue one day…

The above paper 👆🏻 on CARLsim++ (which got implemented on Akida hardware in the researchers’ experiments) just got a lot more exposure thanks to LinkedIn posts by two of its co-authors…


4BC76BB9-A302-4400-9969-41B70327F09F.jpeg


… and further dissemination by others:


D14CD3FA-5A42-4903-9297-0E3BC10C6069.jpeg
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 25 users

BrainShit

Regular
The above paper 👆🏻 on CARLsim++ (which got implemented on Akida hardware in the researchers’ experiments) just got a lot more exposure thanks to LinkedIn posts by two of its co-authors…


View attachment 77005

In addition, the related paper:
1000068833.jpg


Source: https://iopscience.iop.org/article/10.1088/2634-4386/adad0f
 

Attachments

  • Screenshot_20250131_100953_DuckDuckGo.jpg
Last edited:
  • Like
  • Fire
  • Love
Reactions: 45 users

Frangipani

Top 20
A fellow forum user, who in recent months repeatedly referred to his brief LinkedIn exchange with Mercedes-Benz Chief Software Officer Magnus Östberg (and thereby willingly revealed his identity to all of us here on TSE, which in turn means I’m not guilty of spilling a secret that should have been kept private), asked Mercedes-Benz a question in the comment section underneath the company’s latest LinkedIn post on neuromorphic computing. This time, however, he decided not to share the carmaker’s reply with us here on TSE. You gotta wonder why.

Could it possibly have to do with the fact that MB’s reply refutes the hypothesis he had been advancing for months? Namely, that Mercedes-Benz, who have been heavily promoting their future SDV (software-defined vehicle) approach and the OTA (over-the-air) updates it enables, would “more than likely” have used Akida 2.0/TENNs simulation software in the upcoming MB.OS release as an interim solution until the not-yet-existing Akida 2.0 silicon became available at a later stage - the underlying reason being competitive pressure to be first to market…

The way I see it, the January 29 reply by MB clearly puts this speculation to bed:



C173E434-EDA3-4EA2-88FC-86DE24B2614E.jpeg



Does that sound as if an MB.OS “Akida inside” reveal at the upcoming world premiere of the CLA were on the cards?


Setting aside the questions

a) about any requirements for testing and certification of car parts making up the infotainment system (being used to German/EU bureaucracy, I find it hard to believe there wouldn’t be any at all - maybe someone who is knowledgeable about automotive regulations within Germany and the EU could comment on this) and

b) whether any new MB model containing our tech could roll off the production line despite no prior IP license deal having been signed (or at least an Akida 1.0 chip sales deal; there has never been a joint development announcement either, which could possibly have circumvented the necessity of an upfront payment showing up in our financials)…

… various MB statements in recent months (cf. Dominik Blum’s presentation at HKA Karlsruhe, which I shared in October: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352 - that being the university of applied sciences MB has since confirmed it is cooperating with on neuromorphic camera research - journalists quoting MB engineers after visiting their Future Technologies Lab, as well as relevant posts and comments on LinkedIn) have diminished the likelihood of neuromorphic tech making its debut in any soon-to-be-released Mercedes-Benz model.

If NC were to enhance voice control and infotainment functions in their production vehicles much sooner than safety-critical ones (ADAS), MB would surely have clarified this in their reply to the LinkedIn question above, which specifically referred to the soon-to-be-released CLA - the first model to come with the next-generation MB.OS, which also boasts the new AI-powered MBUX Virtual Assistant (developed in collaboration with Google).

Instead, they literally wrote:

“(…) To fully leverage the potential of neuromorphic processes, specialised hardware architectures that efficiently mimic biologically inspired systems are required (…) we’re currently looking into Neuromorphic Computing as part of a research project. Depending on the further development progress, integration could become possible within a timeframe of 5 to 10 years.”

They are evidently exploring full-scale integration to maximise the benefits of energy efficiency, latency and privacy. The voice control implementation of Akida in the Vision EQXX was their initial proof of concept to demonstrate the feasibility of NC in general (cf. the podcast with Steven Peters, MB’s former Head of AI Research from 2016-2022: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-407798). Whether they’ll eventually partner with us or a competitor (provided they are happy with their research project’s results) remains to be seen.

So I certainly do not expect the soon-to-be-revealed CLA 2025 with the all-new MB.OS to have “Akida inside”, although I’d be more than happy to be proven wrong, as we’d all love to see the BRN share price soar on such news…
Time - and the financials - will ultimately tell.
 
  • Like
  • Thinking
  • Love
Reactions: 28 users

itsol4605

Regular
Frangipani said:
“(…) Depending on the further development progress, integration could become possible within a timeframe of 5 to 10 years.”
5 to 10 years ... 2030..2035 maybe
 
  • Like
Reactions: 3 users
Couldn't find this 2024 conference paper posted before, but maybe my search didn't capture it.

Anyway, it was a positive presentation at an IEEE conference; below are a couple of snips. As the authors advise, "This is an accepted manuscript version of a paper before final publisher editing and formatting. Archived with thanks to IEEE."

HERE

As the authors acknowledge:

"This work has been partially supported by King Abdullah University of Science and Technology CRG program under grant number: URF/1/4704-01-01.

We would like also to thank Edge Impulse and Brainchip companies for providing us with the software tools and hardware platform used during this work.*

One of the authors, M. E. Fouda, caught my eye given his/her affiliation "3" - maybe an employer... hmmmm.

D. A. Silva¹, A. Shymyrbay¹, K. Smagulova¹, A. Elsheikh², M. E. Fouda³,† and A. M. Eltawil¹
¹ Department of ECE, CEMSE Division, King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
² Department of Mathematics and Engineering Physics, Faculty of Engineering, Cairo University, Giza 12613, Egypt
³ Rain Neuromorphics, Inc., San Francisco, CA, 94110, USA
† Email: foudam@uci.edu


End-to-End Edge Neuromorphic Object Detection System

Abstract—Neuromorphic accelerators are emerging as a potential solution to the growing power demands of Artificial Intelligence (AI) applications. Spiking Neural Networks (SNNs), which are bio-inspired architectures, are being considered as a way to address this issue. Neuromorphic cameras, which operate on a similar principle, have also been developed, offering low power consumption, microsecond latency, and robustness in various lighting conditions.

This work presents a full neuromorphic system for Computer Vision, from the camera to the processing hardware, with a focus on object detection. The system was evaluated on a compiled real-world dataset and a new synthetic dataset generated from existing videos, and it demonstrated good performance in both cases. The system was able to make accurate predictions while consuming 66 mW, with a sparsity of 83%, and a time response of 138 ms.
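As an aside on what "from the camera to the processing hardware" typically involves: event cameras emit sparse (x, y, t, polarity) events rather than frames, and these are commonly binned into short time windows before being fed to an SNN. Here's a generic illustrative sketch in Python (my own, with made-up names and shapes - not this paper's actual preprocessing):

import numpy as np

# Generic sketch: binning event-camera output (x, y, t, polarity) into
# fixed-interval event frames, a common front end for SNN object detectors.

def events_to_frames(events, width, height, window_us=10_000):
    """events: array of rows (x, y, t_us, polarity); returns (n_frames, 2, H, W) counts."""
    t0, t1 = events[:, 2].min(), events[:, 2].max()
    n = int((t1 - t0) // window_us) + 1
    frames = np.zeros((n, 2, height, width), dtype=np.uint8)
    for x, y, t, p in events:
        i = int((t - t0) // window_us)          # which time window this event falls in
        frames[i, int(p), int(y), int(x)] += 1  # accumulate per-polarity event counts
    return frames

# Tiny synthetic example: 1,000 random events over a 64x64 sensor.
rng = np.random.default_rng(0)
ev = np.column_stack([rng.integers(0, 64, 1000), rng.integers(0, 64, 1000),
                      np.sort(rng.integers(0, 100_000, 1000)), rng.integers(0, 2, 1000)])
frames = events_to_frames(ev, 64, 64)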

IMG_20250131_231420.jpg



VI. CONCLUSION AND FUTURE WORK
This work showed a low-power and real-time-latency full spiking neuromorphic system for object detection based on IniVation’s DVXplorer Lite event-based camera and Brainchip’s Akida AKD1000 spiking platform. The system was evaluated on three different datasets, comprising real-world and synthetic samples. The final mapped model achieved an mAP of 28.58 on the GEN1 dataset, equivalent to 54% of a more complex state-of-the-art model, and 89% of the best-reported detection performance on the single-class dataset PEDRo, with 17x fewer parameters. A power consumption of 66 mW and a latency of 138.88 ms were reported, suitable for real-time edge applications.

For future work, different models are expected to be adapted to the Akida platform, from which more recent releases of the YOLO family can be implemented. Moreover, those models are expected to be evaluated in real-world scenarios instead of recordings, along with the acquisition of more data to evaluate this setup under different challenging situations.
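A quick back-of-the-envelope check of the reported figures (my own arithmetic, not from the paper):

# Sanity-checking the numbers quoted above.
map_gen1 = 28.58            # reported mAP on GEN1
ratio_vs_sota = 0.54        # "54% of a more complex state-of-the-art model"
implied_sota_map = map_gen1 / ratio_vs_sota    # ~52.9 mAP for the larger reference model

power_w = 0.066             # 66 mW
latency_s = 0.13888         # 138.88 ms
throughput_fps = 1.0 / latency_s               # ~7.2 inferences per second
energy_per_inf_mj = power_w * latency_s * 1e3  # ~9.2 mJ per inference

print(f"implied SOTA mAP: {implied_sota_map:.1f}")
print(f"throughput: {throughput_fps:.1f} fps, energy/inference: {energy_per_inf_mj:.1f} mJ")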
 
  • Like
  • Fire
  • Love
Reactions: 25 users

Baneino

Regular
In my opinion, we should not become too fixated on Mercedes-Benz. Our major growth will not depend on whether Mercedes-Benz is a partner or a customer; rather, the many individual customers will be what tips the scales. Our technology will be used by thousands of companies at some point, and they will pay for it. Mercedes-Benz is a well-known company, but I wouldn't put too much weight on it.

Best regards from Germany
 
  • Like
  • Fire
  • Love
Reactions: 43 users

GazDix

Regular
Has the list come out today?
Hi HG.
I emailed Investor Relations and they haven't responded yet.
Every quarter we get one, sometimes later than usual.
 
  • Like
  • Fire
Reactions: 6 users
Another positive paper from 2024 I hadn't seen before, but maybe it was posted somewhere.

Interesting small satellite project group, appearing to be working with NASA; they also have another related paper, which I'm pretty sure has been discussed before... BrainSat.

This paper:

HERE

Who is the Space Generation Advisory Council?


We are the VOICE of the largest network of students, young professionals and alumni in the space industry.​

  • 29,000+ Members Globally
  • 165+ Countries Worldwide
  • 30+ SGAC Events Annually
  • 11 Active Project Groups


Spiking Neural Network Design for on-board detection of methane emissions through Neuromorphic Computing

Andrew Karim ᵃ,*, Amel AlKholeify ᵃ, Jimin Choi ᵃ, Jatin Dhall ᵃ, Tan Huda ᵃ, Arnav Ranjekar ᵃ, Yassine Yousfi ᵃ, Daniel Wischert ᵃ
ᵃ Space Generation Advisory Council, Small Satellite Project Group
* Corresponding author (andrew.karim@spacegeneration.org)


Abstract

Small satellite constellations have shown tremendous success for Earth observation missions and can mimic the performance of large satellite platforms while being cheaper and faster to deploy.

Detecting fugitive methane emissions from ageing oil and gas infrastructures is an important use-case, as it helps facility operators to locate and mitigate these leaks, ultimately addressing the global climate crisis. For such applications, it is crucial for the data to be transmitted to the end-user in a timely manner for appropriate actions to take place. Edge Artificial Intelligence (AI) can address this issue by pre-processing the data on-board the spacecraft, dramatically reducing the amount of data to be sent to the ground and thus improving response times and bandwidth. As small satellites are greatly constrained by a low size, weight and power (SWaP), their ability to process AI algorithms with fewer computational resources is essential.

AI applications often depend on real-time data processing and analysis, yet most modern computers are inefficient at these tasks. Neuromorphic Computing's (NC) parallel and distributed architecture helps perform complex calculations faster while using less power, enabling more efficient in-orbit data processing and onboard adaptive learning. This paper evaluates the potential of a novel 6U CubeSat neuromorphic on-board computer, applied to a case study for point-source methane emissions monitoring. It involves the design of a Spiking Neural Network (SNN) tailored for small satellite platforms.
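For anyone wondering what "event-driven computation" means in practice, here's a minimal textbook-style leaky integrate-and-fire (LIF) neuron in Python - purely illustrative, not the SNN designed in this paper:

import numpy as np

# Minimal leaky integrate-and-fire (LIF) layer: computation is driven by
# incoming spikes, and neurons only emit output spikes when their membrane
# potential crosses a threshold. Generic textbook sketch, not the paper's model.

def lif_step(v, spikes_in, weights, leak=0.9, threshold=1.0):
    v = leak * v + weights @ spikes_in   # leak the potential, integrate weighted input spikes
    fired = v >= threshold               # emit a spike where the threshold is crossed
    v = np.where(fired, 0.0, v)          # reset membrane potential after a spike
    return v, fired

rng = np.random.default_rng(0)
weights = rng.normal(0, 0.5, size=(4, 16))  # 16 inputs -> 4 neurons
v = np.zeros(4)
for t in range(100):
    spikes_in = (rng.random(16) < 0.05).astype(float)  # sparse input events
    v, fired = lif_step(v, spikes_in, weights)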

The necessary processing steps for detection of point-source methane emissions using hyperspectral imagery are presented, and the potential of edge computing is discussed. A full processing pipeline enabling real-time monitoring is proposed, which involves an SNN for segmentation of methane plumes. The architectural specifications necessary for neuromorphic computing in the resource-constrained satellite environment are outlined. NC principles are incorporated into the SNN architecture design, leveraging event-driven computation to improve energy efficiency and computational throughput. The processor's performance is compared with traditional computing approaches.

Finally, a plan to train the model using an annotated dataset from the AVIRIS-NG hyperspectral instrument is presented, although training and validation are left for future work.

By embracing neuromorphic computing, an innovative approach inspired by the brain, satellites can achieve unparalleled efficiency and accuracy. The SNN model presented contributes to the scarce body of research on edge AI for greenhouse gas monitoring, providing a step forward towards real-time Earth observation.

Excerpt:

The designed OBP includes two PC104 modules, connected through a mezzanine connector. It integrates both CPU and FPGA capabilities and can cater for on-board computer functions, payload data processing and downlink management. The OBP is equipped with the Akida 1000 neuromorphic processor, selected for its design maturity, real-time processing capabilities and flight heritage.


The chip is tailored for event-based processing, featuring 80 Neuromorphic Processing Units (NPUs) with 100 KB of SRAM each, supporting up to 1.2 million virtual neurons and 10 billion virtual synapses with up to 8-bit precision, ideal for inference-heavy tasks. An FPGA provides glue logic to implement necessary data protocols and serve as a soft CPU for OBC functions.

Additionally, a 12 GB flight-proven Micron Solid State Device (SSD) is included to provide non-volatile on-board memory. This sub-system is constrained to a 0.5 U volume and is estimated to consume a low power of less than 4 W.
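Some quick arithmetic implied by the specs quoted above (my numbers, not the paper's):

# Totals implied by the quoted AKD1000 specs.
npus = 80
sram_per_npu_kb = 100
total_sram_mb = npus * sram_per_npu_kb / 1000  # 8.0 MB of on-chip SRAM in total

virtual_neurons = 1.2e6
virtual_synapses = 10e9
synapses_per_neuron = virtual_synapses / virtual_neurons  # ~8,333 on average

print(f"{total_sram_mb:.1f} MB SRAM, ~{synapses_per_neuron:.0f} virtual synapses per neuron")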

The processor’s software architecture was designed to minimize resource consumption. It is equipped with Real-Time Executive for Multiprocessor Systems (RTEMS) as its Real-Time Operating System (RTOS), on top of which NASA’s core Flight System (cFS) is used. cFS provides a flight-proven product on which OBP functions are built using open-source applications. For more information about the BrainSat architecture, refer to our co-published proceeding [12].


6. Conclusion

In this paper, we have proposed a hyperspectral imagery on-board data processing pipeline that can pave the way for fully autonomous methane detection systems using small satellites. After converting the raw hyperspectral data to at-sensor radiance, a matched filter algorithm is applied, which compares the methane enhancement above background against a simulated target absorption spectrum. This result is then fed to a Spiking Neural Network, inspired by the Hyper-STARCOP architecture but tailored to neuromorphic computers and trained using a recent hybrid co-training approach.

This polishes the methane plume masks by reducing false positives; the resulting masks can be downlinked to mission operators for further processing. The principal bottleneck remains running a radiative transfer model on-board the platform to simulate the methane absorption spectrum according to observation parameters, which is not evaluated in this work.
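For context, the per-pixel matched filter step they describe is a classical hyperspectral technique. A generic sketch of the standard formulation (not the authors' implementation; array names and shapes are illustrative):

import numpy as np

# Classical per-pixel matched filter for a gas target spectrum, as commonly
# used for methane plume detection in hyperspectral imagery.

def matched_filter(radiance, target):
    """radiance: (pixels, bands) at-sensor radiance; target: (bands,) CH4 absorption signature."""
    mu = radiance.mean(axis=0)
    x = radiance - mu                                          # background-subtracted spectra
    cov = np.cov(x, rowvar=False) + 1e-6 * np.eye(x.shape[1])  # regularised background covariance
    cov_inv_t = np.linalg.solve(cov, target)                   # C^-1 t
    # Score per pixel = (x . C^-1 t) / (t . C^-1 t): methane enhancement above background.
    return x @ cov_inv_t / (target @ cov_inv_t)

# Toy usage: 5,000 pixels, 50 bands; thresholded scores give candidate plume
# pixels, which an SNN segmentation stage could then refine.
scores = matched_filter(np.random.rand(5000, 50), np.random.rand(50))
plume_mask = scores > scores.mean() + 3 * scores.std()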

The approach presented reduces the steep bandwidth requirements for hyperspectral imagers by cutting the size of the downlinked data by a factor of ten. Limited to a maximum power usage of 4 W and a volume of 0.5 U, the proposed hardware is ideal for CubeSats and enables several other Earth Observation applications, such as onboard cloud detection and image compression or classification.
 
  • Like
  • Fire
  • Love
Reactions: 35 users

IloveLamp

Top 20
  • Like
  • Fire
  • Love
Reactions: 44 users

TECH

Regular
Quoting the post above:
“This work showed a low-power and real-time-latency full spiking neuromorphic system for object detection based on IniVation’s DVXplorer Lite event-based camera and Brainchip’s Akida AKD1000 spiking platform.”
There's our mate again AKD 1000...................doing us ALL PROUD !!!!!!
 
  • Like
  • Love
Reactions: 21 users
Interesting read on the concerns over Deepseek:


"US officials are now investigating whether DeepSeek purchased NVIDIA chips through intermediaries in Singapore, effectively circumventing the AI restrictions the government had employed, Bloomberg reported."

"Nvidia’s chips, which they bought tons of, and they found their ways around it, drive their DeepSeek model […] It has got to end. If they are going to compete with us, let them compete, but stop using our tools to compete with us. So I am going to be very strong on that,” Lutnick said. If confirmed as Commerce Secretary, he would be at the helm of enforcing semiconductor restrictions."

"Typically, AI development has been understood to be very expensive and resource-intensive. Investors have expressed worry about these high costs given the sector’s slow returns. The arrival of DeepSeek and R1 has put this framework into question. After the model’s release on Monday, Nvidia’s market value decreased by nearly $600bn, dropping a staggering 17%. It was the biggest single-day loss in the history of the US stock market."
 
  • Like
  • Thinking
Reactions: 12 users

Slade

Top 20
Quoting the post above:
"US officials are now investigating whether DeepSeek purchased NVIDIA chips through intermediaries in Singapore, effectively circumventing the AI restrictions the government had employed, Bloomberg reported."
Wondering if BrainChip management are still claiming a three year lead over its competitors.
 
  • Like
  • Thinking
  • Love
Reactions: 8 users
Sally Ward-Foxton wrote an article liked by Rob Telson on LinkedIn, headlined "Top 10 edge AI chips". Now, I thought SWF, being a friend of Brainchip and with her vast interactions with Brainchip, would know where Akida fits into the hierarchy of edge chip technology! Evidently, according to SWF, Akida doesn't rank in the top 10. Am I missing something?
IMG_4811.png

https://www.electronicproducts.com/top-10-edge-ai-chips/
 
  • Thinking
  • Like
  • Wow
Reactions: 10 users

charles2

Regular
Wondering if BrainChip management are still claiming a three year lead over its competitors.
Don't know about that but Brainchip and its cloistered management are making me feel 19 again.

19 cents that is.
 
  • Like
  • Haha
Reactions: 8 users
One can only hope we are the secret sauce in one of these companies, as it is certainly hard to digest why we're not mentioned.
 
  • Like
Reactions: 3 users

manny100

Regular
Wondering if BrainChip management are still claiming a three year lead over its competitors.
Tony Lewis posted that DeepSeek is a boon for us.
He made a lot of points.
The post was shared here earlier.
Can't remember the details, but it was very positive.
 
  • Like
Reactions: 9 users

JoMo68

Regular
Are we not an IP company? Those companies produce chips. Maybe that’s why we’re not on the list 🤔
 
  • Like
  • Fire
  • Wow
Reactions: 30 users