BRN Discussion Ongoing

Today, numerous articles were published online about some of the promising future technologies Mercedes-Benz is exploring, after the carmaker recently invited journalists to its Future Technologies Lab in Sindelfingen.

And of course - you guessed it - neuromorphic computing was one of them.
(I also find solar coating another interesting concept).

There was also a press release by MB itself:

[Images from the Mercedes-Benz press release attached.]

Playmobil figures in action… 😀



Earlier today, German magazine auto, motor und sport published both an online article and a video on MB & neuromorphic computing (both in German) that confirm what I’ve been suspecting: Mercedes-Benz is nowhere near implementing neuromorphic technology at scale in its production cars…




“Der Weg in die Serie ist noch weit.” - “The road to series production is still a long one.”

“Bis neuromorphe Chips ihren Weg ins Auto finden, wird es wohl noch einige Jahre dauern.” - “It will probably take a few more years before neuromorphic chips find their way into (production) cars.”






“Diese Technologie steht jedoch noch am Anfang und erfordert umfangreiche Tests und Zertifizierungen, bevor sie in Autos eingesetzt werden kann.”

“However, this technology is still in its infancy and requires extensive testing and certification before it can be used in cars.”

Something similar is said in the video itself around the 5 min mark.


Other articles are behind a paywall, but maybe one of you happens to be a subscriber and could check whether there are any additional snippets of interest worth sharing?





My guess is that the next Mercedes-Benz LinkedIn post regarding NC will give us some more details about the research collaboration with HKA (Hochschule Karlsruhe) on neuromorphic cameras. Or it might be a post announcing the collaboration between Mercedes and Neurobus that I had spotted on Nov 12 (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-441454).
Hi @Frangipani

Thanks for the info.

The comments about certification etc. tie in with a role advertised by BRN earlier this year, where part of the requirements was a working knowledge of automotive / ISO design processes etc.

It is something I believe Akida would need to meet, either as part of a package or standalone.

 
  • Like
Reactions: 7 users
This gentleman popped up in a general search. I noticed he has been with us for 5 months but is also a researcher at UC San Diego.

He has worked on TENNs with us but is also "working" on ICA for State Space Models for audio denoising.

At UC San Diego he is working on video / LLMs for captioning. I wonder if that's a fit for us too?



Machine Learning @ BrainChip | MS Machine Learning & Data Science @ UC San Diego​

BrainChip UC San Diego​

San Diego, California, United States

  • Machine Learning Engineer​

    BrainChip

    Jul 2024 - Present 5 months
    - Working on implementing Independent Component Analysis (ICA) for effective basis extraction for State Space Models (SSM) to enhance audio denoising performance.
    - Worked on integrating sparsity techniques with Temporal Event-based Neural Network (TENN).
  • UC San Diego

    Graduate Student Researcher​

    UC San Diego

    Jan 2024 - Present 11 months
    San Diego, California, United States
    - Developed a video captioning pipeline leveraging Large Language Models (LLMs) to capture high-level dynamics/actions in a video.
    - Augmented the captioning pipeline to handle 1M videos on Nvidia H100 cluster.
    - Developed a video filtering pipeline to extract suitable videos from Panda-70M dataset for our captioning pipeline.
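Out of curiosity, here's roughly what the "ICA for basis extraction" line in the profile above could look like in practice - just a minimal blind-source-separation sketch using scikit-learn's FastICA, not BrainChip's actual audio-denoising pipeline; the signals and mixing matrix are invented for illustration.

```python
# Minimal ICA sketch: separate a clean-ish component from noise in two mixtures.
# Illustrative only -- not BrainChip's pipeline; signals and mixing are made up.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 16000)                        # 1 s of audio at 16 kHz

tone = np.sin(2 * np.pi * 220 * t)                  # stand-in for the wanted signal
noise = 0.5 * rng.standard_normal(t.size)           # broadband noise component
sources = np.c_[tone, noise]                        # shape (n_samples, 2)

mixing = np.array([[1.0, 0.6],
                   [0.8, 1.0]])                     # two "microphone" mixtures
observed = sources @ mixing.T

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(observed)            # recovered independent components

# Keep the component that best matches the clean reference; treat the other as noise.
corr = [abs(np.corrcoef(components[:, i], tone)[0, 1]) for i in range(2)]
print("correlation of each component with the clean tone:", np.round(corr, 3))
```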
 
  • Like
  • Fire
  • Love
Reactions: 15 users

7für7

Top 20


Sign the contract big boy… sign the contract….
 
  • Haha
  • Like
  • Fire
Reactions: 10 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Here's a Neurobus Internship starting January 2025. It states that "knowledge of artificial intelligence, neuromorphic computing and autonomous systems a plus".

Good to see major players like Airbus, ESA, and Mercedes-Benz listed here - all of whom are partners of ours and with whom Neurobus also has strategic partnerships.

It makes me wonder if our work with the likes of Neurobus, ESA, Frontgrade Gaisler and Airbus on the next-generation microprocessors will eventually see them used not only in space operations but also incorporated into Mercedes-Benz vehicles. The next-generation microprocessors are supposed to set a new standard for modern space-grade computing devices. It wouldn't be unheard of for them to be used in automobiles and smartphones IMO, as has been the case with other space-grade microprocessors, because of their demonstrated robustness.

Food for thought.










 
  • Like
  • Fire
  • Love
Reactions: 33 users

Diogenese

Top 20
Hi @Frangipani

Thanks for the info.

The comments about certification etc. tie in with a role advertised by BRN earlier this year, where part of the requirements was a working knowledge of automotive / ISO design processes etc.

It is something I believe Akida would need to meet, either as part of a package or standalone.

Some years ago, the company said that obtaining ISO certification would be done by the vehicle maker. However, that does not relieve the company of responsibility for ensuring the designs are ISO compliant.

Then there is the continuous development of the tech which inhibits customers from committing to silicon - hence, SDV.

I wonder how long it takes to get ISO cert for software, and for software updates. It would require documentation of the software and of every code change, as well as exhaustive testing. I would think that software testing would not take as long as hardware testing.

Hopefully TENNs/Akida2 will be part of the SDV software in the near term. TENNs 01 software has been around for a couple of years and would have undergone extensive testing, but we don't know what the latest version is or when it was available, let alone what's still in the pipeline.
 
  • Like
  • Fire
  • Love
Reactions: 27 users

Getupthere

Regular
Congratulations and well done to Sean!!

Managed to go through the whole of 2024 without achieving anything,

apart from getting his well-deserved shares this week.

Great to see management followed through on their commitment to communicate better with their shareholders.

The 3-year lead has well and truly gone.
 
  • Like
  • Haha
  • Fire
Reactions: 12 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
A TEMPORAL convolutional neural network was used... Tell me more!



Down, But Not Out​

This low-cost wrist-worn fall detector uses a Photon 2, accelerometer, and Edge Impulse to instantly alert first responders to an emergency.​


Nick Bild
5 hours ago • Wearables







 
  • Like
  • Fire
  • Love
Reactions: 23 users

manny100

Regular
That's a great point @manny100.

It might be a simplistic way of looking at things, but for me it boils down to two particular areas:
  • performance and efficiency, measured in TOPS/watt, and
  • the unique processing capabilities that BrainChip's technology can bring to the table
If BrainChip's technology can deliver on both of these areas in a meaningful way, then there should be no reason for Qualcomm not to want to integrate it into their products. As has been mentioned numerous times, we are trying to position ourselves as a partner rather than a competitor to behemoths like Qualcomm and NVIDIA.

In terms of determining the TOPS/watt capabilities of Qualcomm's chips, it's a bit tricky because the manufacturer doesn't seem to publish power consumption figures. In this instance, the authors of one article (published 20 June 2024, linked below) estimated via their own power consumption measurements that "the most efficient power range of the Snapdragon X Elite chips seems to be 20-30 Watts".

Akida Pico, on the other hand, operates in the microwatt (μW) to milliwatt (mW) range.
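Just to make that gap concrete, here's a trivial back-of-the-envelope calculation of energy per inference. The power levels echo the ranges quoted above (~25 W vs ~1 mW); the latencies are hypothetical placeholders I've made up purely to show how many orders of magnitude the power difference translates into for an always-on workload.

```python
# Back-of-the-envelope energy-per-inference sketch. Power levels echo the
# ranges quoted above; the latencies are hypothetical placeholders.
def energy_per_inference_mj(power_watts: float, latency_ms: float) -> float:
    """Energy in millijoules = power (W) x time (s) x 1000."""
    return power_watts * (latency_ms / 1000.0) * 1000.0

laptop_npu = energy_per_inference_mj(power_watts=25.0, latency_ms=10.0)    # hypothetical 10 ms
tiny_edge = energy_per_inference_mj(power_watts=0.001, latency_ms=100.0)   # hypothetical 100 ms

print(f"~25 W NPU, 10 ms/inference   : {laptop_npu:.3f} mJ per inference")
print(f"~1 mW device, 100 ms/inference: {tiny_edge:.3f} mJ per inference")
```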

When it comes to the unique processing that BrainChip's technology allows for, we can look to Max Maxfield's latest article "Taking the Size and Power of Extreme Edge AI/ML to the Extreme Minimum", dated 21 Nov 2024. The obvious benefit is that you can "feed the event-based data from the camera directly to the event-based Akida processor, thereby cutting latency and power consumption to a minimum", compared to other available techniques.
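To picture what "feeding event-based data directly to an event-based processor" means versus the conventional route, here's a conceptual sketch. The Event fields and the process_event() hook are my own assumptions for illustration, not Prophesee's or BrainChip's actual APIs.

```python
# Conceptual sketch of frame-based vs event-based handling of camera events.
# The Event fields and process_event() hook are assumptions, not a real API.
from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Event:
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 brightness increase, -1 decrease
    t_us: int      # timestamp in microseconds

def frame_based(events: Iterable[Event], width=64, height=64, window_us=33_000) -> List:
    """Conventional route: accumulate events into ~30 fps frames, then run a CNN per frame."""
    frames, frame, t0 = [], [[0] * width for _ in range(height)], None
    for ev in events:
        t0 = ev.t_us if t0 is None else t0
        if ev.t_us - t0 >= window_us:               # close the frame every ~33 ms
            frames.append(frame)
            frame, t0 = [[0] * width for _ in range(height)], ev.t_us
        frame[ev.y][ev.x] += ev.polarity
    return frames                                    # latency is at least one frame window

def event_based(events: Iterable[Event], process_event: Callable[[Event], None]) -> None:
    """Event-based route: hand each sparse event straight to the processor as it arrives."""
    for ev in events:
        process_event(ev)                            # no frame accumulation, no frame latency
```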

The big question is whether Qualcomm would see any value in adopting this type of technology in their own products, and I think Judd Heape, VP for product management of camera, computer vision and video at Qualcomm Technologies, may actually have answered that question in an EE Times article dated 22 March 2023, where he stated the following.


EXTRACT - EE Times article dated 22 March 2023 (Interview with Judd Heape, VP for product management of camera, computer vision and video at Qualcomm Technologies).






EXTRACT - Notebook Check 20 June 2024




EXTRACT - EE Journal, "Taking the Size and Power of Extreme Edge AI/ML to the Extreme Minimum", 21 Nov 2024.


" “By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings, said Luca Verre, CEO and co-founder of Prophesee.”
If Prophesee sells Qualcomm a product containing AKIDA, there would likely be an NDA. Qualcomm may not release the product containing AKIDA until… who knows when?
There may well be a wait for the cash to start rolling in. I'm not sure about the license situation in these cases.
We would not know until…
 
  • Like
  • Fire
  • Love
Reactions: 10 users

Diogenese

Top 20
A TEMPORAL convolutional neural network was used... Tell me more!



Down, But Not Out​

This low-cost wrist-worn fall detector uses a Photon 2, accelerometer, and Edge Impulse to instantly alert first responders to an emergency.​


Nick Bild
5 hours ago • Wearables







Temporal Convolutional Networks can be software:



The seminal work of Lea et al. (2016) first proposed Temporal Convolutional Networks (TCNs) for video-based action segmentation. The two steps of this conventional process are: first, computing low-level features using (usually) a CNN that encodes spatial-temporal information and, second, feeding these low-level features into a classifier that captures high-level temporal information using (usually) an RNN. The main disadvantage of such an approach is that it requires two separate models. A TCN provides a unified approach to capture both levels of information hierarchically.
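For anyone who wants to see what that boils down to in code, here's a minimal causal, dilated 1-D convolution stack - a generic textbook-style TCN sketch, not Edge Impulse's or BrainChip's implementation; the layer sizes and the accelerometer-style input shape are arbitrary.

```python
# Minimal Temporal Convolutional Network sketch (generic, textbook-style).
# Not Edge Impulse's or BrainChip's implementation; sizes are arbitrary.
import torch
import torch.nn as nn

class CausalConv1d(nn.Module):
    """1-D convolution that only sees past samples (achieved by left padding)."""
    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.left_pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):                                  # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.left_pad, 0))       # pad on the left only => causal
        return self.conv(x)

class TinyTCN(nn.Module):
    """Stacked dilated causal convs: the receptive field grows exponentially with depth."""
    def __init__(self, in_ch=3, hidden=16, n_classes=2):
        super().__init__()
        self.body = nn.Sequential(
            CausalConv1d(in_ch, hidden, kernel_size=3, dilation=1), nn.ReLU(),
            CausalConv1d(hidden, hidden, kernel_size=3, dilation=2), nn.ReLU(),
            CausalConv1d(hidden, hidden, kernel_size=3, dilation=4), nn.ReLU(),
        )
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                                  # e.g. a 3-axis accelerometer window
        h = self.body(x)                                   # (batch, hidden, time)
        return self.head(h[:, :, -1])                      # classify from the last time step

model = TinyTCN()
window = torch.randn(1, 3, 128)                            # one 128-sample, 3-axis window
print(model(window).shape)                                 # torch.Size([1, 2])
```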
 
  • Like
  • Fire
  • Love
Reactions: 14 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Temporal Convolutional Networks can be software:



The seminal work of Lea et al. (2016) first proposed Temporal Convolutional Networks (TCNs) for video-based action segmentation. The two steps of this conventional process are: first, computing low-level features using (usually) a CNN that encodes spatial-temporal information and, second, feeding these low-level features into a classifier that captures high-level temporal information using (usually) an RNN. The main disadvantage of such an approach is that it requires two separate models. A TCN provides a unified approach to capture both levels of information hierarchically.


Ahhhhhh! Very interesting...

Is that good or bad? Just asking for a friend!


 
  • Haha
  • Love
  • Like
Reactions: 18 users

Diogenese

Top 20
Ahhhhhh! Very interesting...

Is that good or bad? Just asking for a friend!


Well, it's a 50/50 bet, but given I've been cheering on Valeo and Mercedes SDV for TENNs, and given we're such good mates with EI (Edge Impulse), I hope it is TENNs software.

There are a number of different Temporal Convolutional Networks. As Ella reminds us,

Tain't what you do.
It's the way that you do it
.

TENNs does it with Convergent Orthogonal Polynomials.*

EI automates the classification of objects when building NN models. The models then have to be converted to N-of-M code format for Akida, like we're doing with our other mates BeEmotion (née Nviso).

Automating model formation expedites the process of implementing an NN. Converting the new models to spike/event format is necessary to adapt them for Akida, and applying N-of-M coding compresses the size of the model, thus improving latency and power consumption.

Akida 2 uses 4-bit events for the highest-precision applications, but it can use 1-bit or 2-bit events where speed/power consumption are more important. I assume that TENNs is adapted, or is being adapted, to work with 1, 2, or 4 bits and N-of-M coding.

* Don't ask me.
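On the N-of-M coding point, here's a toy illustration of the general idea - keep only the N strongest of M activations and quantise the survivors to a few bits. It's my own sketch of the concept, not BrainChip's actual encoder.

```python
# Toy "N-of-M coding + low-bit events" sketch: keep the N strongest of M
# activations and quantise them to `bits` bits. My own illustration of the
# general idea, not BrainChip's actual encoder.
import numpy as np

def n_of_m_encode(acts: np.ndarray, n: int, bits: int):
    """Return (indices, codes, scale) for the n largest of len(acts) activations."""
    idx = np.argsort(acts)[-n:]                        # which N of the M values "fire"
    kept = acts[idx]
    levels = (1 << bits) - 1                           # e.g. 4 bits -> 15 levels
    scale = kept.max() if kept.max() > 0 else 1.0
    codes = np.round(kept / scale * levels).astype(np.uint8)
    return idx, codes, scale

acts = np.random.default_rng(1).random(64)             # M = 64 activations
idx, codes, scale = n_of_m_encode(acts, n=8, bits=4)
print("firing indices:", np.sort(idx))
print("4-bit event values:", codes)
```

Transmitting only those 8 indices and their 4-bit values, instead of 64 full-precision activations, is where the memory, latency and power saving comes from - at least in this toy version.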
 
  • Like
  • Fire
  • Love
Reactions: 23 users
It appears Tech Mahindra, like TCS, also sees a future that includes a ramp-up in neuromorphic.

It would be great if they're talking to us as well.

If BRN can't make some sort of inroads into what, on the surface, appears to be an increasing need, recognition and catalyst for neuromorphic, then we have some issues.

Snipped a couple of bits from link below.




Tech Mahindra

The Engine Behind AI: AI Accelerator​


November 19, 2024
The last few years have witnessed a profound shift in operational models across global industries. The reason? The exponential evolution of AI. Foundational models of AI in 2024 encompass various capabilities, including, but not limited to, computer code, images, video, and audio. As a direct impact, we can see a plethora of AI-assisted innovations in healthcare, automotive, finance, and more. A McKinsey study (2023) reveals that the new generations of AI can inject $2.6 – $4.4 trillion across all industries combined.

..........

The Future of AI Accelerators: What Lies Ahead?​

AI accelerators are still evolving, and their future promises even more powerful capabilities. Here are a few trends and innovations expected in the coming years:


  • Accelerators in Consumer Electronics
    Even more consumer devices will undoubtedly feature AI accelerators in the coming months. Whether a wearable device that is equipped with real-time health diagnostics or AR glasses analyzing surroundings in real-time, AI-based consumer electronics will change how we use technology in our lives. This trend dictates more AI accelerators in consumer electronics.
In addition to this, energy-efficient AI models are also gaining popularity. The high demand for the AI accelerator will lead to a substantial focus on the development of an energy-efficient model. These models can save operational costs while reducing the negative carbon footprint issues related to massive AI deployment. The major hardware companies will further improvise in developing architectures related to more power efficiency such as the use of neuromorphic computing and neural structures mimicking the human brain along with AI accelerators. So only one question remains. Are you ready for the future?
 
  • Like
  • Fire
  • Love
Reactions: 18 users

Flenton

Regular
I don't normally make predictions with this company anymore because they always ended in disappointment, but I've got a very strong feeling about this week, so here goes...
My prediction is that nothing will be announced and the share price will fall.
 
  • Haha
  • Like
  • Fire
Reactions: 19 users

Slymeat

Move on, nothing to see.
Some years ago, the company said that obtaining ISO certification would be done by the vehicle maker. However, that does not relieve the company of responsibility for ensuring the designs are ISO compliant.

Then there is the continuous development of the tech which inhibits customers from committing to silicon - hence, SDV.

I wonder how long it takes to get ISO cert for software, and for software updates. It would require documentation of the software and of every code change, as well as exhaustive testing. I would think that software testing would not take as long as hardware testing.

Hopefully TENNs/Akida2 will be part of the SDV software in the near term. TENNs 01 software has been around for a couple of years and would have undergone extensive testing, but we don't know what the latest version is or when it was available, let alone what's still in the pipeline.
If you are talking about quality assurance, as in ISO 9001 and the like, ISO certification is about the process only and can be achieved in a few months. I know this first-hand, having developed several such systems myself (for telecommunications companies) and having been an accredited internal and external auditor for more than a decade.
 
  • Like
  • Love
  • Fire
Reactions: 7 users

Bravo

If ARM was an arm, BRN would be its biceps💪!


Samsung's new Gauss 2 AI Model might be the next Galaxy brain​

News
By Eric Hal Schwartz
published 3 days ago
Gauss 2 could be the next Galaxy brain




Samsung's AI ambitions have taken a big step forward as the company introduced the Gauss 2 AI model at this year's Samsung Developer Conference. Gauss 2 builds upon its predecessor by offering improved performance and efficiency, with applications spanning smartphones, tablets, laptops, and home appliances.

Your Samsung Galaxy S24 FE may not use it for its AI features, but the device you buy in a few years might use Gauss 2 to help you out, including the rumored automated adjustments that make the Settings menu obsolete.

Gauss 2 is multimodal, so the AI can simultaneously process images, text, and computer code. That makes it better at incorporating AI-driven features on devices. In fact, there are three versions of the new model, differing in size and ability: Compact, Balanced, and Supreme.

The Compact model is aimed at performing on a device without the internet. In contrast, the Balanced model sometimes needs online resources to process data but is still supposed to be fast and efficient. Lastly, the Supreme version of the model pulls in resources and algorithm variations as needed to offer the best performance.

Depending on the version, Samsung says that Gauss 2 can communicate in up to 14 languages and is 1.5 to three times faster than its earlier iteration.

Gauss gassed​

“Samsung Electronics is committed to developing cutting-edge software, including AI and data analytics, to enhance user experiences,” said President and CTO of Samsung's Device eXperience (DX) Division and the head of Samsung Research Paul Kyungwhoon Cheun. “With three distinct models, Samsung Gauss2 is already boosting our internal productivity, and we plan to integrate it into products to deliver higher levels of convenience and personalization.”
Samsung said it has already deployed Gauss 2 internally. More than 60% of Samsung's DX division developers use Gauss 2 to help them code or get it to help translate text, write emails, and summarize documents. The AI is also used in call centers to categorize and summarize customer interactions.

 
  • Like
  • Fire
  • Love
Reactions: 12 users

buena suerte :-)

BOB Bank of Brainchip
Perth crew.
Yip, it is that time of the year again for the obligatory Xmas drinks.
Wednesday 11 December is booked for a 4 pm - 4.30 pm start. I need numbers, as they now take my credit card and charge $500 if I don't get the numbers (rather than winging it like normal), so just PM me if you are interested. Let us pray that, after the big break between drinks, we can do as we did for the $1 party, when on the day (19 Jan 2022) it turned into a $2 party.

Oh, the memories!! What an exciting time that was... It will happen again at some point! ($2+++) 🙏🙏🙏 Anyway @Earlyrelease, count me in x2, see you there :) :) 🍷🍷🍷 Cheers
 
  • Like
  • Love
  • Fire
Reactions: 5 users

7für7

Top 20
I don't normally make predictions with this company anymore because they always ended in disappointment, but I've got a very strong feeling about this week, so here goes...
My prediction is that nothing will be announced and the share price will fall.

Let her do predictions for you

 
  • Like
Reactions: 2 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
OK. What's the deal here? Why have we gone up nearly 9% today?


 
  • Haha
Reactions: 1 user