BRN Discussion Ongoing

TECH

Regular
Hi @Bravo,

the statement by Mercedes-Benz that Sally Ward-Foxton quoted in said September 2022 article was plucked straight from the 3 January 2022 VISION EQXX press release (https://media.mbusa.com/releases/re...range-and-efficiency-to-an-entirely-new-level), so it is now almost three years old.

View attachment 71758


However, that press release is nowhere to be found on the official webpage dedicated to the VISION EQXX (https://group.mercedes-benz.com/innovation/product-innovation/technology/vision-eqxx.html) - surprisingly, there is no reference to neuromorphic computing at all on that page.
This equally holds true for the German version of the VISION EQXX webpage: https://group.mercedes-benz.com/innovation/produktinnovation/technologie/vision-eqxx.html

In fact, there hasn’t been any reference whatsoever to neuromorphic computing on that webpage since 4 April 2022, as I was able to establish thanks to the Wayback Machine (https://web.archive.org - a cool internet archive that I stumbled upon the other day, which allows you to “Explore more than 916 billion web pages saved over time”).

Going back to the German version of that webpage, I can see that on 1 April 2022 MB still mentioned: “Elemente der Benutzeroberfläche unterstützen die nahtlose Interaktion zwischen Fahrer und Fahrzeug. Unter anderem durch Künstliche Intelligenz (KI), die die Funktionsweise des menschlichen Gehirns nachahmt.” (“Elements of the user interface support seamless interaction between driver and vehicle. This includes Artificial Intelligence (AI), which mimics the way the human brain works.”)

The webpage’s content was soon after replaced with a new text dated 4 April 2022 that no longer referred to brain-inspired/neuromorphic AI whatsoever. It has since been updated with links to articles about the VISION EQXX’s second and third long-distance road trips of over 1,000 km, in June 2022 (Stuttgart to Silverstone) and March 2024 (Riyadh to Dubai).
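For anyone who wants to reproduce this kind of Wayback Machine digging, the Internet Archive exposes a public availability API that returns the snapshot closest to a given date. A minimal Python sketch (the endpoint is real; the page URL and timestamp in the comment are just the example from above):

```python
import json
import urllib.parse
import urllib.request

API = "https://archive.org/wayback/available"

def availability_url(page: str, timestamp: str) -> str:
    """Build the Availability API query for the snapshot closest to
    `timestamp` (YYYYMMDD or YYYYMMDDhhmmss)."""
    return API + "?" + urllib.parse.urlencode({"url": page, "timestamp": timestamp})

def closest_snapshot(page: str, timestamp: str) -> dict:
    """Fetch the 'archived_snapshots' record (snapshot URL, timestamp, HTTP status)."""
    with urllib.request.urlopen(availability_url(page, timestamp)) as resp:
        return json.loads(resp.read()).get("archived_snapshots", {})

# e.g. the German VISION EQXX page around 1 April 2022:
# closest_snapshot(
#     "group.mercedes-benz.com/innovation/produktinnovation/technologie/vision-eqxx.html",
#     "20220401",
# )
```

Comparing the snapshot returned for one date against a later one is exactly how you can pin down when a page was changed.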


It is anyone’s guess why MB decided to no longer mention that the keyword spotting in their VISION EQXX concept car had been exceptionally energy-efficient due to it having been implemented on a neuromorphic chip (let alone on which one specifically), although they obviously continue to take great interest in this disruptive tech.

Did they possibly come to realise that it would take much longer to implement neuromorphic technology at scale than originally envisioned - whether from a technical perspective and/or a legal one (automotive-grade ISO certification, etc.)?

Did they perhaps not foresee at the time the growing number of competitors in the neuromorphic space besides BrainChip and Intel that could equally be of interest to them - competitors they would now first like to explore in depth before making any far-reaching decisions?

I also happened to notice that the reference to the VISION EQXX on https://brainchip.com/markets has been deleted. Thanks to the Wayback Machine, we can tell that this must have happened sometime between mid-July and August 25.
The question is: Why was that reference (consisting of a picture of the MB concept car that we know utilised Akida as well as the relevant press release excerpt) taken down from the BrainChip website? 🤔

Doesn’t this strike you as odd, despite the Mercedes logo still being displayed on our landing page under “YOU’RE IN GOOD COMPANY”?

View attachment 71814

View attachment 71815



And how about the other points I raised in previous posts, such as

  • the word potential in “positioning for a potential project collaboration with Mercedes” showing up in a 2023 BrainChip summer intern’s CV, which - as I already argued in January - suggested to me that Mercedes must have been weighing their options and evaluating more than one neuromorphic processor last year? (Well, I feel vindicated, since they certainly were, as evidenced by MB’s recent announcement of research collaborations with Intel and other consortium members of the NAOMI4Radar project (based on Loihi 2) on the one hand, and with the University of Waterloo on the other, where the research will be led by Chris Eliasmith, co-founder and CTO of Applied Brain Research - a company that recently released the TSP1, “a single-chip solution for time series inference applications like real-time speech recognition (including keyword spotting), realistic text-to-speech synthesis, natural language control interfaces and other advanced sensor fusion applications.”) https://www.appliedbrainresearch.co...lution-for-full-vocabulary-speech-recognition

  • the June 2024 MB job listing for a “Working student position in the field of Machine Learning & Neuromorphic Computing from August 2024” that mentioned “working with new chip technologies” and “deployment on neuromorphic chips”?

  • the below comment by Magnus Östberg (“We are looking at all suitable solutions!”) after a BRN shareholder had expressed his hope that MB would be implementing BrainChip technology into their vehicles soon?
View attachment 71816


  • the fact that we are not the only neuromorphic tech company that has the Mercedes-Benz logo prominently displayed on their website or on public presentation slides?

View attachment 71817

View attachment 71818


  • the fact that earlier this year Innatera’s CEO Sumeet Kumar got likes on LinkedIn from two of the MB neuromorphic engineers - Gerrit Ecke and Alexander Janisch - after suggesting MB should also talk to Innatera regarding neuromorphic computing?
View attachment 71824

  • the reference to NMC (neuromorphic computing) being considered a “möglicher Lösungsweg” (possible/potential solution) in the recent presentation at Hochschule Karlsruhe by MB engineer Dominik Blum
    (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352) and the table with several competing neuromorphic hardware offerings in one of his presentation slides titled “Neuromorphic computing is a young field of research … with a lot of open questions, e.g.:”
View attachment 71822

And then there is also a recent Master’s thesis sort of connected to MB’s neuromorphic research (more on that later, as posts have a limit of 10 upload images) that strengthens my belief that MB are still weighing their options…


Lots of points raised that cannot simply be glossed over and that suggest to me Mercedes-Benz is nowhere near implementing neuromorphic technology at scale in series-production cars.

Interested to hear your or anyone else’s thoughts on those points.

Excellent research!... all I would say is: yes, of course Mercedes-Benz are testing every available ground-breaking technology, and yes, we are still engaged, based on the fact I was told they were a client... if we aren't currently engaged with Mercedes, well then, the company needs to be held to account for misrepresentation on our home page... Please email the sales team in the US to clarify the current position.

Speculation can be silenced very quickly.

Regards...Tech.
 
  • Like
  • Fire
Reactions: 10 users
Excellent research!... all I would say is: yes, of course Mercedes-Benz are testing every available ground-breaking technology, and yes, we are still engaged, based on the fact I was told they were a client... if we aren't currently engaged with Mercedes, well then, the company needs to be held to account for misrepresentation on our home page... Please email the sales team in the US to clarify the current position.

Speculation can be silenced very quickly.

Regards...Tech.
Hmm 🤔...

whoa-deja-vu-matrix-glitch.gif
 
  • Haha
  • Like
  • Wow
Reactions: 6 users

Frangipani

Top 20
Hi Frangipani

I believe BRN has just remapped the website

View attachment 71852

Hi miaeffect,

thanks, but yours is a completely different BrainChip webpage, which has been around since 25 July 2022:

7C187623-C8FB-4A25-8202-9111C871BE1B.jpeg



July 25, 2022

Designing Smarter And Safer Cars With Essential AI​


Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. As many semiconductor companies have already realized, latency and power are two primary issues that must be effectively addressed before the automotive industry can manufacture a new generation of smarter and safer cars. To meet consumer expectations, these vehicles need […]

Categories: Blog

By Admin​


Share​


Screen-Shot-2022-07-18-at-10.48.15-AM.png


Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. As many semiconductor companies have already realized, latency and power are two primary issues that must be effectively addressed before the automotive industry can manufacture a new generation of smarter and safer cars. To meet consumer expectations, these vehicles need to feature highly personalized and responsive in-cabin systems while supporting advanced assisted driving capabilities.

That’s why automotive companies are untethering edge AI functions from the cloud – and performing distributed inference computation on local neuromorphic silicon using BrainChip’s AKIDA. This Essential AI model sequentially leverages multiple AKIDA-powered smart sensors and AKIDA AI SoC accelerators to efficiently capture and analyze inference data within designated regions of interest or ROI (samples within a data set that provide value for inference).
brainchip-diagram1-300x118.png

In-Cabin Experience
According to McKinsey analysts, the in-cabin experience is poised to become one of the most important differentiators for new car buyers. With AKIDA, automotive companies are designing lighter, faster, and more energy efficient in-cabin systems that can act independent of the cloud. These include advanced facial detection and customization systems that automatically adjust seats, mirrors, infotainment settings, and interior temperatures to match driver preferences.

AKIDA also enables sophisticated voice control technology that instantly responds to commands – as well as gaze estimation and emotion classification systems that proactively prompt drivers to focus on the road. Indeed, the Mercedes-Benz Vision EQXX features AKIDA-powered neuromorphic AI voice control technology which is five to ten times more efficient than conventional systems.

Neuromorphic silicon – which processes data with efficiency, precision, and economy of energy – is playing a major role in transforming vehicles into transportation capsules with personalized features and applications to accommodate both work and entertainment. This evolution is driven by smart sensors capturing yottabytes of data from the automotive edge to create holistic and immersive in-cabin experiences.

Assisted Driving – Computer Vision
In addition to redefining the in-cabin experience, AKIDA allows advanced driver assistance systems (ADAS) such as computer vision to detect vehicles, pedestrians, bicyclists, signs, and objects with incredibly high levels of precision.
Specifically, pairing two-stage object detection inference algorithms with local AKIDA silicon enables computer vision systems to efficiently perform processing in two primary stages – at the sensor (inference) and AI accelerator (classification). Using this paradigm, computer vision systems rapidly and efficiently analyze vast amounts of inference data within specific ROIs.
Intelligently refining inference data eliminates the need for compute heavy hardware such as general-purpose CPUs and GPUs that draw considerable amounts of power and increase the size and weight of computer vision systems. It also allows ADAS to generate incredibly detailed real-time 3D maps that help drivers safely navigate busy roads and highways.
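The ROI-limited, two-stage flow described above can be sketched in a few lines of Python. This is purely illustrative structure - the proposer and classifier callables stand in for the sensor-side and accelerator-side models, and none of the names below come from BrainChip's actual SDK:

```python
from typing import Callable, List, Tuple

# Hypothetical types for illustration only.
BBox = Tuple[int, int, int, int]  # x, y, width, height

def two_stage_detect(
    frame,                                            # full sensor frame
    propose_rois: Callable[[object], List[BBox]],     # stage 1: cheap ROI proposal at the sensor
    classify_roi: Callable[[object, BBox], str],      # stage 2: classification on the accelerator
) -> List[Tuple[BBox, str]]:
    """Run the expensive classifier only on proposed regions of interest,
    rather than over every pixel of the frame."""
    detections = []
    for roi in propose_rois(frame):        # inference at the sensor
        label = classify_roi(frame, roi)   # classification on the AI accelerator
        detections.append((roi, label))
    return detections
```

The point of the split is that the heavy model never sees the full frame, which is what removes the need for a general-purpose GPU in the loop.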

Assisted Driving – LiDAR
Automotive companies are also leveraging a sequential computation model with AKIDA-powered smart sensors and AI accelerators to enable the design of new LiDAR systems. This is because most LiDAR systems typically rely on general-purpose GPUs – and run cloud-centric, compute-heavy inference models that demand a large carbon footprint to process enormous amounts of data.
Indeed, LiDAR sensors typically fire 8 to 108 laser beams in a series of cyclical pulses, each emitting billions of photons per second. These beams bounce off objects and are analyzed to identify and classify vehicles, pedestrians, animals, and street signs. With AKIDA, LiDAR processes millions of data points simultaneously, using only minimal amounts of compute power to accurately detect – and classify – moving and stationary objects with equal levels of precision.

Limiting inference to a ROI helps automotive manufacturers eliminate compute and energy heavy hardware such as general-purpose CPUs and GPUs in LiDAR systems – and accelerate the rollout of advanced assisted driving capabilities.

Essential AI
Scalable AKIDA-powered smart sensors bring common sense to the processing of automotive data – freeing ADAS and in-cabin systems to do more with less by allowing them to infer the big picture from the basics. With AKIDA, automotive manufacturers are designing sophisticated edge AI systems that deliver immersive end-user experiences, support the ever-increasing data and compute requirements of assisted driving capabilities, and enable the self-driving cars and trucks of the future.

To learn more about how BrainChip brings Essential AI to the next generation of smarter cars, download our new white paper here.




I was actually referring to this BrainChip webpage:


which until at least mid-July still looked like this under “Automotive” (there are also other categories, namely Home/Industrial/Health and Wellness, but it is the Auto/Automotive one you’ll see first):

773D2460-B987-486B-BA71-2BF864565A34.jpeg





What I framed in red is now gone from that page…
 
Last edited:
  • Wow
Reactions: 1 users

sb182

Member

I’m just back to get some ammunition and holy water…. And something else…. If you see Yoda, tell him he should move his ass!

View attachment 71638
😔
Hopefully they clean the ears as well

View attachment 71642

The point is that he might not be up to date on the latest developments - neither regarding BrainChip nor Intel, because he no longer works there. I can easily imagine that internally at Intel he was already acting like a know-it-all. He’s just an Intel Loihi fanboy who wants to talk down BrainChip. Additionally, maybe he’s a shorter? Who knows. His profile says “investor” - maybe he is invested in another company and openly trying to down-ramp? DH

Look……I don’t think they’ve got a smarter solution than our nice beautiful brains, okay? Huuuuuge difference. But………., never mind, to each their own …right? That’s true …We’re not living in a planned economy..We’re not living in little Rocket Man’s land!
BUT EVERYONE, BELIEVE ME, EVERYONE WILL WANT A PIECE OF OUR BRAIN… they’ll come… because it’s the only real brain… beautiful brain! AND OUR BRAIN WILL GET BIGGER AND STRONGER THAN ANY OTHER BRAIN… remember my words… it’s true!

View attachment 71696
Dog Dont Look At Me GIF by michaelmarczewski
 
  • Haha
  • Like
Reactions: 5 users

Slade

Top 20
  • Haha
Reactions: 4 users

Frangipani

Top 20
Excellent research!... all I would say is: yes, of course Mercedes-Benz are testing every available ground-breaking technology, and yes, we are still engaged, based on the fact I was told they were a client... if we aren't currently engaged with Mercedes, well then, the company needs to be held to account for misrepresentation on our home page... Please email the sales team in the US to clarify the current position.

Speculation can be silenced very quickly.

Regards...Tech.

Hi Tech,

what’s the point of emailing the sales team in the US? They will only be able to tell us that they can’t comment on the progress of that relationship as it is subject to an NDA. Unless MB had unequivocally told them “Sorry, you’re out…”, they may not even know themselves where exactly they stand with them.

The points I raised suggest to me that Mercedes-Benz are still weighing their options. This doesn’t necessarily mean we are no longer engaged with them. Yet, it is an indisputable fact we are not the only ones in the neuromorphic space presently engaged with them.

What we can say for sure, though, is that not all of those engagements will ultimately translate into the signing of an IP license.
And so we wait…

Regards
Frangipani
 
  • Like
  • Fire
Reactions: 12 users

Dallas

Regular
  • Like
  • Love
  • Fire
Reactions: 22 users

FiveBucks

Regular
And another week goes by without a contract announcement.

2 months left for Sean to live up to his AGM statement of having deals done by the end of the year.

Tick, tock.
 
  • Like
  • Sad
  • Thinking
Reactions: 14 users

Frangipani

Top 20
A video going along with that paper was uploaded to YouTube yesterday:



Both paper and video relate to another paper and video published by the same Uni Tübingen authors earlier this year. At a cursory glance, at least the videos (posted about six months apart) appear to be VERY similar:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416900


View attachment 70372
View attachment 70373


Now compare the slides to those in the video uploaded October 3:

View attachment 70368


View attachment 70369

View attachment 70370

In fact, when I just tried to cursorily compare the new paper to the March 15 paper that @Fullmoonfever had linked at the time (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416313), I discovered that the link he had posted then now connects directly to this new paper, published on September 16, so it seems to be an updated version of the previous paper.

I did notice the addition of another co-author, though: Sebastian Otte, who used to be a PhD student and postdoc at Uni Tübingen (2013-2023) and became Professor at Uni Lübeck’s Institute for Robotics and Cognitive Systems just over a year ago, where he heads the Adaptive AI research group.

0d00f748-f1ff-44f9-be7c-849d5e0b8583-jpeg.70378



To put into perspective the finding that our competitors’ neuromorphic offerings fared worse than Akida in the benchmarking tests:
In all fairness, it should be highlighted that Akida’s superiority was at least partly due to the fact that AKD1000 is available as a PCIe board, whereas SynSense’s DynapCNN was connected to the PC via USB and - as the excerpt Gazzafish already posted shows - the researchers did not have direct access to a Loihi 2 edge device, but only to a virtual machine provided by Intel via their Neuromorphic Research Cloud. The benchmarking would obviously yield more comparable results if the actual hardware used were of a similar form factor:

“Our results show that the better a neuromorphic edge device is connected to the main compute unit, e.g., as a PCIe card, the better the overall run-time.”
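As a rough back-of-the-envelope illustration of why the host link matters (the payload size and bandwidth figures below are my own assumptions, not numbers from the paper):

```python
def transfer_ms(payload_bytes: int, bandwidth_bytes_per_s: float) -> float:
    """Time to move one inference payload over the host link, in milliseconds."""
    return payload_bytes / bandwidth_bytes_per_s * 1000.0

PAYLOAD = 100_000     # assume ~100 kB of event/feature data per inference
PCIE_2_X1 = 400e6     # assume ~400 MB/s usable throughput on a single PCIe 2.0 lane
USB_2 = 35e6          # assume ~35 MB/s usable throughput on USB 2.0

# With these assumptions, the identical payload spends more than ten times
# longer in transit over USB 2.0 than over one PCIe 2.0 lane, before any
# neuromorphic compute even starts.
```

So a device hanging off USB pays a fixed I/O tax on every inference that a PCIe card largely avoids, which is exactly the run-time effect the researchers describe.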


Anyway, Akida undoubtedly impressed the researchers, and as a result they are considering further experiments: “(…) future work could involve evaluating the system with an additional Akida PCIe card.”


View attachment 70374


In an earlier post (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-426404), I had already mentioned that the paper’s first author, Andreas Ziegler, who is doing a PhD in robotics and computer vision at Uni Tübingen, has meanwhile completed his internship at Sony AI in Switzerland (that - as we know - partially funded the paper’s research):

View attachment 70375


Fun fact: One of his co-authors, Karl Vetter, is no longer with Uni Tübingen’s Cognitive Systems Lab, but has since moved to France, where he has been working as a research engineer for…

🥁 🥁 🥁 Neurobus for the past three months!
It’s a small world, isn’t it?! 😉

View attachment 70376
View attachment 70377


Three days ago, first author Andreas Ziegler gave a talk on the recent table tennis robot research conducted at Uni Tübingen 👆🏻 during the Neuromorphic Vision Hackathon at ZHAW (Zürcher Hochschule für Angewandte Wissenschaften / Zurich University of Applied Sciences), where robotics and neuromorphic computing expert Yulia Sandamirskaya (ex Intel Labs) heads the Research Centre “Cognitive Computing in Life Sciences” at ZHAW’s Wädenswil campus.

56D0AC50-F2AB-4F1E-9676-49F80A3BB562.jpeg



While the content of his presentation is not new for those of you who already read the paper or saw the video, I thought the way he presented it was quite cool, with all the embedded videos! Have a look yourselves:


Anyway, more exposure for Akida and those favourable benchmarking results (even though it is unclear how much influence the hardware’s form factor had, see my post above).



Here are some of the presentation slides:

D97A42A0-D3DF-43BF-AF67-702C59211B0B.jpeg


79EE8ADC-0203-489E-909B-9E1F7F644F51.jpeg

8288B92E-4FC0-4AD9-AD66-931543D93CDA.jpeg



A5E76D5C-17FF-4C68-A1DD-D2335E8D73BA.jpeg


21A7DD72-185F-4EE5-8F74-BFA1E8E92ABB.jpeg


C2D2DF1C-4131-4D19-B448-EF6F0F244563.jpeg


E308335D-3A07-44BE-BB8C-0224EEE8E101.jpeg


In Andreas Ziegler’s updated CV (https://andreasaziegler.github.io/), we can now see who his supervisors were during his internship at Sony AI (that funded this research): Raphaela Kreiser and Nagoya Takahashi:

0DF9B55B-E44D-4587-BAED-8B9F65678FEA.jpeg



B01F373E-1189-4592-B100-7A09796B48C3.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 23 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Screenshot 2024-10-26 at 8.56.51 am.png
 
  • Like
  • Love
  • Thinking
Reactions: 31 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Love
  • Thinking
Reactions: 26 users

manny100

Top 20
Probably been posted before. Explains PICO quite well.
Those that can 'speak and understand tech' can correct me, but it appears that for mobile phones and wearables ARM is just a popular choice rather than essential.
RISC-V is a less common alternative, but being open source it is free. On the other hand, ARM is optimized for mobiles etc.
How would PICO/AKIDA work with RISC-V?
Is Qualcomm talking RISC-V purely as a threat to ARM? Or would they really contemplate a RISC-V/PICO combo?
 
  • Like
  • Love
  • Thinking
Reactions: 15 users

manny100

Top 20
Sort of explains the ARM/Qualcomm dispute pretty well.
There has been a fair bit of media chat about Qualcomm using RISC-V of late.
A combination of RISC-V and PICO could prove to be a powerful and cost-effective solution. ARM, however, does have reliable and strong ecosystem support already in place.
Will we see a move away from ARM for mobiles and Wearables?
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 15 users

Guzzi62

Regular
AI is already in phones:

The AI that's already in your phone​

We’re slowly getting used to Artificial Intelligence doing uncannily human things - chatting with us, creating pictures and videos. But so far, all of this AI has used a lot of computing power.
In the last year or so, we’ve seen a new type of computer chip made specifically for AI and your mobile phone. Tech reporter Spencer Kelly has been testing some of the latest AI features available to us.

 
  • Like
  • Fire
  • Wow
Reactions: 6 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
In the past we have partnered with the likes of MagikEye on 3D object detection. I wonder if this announcement below involving our partner, Tata Elxsi, will open up a pathway to working with Foresight Autonomous Holdings on their 3D perception solutions?




Foresight and Tata Elxsi Sign Collaboration Agreement​

October 25, 2024 08:21 ET | Source: Foresight Autonomous Holdings Ltd.


  • The parties will collaborate to accelerate development of solutions for semi-autonomous and autonomous vehicles using Foresight’s stereoscopic technology and Tata Elxsi’s integration solutions for marketing in the Indian automotive industry
Ness Ziona, Israel, Oct. 25, 2024 (GLOBE NEWSWIRE) -- Foresight Autonomous Holdings Ltd. (Nasdaq and TASE: FRSX) (“Foresight” or the “Company”), an innovator in automotive vision systems, announced today the signing of a multi-phase collaboration agreement with Tata Elxsi Limited (“Tata Elxsi”), a leading global tier-one supplier of design and technology services, providing solutions across various industries, including the automotive, broadcast, communications, healthcare, and transportation industries.
The initial phase will include the development and commercialization of advanced solutions for advanced driver assistance systems (ADAS). These will be integrated into passenger vehicles, heavy machinery and agricultural vehicles manufactured by Tata Motors. Building on the success of the ADAS implementations, the parties will continue to develop and commercialize advanced services for semi and fully autonomous features to be integrated into various applications within the automotive industry.
Tata Elxsi will introduce and promote Foresight’s 3D perception solutions to its diverse customer base, starting with the Indian automotive industry, and subsequently targeting global automotive vehicle manufacturers. Furthermore, during the first half of 2025, Tata Elxsi plans to promote Foresight’s solutions to its existing customers in the heavy machinery and agriculture sectors.
Foresight’s 3D perception solutions are based on stereoscopic technology, using both visible-light and thermal infrared cameras, and proprietary algorithms to detect all objects and create high resolution 3D point clouds.
“We are excited to collaborate with Tata Elxsi to bring our advanced 3D perception technology to the Indian automotive industry. We believe that this collaboration will help us expand our footprint in the emerging Indian market, including in autonomous passenger vehicles, heavy machinery and agricultural equipment, thereby leading to safer and more efficient transportation options across India,” said Oren Bar-On, Chief Executive Officer of Foresight Asia.

 
  • Like
  • Love
  • Fire
Reactions: 20 users
And another week goes by without a contract announcement.

2 months left for Sean to live up to his AGM statement of having deals done by the end of the year.

Tick, tock.
Maybe he wasn’t talking about deals but deals

1729921998369.gif
 
  • Haha
  • Like
Reactions: 9 users

BrainShit

Regular
Hi @Bravo,

the statement by Mercedes-Benz that Sally Ward-Foxton quoted in said September 2022 article was plucked straight from the 3 January 2022 VISION EQXX press release (https://media.mbusa.com/releases/re...range-and-efficiency-to-an-entirely-new-level), so it is now almost three years old.

View attachment 71758


However, that press release is nowhere to be found on the official webpage dedicated to the VISION EQXX (https://group.mercedes-benz.com/innovation/product-innovation/technology/vision-eqxx.html) - surprisingly, there is no reference to neuromorphic computing at all on that page.
This equally holds true for the German version of the VISION EQXX webpage: https://group.mercedes-benz.com/innovation/produktinnovation/technologie/vision-eqxx.html

In fact, there hasn’t been any reference whatsoever to neuromorphic computing on that webpage since April 4, 2022, as I was able to establish thanks to the Wayback Machine (https://web.archive.org - a cool internet archive that I stumbled upon the other day, which allows you to “Explore more than 916 billion web pages saved over time”): I can go back to the German version of that webpage and see that on 1 April 2022, MB still mentioned “Elemente der Benutzeroberfläche unterstützen die nahtlose Interaktion zwischen Fahrer und Fahrzeug. Unter anderem durch Künstliche Intelligenz (KI), die die Funktionsweise des menschlichen Gehirns nachahmt.” (“Elements of the user interface support seamless interaction between driver and vehicle. This includes Artificial Intelligence (AI) which mimics the way the human brain works.”) The webpage’s content was soon after replaced with a new text dated 4 April 2022 that no longer referred to brain-inspired/neuromorphic AI whatsoever. It has since been updated with links to articles about the VISION EQXX’s second and third long distant road trips over 1000 km in June 2022 (Stuttgart to Silverstone) and March 2024 (Riyadh to Dubai).


It is anyone’s guess why MB decided to no longer mention that the keyword spotting in their VISION EQXX concept car had been exceptionally energy-efficient due to it having been implemented on a neuromorphic chip (let alone on which one specifically), although they obviously continue to take great interest in this disruptive tech.

Did they possibly come to realise that it would take much longer to implement neuromorphic technology at scale than originally envisioned? Either from a technical perspective and/or from a legal one (automotive grade ISO certification etc)?

Did they at the time possibly not foresee the growing number of competitors in the neuromorphic space besides BrainChip and Intel that could equally be of interest to them and which they would now first like to explore in depth before making any far-reaching decisions?

I also happened to notice that the reference to the VISION EQXX on https://brainchip.com/markets has been deleted. Thanks to the Wayback Machine, we can tell that this must have happened sometime between mid-July and August 25.
The question is: Why was that reference (consisting of a picture of the MB concept car that we know utilised Akida as well as the relevant press release excerpt) taken down from the BrainChip website? 🤔

Doesn’t this strike you as odd, despite the Mercedes logo still being displayed on our landing page under “YOU’RE IN GOOD COMPANY”?

View attachment 71814

View attachment 71815



And how about the other points I raised in previous posts, such as

  • the word potential in “positioning for a potential project collaboration with Mercedes” showing up in a 2023 BrainChip summer intern’s CV, which - as I already argued in January - suggested to me that Mercedes must have been weighing their options and were evaluating more than one neuromorphic processor last year? (Well, I feel vindicated, since they certainly were, as evidenced by MB’s recent announcement regarding research collaborations with both Intel and other consortium members of the NAOMI4Radar project (based on Loihi 2) on the one hand and with the University of Waterloo on the other hand, where the research will be led by Chris Eliasmith, who is co-founder and CTO of Applied Brain Research, a company that recently released their TSP1, which is “a single-chip solution for time series inference applications like real-time speech recognition (including keyword spotting), realistic text-to-speech synthesis, natural language control interfaces and other advanced sensor fusion applications.”) https://www.appliedbrainresearch.co...lution-for-full-vocabulary-speech-recognition

  • the June 2024 MB job listing for a “Working student position in the field of Machine Learning & Neuromorphic Computing from August 2024” that mentioned “working with new chip technologies” and “deployment on neuromorphic chips”?

  • the below comment by Magnus Östberg (“We are looking at all suitable solutions!”) after a BRN shareholder had expressed his hope that MB would be implementing BrainChip technology into their vehicles soon?
  • View attachment 71816


  • the fact that we are not the only neuromorphic tech company that has the Mercedes-Benz logo prominently displayed on their website or on public presentation slides?

View attachment 71817

View attachment 71818


  • the fact that earlier this year Innatera’s CEO Sumeet Kumar got likes on LinkedIn from two of the MB neuromorphic engineers - Gerrit Ecke and Alexander Janisch - after suggesting MB should also talk to Innatera regarding neuromorphic computing?
View attachment 71824

  • the reference to NMC (neuromorphic computing) being considered a “möglicher Lösungsweg” (possible/potential solution) in the recent presentation at Hochschule Karlsruhe by MB engineer Dominik Blum (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352), and the table with several competing neuromorphic hardware offerings in one of his presentation slides, titled “Neuromorphic computing is a young field of research … with a lot of open questions, e.g.:”?
View attachment 71822

And then there is also a recent Master’s thesis sort of connected to MB’s neuromorphic research (more on that later, as posts have a limit of 10 upload images) that strengthens my belief that MB are still weighing their options…


Lots of points raised that cannot simply be glossed over and that suggest to me Mercedes-Benz is nowhere near implementing neuromorphic technology at scale in series-production cars.

Interested to hear your or anyone else’s thoughts on those points.


Did they possibly come to realise that it would take much longer to implement neuromorphic technology at scale than originally envisioned - whether from a technical perspective, a legal one (automotive-grade ISO certification etc.), or both?

Did they at the time perhaps not foresee the growing number of competitors in the neuromorphic space besides BrainChip and Intel - competitors that could equally be of interest to them and that they would now like to explore in depth before making any far-reaching decisions?

I also happened to notice that the reference to the VISION EQXX on https://brainchip.com/markets has been deleted. Thanks to the Wayback Machine, we can tell that this must have happened sometime between mid-July and August 25.
The question is: Why was that reference (consisting of a picture of the MB concept car that we know utilised Akida as well as the relevant press release excerpt) taken down from the BrainChip website? 🤔
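(For anyone who wants to replay this kind of Wayback archaeology themselves: the Internet Archive exposes a documented availability API that returns the snapshot closest to a given date. The sketch below - standard-library Python only, my own illustration rather than anything official from either company - just builds the query URL; fetching it returns JSON pointing at the nearest archived copy, if one exists.)

```python
from urllib.parse import urlencode

# Documented Wayback Machine availability endpoint:
# https://archive.org/wayback/available?url=<page>&timestamp=<YYYYMMDD>
WAYBACK_API = "https://archive.org/wayback/available"

def wayback_query_url(page_url: str, timestamp: str) -> str:
    """Build a query for the archived snapshot of page_url closest
    to the given YYYYMMDD timestamp."""
    return WAYBACK_API + "?" + urlencode({"url": page_url, "timestamp": timestamp})

# Example: the German VISION EQXX page as it stood around 1 April 2022
print(wayback_query_url(
    "https://group.mercedes-benz.com/innovation/produktinnovation/"
    "technologie/vision-eqxx.html",
    "20220401",
))
```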

Doesn’t this strike you as odd, despite the Mercedes logo still being displayed on our landing page under “YOU’RE IN GOOD COMPANY”?

View attachment 71814

View attachment 71815



And how about the other points I raised in previous posts, listed above?

Fully agreed... Mercedes-Benz is nowhere near implementing neuromorphic technology at scale in series-production cars.
 

BrainShit

Regular
Probably been posted before. Explains PICO quite well.
Those that can 'speak and understand tech' can correct me, but it appears that for mobile phones and wearables ARM is just a popular choice rather than essential.
RISC-V is a less common alternative, but being open source it is free. ARM, on the other hand, is highly optimised for mobiles etc.
How would PICO/AKIDA work with RISC-V?
Is Qualcomm talking up RISC-V purely as a threat to ARM? Or would they really contemplate a RISC-V/PICO combo?

Sort of explains the ARM/Qualcomm dispute pretty well.
There has been a fair bit of media chat about Qualcomm using RISC-V of late.
A combination of RISC-V and PICO could prove to be a powerful and cost-effective solution. ARM, however, already has a reliable and strong support ecosystem in place.
Will we see a move away from ARM for mobiles and wearables?

I don't think Qualcomm's customers would welcome a technology/architecture change that easily. All products would have to be re-developed, adapted, implemented and retested*... We're at the same point with Akida - guess why no product is out there and we have only research/evaluation projects? Except the Beacon from ANT61 and the Cupcake from Unigen... but those are just single, small-unit solutions.

If Qualcomm were able to drop ARM and switch to RISC-V, that would be a great enabler for BrainChip. But I guarantee Qualcomm will not do that within the next few years.

*RISC-V and ARM use different Instruction Set Architectures (ISAs).
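To make that footnote concrete (a sketch of my own, nothing vendor-specific): a compiled binary targets exactly one ISA, so a binary built for ARM cannot run natively on RISC-V or vice versa - which is precisely why switching ISAs forces the rebuild-and-retest cycle described above. Even the Python interpreter itself is such a binary, and the standard library will tell you which ISA it was built for:

```python
import platform

# ARM machines typically report 'aarch64' or 'arm64'; RISC-V machines
# report 'riscv64'; most desktops/servers report 'x86_64'. Whatever
# the value, binaries built for one of these will not run natively
# on the others.
host_isa = platform.machine()
print(f"this interpreter was built for: {host_isa}")
```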
 

manny100

Top 20
May well be, but then Qualcomm will need to cave in and bow to ARM's demands. It will be interesting to see how this pans out.
Is Qualcomm bluffing? RISC-V is free, and ARM is trying to screw Qualcomm.
See the link to the article outlining Qualcomm's plans, titled

"Qualcomm VP discusses its 'next' chip for Wear OS watches"

"Qualcomm and Google are "working on it" right now, and I'm assuming 2025 is the target to have RISC-V software optimized."
 