BRN Discussion Ongoing

JB49

Regular
Is this Akida spot or old spot being put to work?

 
  • Thinking
  • Fire
Reactions: 3 users

FJ-215

Regular
  • Thinking
Reactions: 1 users

Frangipani

Regular
Hi @Frangipani,

You mention you don’t expect neuromorphic technology in any MB serial production cars in the near future.

If that's the case, I'm just curious as to what you make of Mercedes' statement quoted in Sally Ward-Foxton's article "Cars That Think Like You"? I thought the quote indicated it would not be all that far off into the future, hence the use of the word "just" in the expression "just a few years".

I'd have thought that, given the article was published on 22 September 2022, we are now getting closer to the point where "just a few years" will have expired. In addition, it doesn't seem all that inconceivable, at least in my opinion, for MB to utilise BrainChip's technology in the nearer term, particularly given TENNs and the recent launch of Pico.

Interested to hear your thoughts.


View attachment 71358




Hi @Bravo,

the statement by Mercedes-Benz that Sally Ward-Foxton quoted in said September 2022 article was plucked straight from the 3 January 2022 VISION EQXX press release (https://media.mbusa.com/releases/re...range-and-efficiency-to-an-entirely-new-level), so it is now almost three years old.

FFB2D6F1-BC2C-4D46-B2A1-00E011555C36.jpeg



However, that press release is nowhere to be found on the official webpage dedicated to the VISION EQXX (https://group.mercedes-benz.com/innovation/product-innovation/technology/vision-eqxx.html) - surprisingly, there is no reference to neuromorphic computing at all on that page.
This equally holds true for the German version of the VISION EQXX webpage: https://group.mercedes-benz.com/innovation/produktinnovation/technologie/vision-eqxx.html

In fact, there hasn’t been any reference whatsoever to neuromorphic computing on that webpage since 4 April 2022, as I was able to establish thanks to the Wayback Machine (https://web.archive.org - a cool internet archive that I stumbled upon the other day, which allows you to “Explore more than 916 billion web pages saved over time”).

Going back to the German version of that webpage, I can see that on 1 April 2022, MB still mentioned: “Elemente der Benutzeroberfläche unterstützen die nahtlose Interaktion zwischen Fahrer und Fahrzeug. Unter anderem durch Künstliche Intelligenz (KI), die die Funktionsweise des menschlichen Gehirns nachahmt.” (“Elements of the user interface support seamless interaction between driver and vehicle. This includes Artificial Intelligence (AI) which mimics the way the human brain works.”)

The webpage’s content was soon after replaced with a new text dated 4 April 2022 that no longer referred to brain-inspired/neuromorphic AI whatsoever. It has since been updated with links to articles about the VISION EQXX’s second and third long-distance road trips of over 1,000 km in June 2022 (Stuttgart to Silverstone) and March 2024 (Riyadh to Dubai).
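
By the way, for anyone who wants to repeat this kind of check for other pages and dates: the Wayback Machine also has a simple public “availability” API. Here is a minimal Python sketch of how I understand it works - the endpoint and JSON shape are as per archive.org’s documentation, and the `closest_snapshot` helper is just something I put together, so treat it as a rough illustration rather than anything official:

```python
# Minimal sketch: ask the Wayback Machine for the snapshot of a page
# closest to a given date. Uses only the Python standard library.
import json
import urllib.parse
import urllib.request

def closest_snapshot(url: str, timestamp: str) -> dict:
    """Return the snapshot closest to `timestamp` (YYYYMMDD), or {} if none exists."""
    query = urllib.parse.urlencode({"url": url, "timestamp": timestamp})
    api = f"https://archive.org/wayback/available?{query}"
    with urllib.request.urlopen(api, timeout=30) as resp:
        data = json.load(resp)
    return data.get("archived_snapshots", {}).get("closest", {})

if __name__ == "__main__":
    page = "https://group.mercedes-benz.com/innovation/produktinnovation/technologie/vision-eqxx.html"
    # Compare what was archived just before and just after 4 April 2022.
    for day in ("20220401", "20220404"):
        snap = closest_snapshot(page, day)
        print(day, "->", snap.get("timestamp"), snap.get("url"))
```

That only tells you which snapshots exist closest to a given date - you still have to open them and compare the content yourself, which is what I did above.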


It is anyone’s guess why MB decided to no longer mention that the keyword spotting in their VISION EQXX concept car had been exceptionally energy-efficient because it had been implemented on a neuromorphic chip (let alone which one specifically), although they obviously continue to take great interest in this disruptive tech.

Did they possibly come to realise that it would take much longer to implement neuromorphic technology at scale than originally envisioned - whether from a technical perspective or a legal one (automotive-grade ISO certification etc.)?

Or did they at the time not foresee the growing number of competitors in the neuromorphic space besides BrainChip and Intel that could equally be of interest to them, and which they would now like to explore in depth before making any far-reaching decisions?

I also happened to notice that the reference to the VISION EQXX on https://brainchip.com/markets has been deleted. Thanks to the Wayback Machine, we can tell that this must have happened sometime between mid-July and August 25.
The question is: Why was that reference (consisting of a picture of the MB concept car that we know utilised Akida as well as the relevant press release excerpt) taken down from the BrainChip website? 🤔

Doesn’t this strike you as odd, despite the Mercedes logo still being displayed on our landing page under “YOU’RE IN GOOD COMPANY”?

9DA65F9E-88C5-48A5-B13C-B1F72D6ECCF1.jpeg


EA7F957A-B286-4CF7-A382-F88BA8679CF3.jpeg




And how about the other points I raised in previous posts, such as

  • the word potential in “positioning for a potential project collaboration with Mercedes” showing up in a 2023 BrainChip summer intern’s CV, which - as I already argued in January - suggested to me that Mercedes must have been weighing their options and were evaluating more than one neuromorphic processor last year? (Well, I feel vindicated, since they certainly were, as evidenced by MB’s recent announcement regarding research collaborations with both Intel and other consortium members of the NAOMI4Radar project (based on Loihi 2) on the one hand and with the University of Waterloo on the other hand, where the research will be led by Chris Eliasmith, who is co-founder and CTO of Applied Brain Research, a company that recently released their TSP1, which is “a single-chip solution for time series inference applications like real-time speech recognition (including keyword spotting), realistic text-to-speech synthesis, natural language control interfaces and other advanced sensor fusion applications.”) https://www.appliedbrainresearch.co...lution-for-full-vocabulary-speech-recognition

  • the June 2024 MB job listing for a “Working student position in the field of Machine Learning & Neuromorphic Computing from August 2024” that mentioned “working with new chip technologies” and “deployment on neuromorphic chips”?

  • the below comment by Magnus Östberg (“We are looking at all suitable solutions!”) after a BRN shareholder had expressed his hope that MB would be implementing BrainChip technology into their vehicles soon?
  • DA0496A7-7BEC-4044-8625-4AC10D9ADBD1.jpeg


  • the fact that we are not the only neuromorphic tech company that has the Mercedes-Benz logo prominently displayed on their website or on public presentation slides?

E2DFF097-CAD3-4D7E-BCE9-63AF02E4A419.jpeg


58BB8966-0C51-4B81-9262-206652F05EF9.jpeg



  • the fact that earlier this year Innatera’s CEO Sumeet Kumar got likes on LinkedIn from two of the MB neuromorphic engineers - Gerrit Ecke and Alexander Janisch - after suggesting MB should also talk to Innatera regarding neuromorphic computing?
01BD1091-F575-4761-9D4B-D89CA3E6719F.jpeg


  • the reference to NMC (neuromorphic computing) being considered a “möglicher Lösungsweg” (possible/potential solution) in the recent presentation at Hochschule Karlsruhe by MB engineer Dominik Blum
    (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-439352) and the table with several competing neuromorphic hardware offerings in one of his presentation slides titled “Neuromorphic computing is a young field of research … with a lot of open questions, e.g.:”
A2BF2D01-F7A1-4197-BF1B-6E541402F3C4.jpeg


And then there is also a recent Master’s thesis sort of connected to MB’s neuromorphic research (more on that later, as posts have a limit of 10 upload images) that strengthens my belief that MB are still weighing their options…


Lots of points raised that cannot simply be glossed over and that suggest to me Mercedes-Benz is nowhere near implementing neuromorphic technology at scale in serial-production cars.

Interested to hear your or anyone else’s thoughts on those points.
 
  • Like
  • Thinking
  • Fire
Reactions: 22 users

Frangipani

Regular
Hi Krustor,

IMO that link - which originally connected to the MB 3 January 2022 press release titled “Vision EQXX - taking electric range and efficiency to an entirely new level”
(now to be found under https://media.mbusa.com/releases/re...range-and-efficiency-to-an-entirely-new-level) - simply links to the general Mercedes media/press release page, so it also happens to link to that ESG Conference 2024 livestream these days. It’s not a deliberate link by BrainChip to that particular event on March 20.

Schönen Abend noch (have a nice evening),
Frangipani

I have been well familiar with the BRN homepage for years. This link to a March 20 event is absolutely new and is not connected to the Jan. 2022 release only. It is indeed connected to the March 20, 2024 event.

I have absolutely no hope for a release or something of this kind because of it. I just find it interesting. Let's see. That's all.

Schönen Abend dir auch (a nice evening to you too)

No reason to discuss this with you: there was no previous link - neither to any Merc page directly nor to this countdown.

No matter how many lines you want to write trying to state the opposite: it is not to be believed just because you want it to be.

This is also not meant to address you personally, as we all know how this ends... greets to @Fact Finder

Coincidentally, my Wayback Machine research regarding the VISION EQXX also came up with proof that @Krustor wasn’t as familiar with the BrainChip website as he claimed back in March - turns out there had in fact been a link to MB on that BrainChip webpage for over a year prior to his rude reply to me, despite him insisting there hadn’t been any previous link:

0BE3E77D-4832-4AC7-BFBF-A9FA9F3AE288.jpeg




Let the facts speak for themselves…


808C57D9-30F4-4A7A-9099-FC24F4C3DDBB.jpeg
 
  • Like
  • Fire
Reactions: 7 users

Mt09

Regular
Coincidentally, my Wayback Machine research regarding the VISION EQXX also came up with proof that @Krustor wasn’t as familiar with the BrainChip website as he claimed back in March - turns out there had in fact been a link to MB on that BrainChip webpage for over a year prior to his rude reply to me, despite him insisting there hadn’t been any previous link:

View attachment 71847



Let the facts speak for themselves…


View attachment 71848
Wouldn’t lose sleep over it..
 
  • Like
Reactions: 5 users

IloveLamp

Top 20
1000019300.gif
 
  • Haha
  • Like
Reactions: 7 users

TECH

Regular
@Frangipani said: (full Mercedes-Benz post quoted above)

Excellent research!... All I would say is, yes, of course Mercedes-Benz are testing every available ground-breaking technology, and yes, we are still engaged
 
  • Like
Reactions: 13 users

jtardif999

Regular
Sean needs to address his shareholders, as this is just about enough for everyone.
It's been months since the AGM.
It was meant to happen before year's end, yet next week is November and nothing.
We deserve to be updated, Sean.

Is it just me, or do others feel this way?

I am just feeling disappointed atm.
Hopefully this will pass - I am usually more positive.
Sean are we there yet? Are we there yet Sean? Hey Sean, are we there yet?
 
  • Haha
  • Like
Reactions: 11 users

TECH

Regular
@Frangipani said: (full Mercedes-Benz post quoted above)

Excellent research!... All I would say is, yes, of course Mercedes-Benz are testing every available ground-breaking technology, and yes, we are still engaged, based on the fact I was told they were a client... If we aren't currently engaged with Mercedes, well then, the company needs to be held to account for misrepresentation on our home page... Please email the sales team in the US to clarify the current position.

Speculation can be silenced very quickly.

Regards...Tech.
 
  • Like
  • Fire
Reactions: 10 users
@TECH said: (reply quoted above)
Hmm 🤔...

whoa-deja-vu-matrix-glitch.gif
 
  • Haha
  • Like
  • Wow
Reactions: 6 users

Frangipani

Regular
Hi Frangipani

I believe BRN has just remapped the website

View attachment 71852

Hi miaeffect,

thanks, but yours is a completely different BrainChip webpage, which has been around since 25 July 2022:

7C187623-C8FB-4A25-8202-9111C871BE1B.jpeg



July 25, 2022

Designing Smarter And Safer Cars With Essential AI​


Categories: Blog
By Admin

Screen-Shot-2022-07-18-at-10.48.15-AM.png


Conventional AI silicon and cloud-centric inference models do not perform efficiently at the automotive edge. As many semiconductor companies have already realized, latency and power are two primary issues that must be effectively addressed before the automotive industry can manufacture a new generation of smarter and safer cars. To meet consumer expectations, these vehicles need to feature highly personalized and responsive in-cabin systems while supporting advanced assisted driving capabilities.

That’s why automotive companies are untethering edge AI functions from the cloud – and performing distributed inference computation on local neuromorphic silicon using BrainChip’s AKIDA. This Essential AI model sequentially leverages multiple AKIDA-powered smart sensors and AKIDA AI SoC accelerators to efficiently capture and analyze inference data within designated regions of interest or ROI (samples within a data set that provide value for inference).
brainchip-diagram1-300x118.png

In-Cabin Experience
According to McKinsey analysts, the in-cabin experience is poised to become one of the most important differentiators for new car buyers. With AKIDA, automotive companies are designing lighter, faster, and more energy efficient in-cabin systems that can act independent of the cloud. These include advanced facial detection and customization systems that automatically adjust seats, mirrors, infotainment settings, and interior temperatures to match driver preferences.

AKIDA also enables sophisticated voice control technology that instantly responds to commands – as well as gaze estimation and emotion classification systems that proactively prompt drivers to focus on the road. Indeed, the Mercedes-Benz Vision EQXX features AKIDA-powered neuromorphic AI voice control technology which is five to ten times more efficient than conventional systems.

Neuromorphic silicon – which processes data with efficiency, precision, and economy of energy – is playing a major role in transforming vehicles into transportation capsules with personalized features and applications to accommodate both work and entertainment. This evolution is driven by smart sensors capturing yottabytes of data from the automotive edge to create holistic and immersive in-cabin experiences.

Assisted Driving – Computer Vision
In addition to redefining the in-cabin experience, AKIDA allows advanced driver assistance systems (ADAS) such as computer vision to detect vehicles, pedestrians, bicyclists, signs, and objects with incredibly high levels of precision.
Specifically, pairing two-stage object detection inference algorithms with local AKIDA silicon enables computer vision systems to efficiently perform processing in two primary stages – at the sensor (inference) and AI accelerator (classification). Using this paradigm, computer vision systems rapidly and efficiently analyze vast amounts of inference data within specific ROIs.
Intelligently refining inference data eliminates the need for compute heavy hardware such as general-purpose CPUs and GPUs that draw considerable amounts of power and increase the size and weight of computer vision systems. It also allows ADAS to generate incredibly detailed real-time 3D maps that help drivers safely navigate busy roads and highways.

Assisted Driving – LiDAR
Automotive companies are also leveraging a sequential computation model with AKIDA-powered smart sensors and AI accelerators to enable the design of new LiDAR systems. This is because most LiDAR typically rely on general-purpose GPUs – and run cloud-centric, compute-heavy inference models that demand a large carbon footprint to process enormous amounts of data.
Indeed, LiDAR sensors typically fire 8 to 108 laser beams in a series of cyclical pulses, each emitting billions of photons per second. These beams bounce off objects and are analyzed to identify and classify vehicles, pedestrians, animals, and street signs. With AKIDA, LiDAR processes millions of data points simultaneously, using only minimal amounts of compute power to accurately detect – and classify – moving and stationary objects with equal levels of precision.

Limiting inference to a ROI helps automotive manufacturers eliminate compute and energy heavy hardware such as general-purpose CPUs and GPUs in LiDAR systems – and accelerate the rollout of advanced assisted driving capabilities.

Essential AI
Scalable AKIDA-powered smart sensors bring common sense to the processing of automotive data – freeing ADAS and in-cabin systems to do more with less by allowing them to infer the big picture from the basics. With AKIDA, automotive manufacturers are designing sophisticated edge AI systems that deliver immersive end-user experiences, support the ever-increasing data and compute requirements of assisted driving capabilities, and enable the self-driving cars and trucks of the future.

To learn more about how BrainChip brings Essential AI to the next generation of smarter cars, download our new white paper here.
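
(As a purely illustrative aside for the technically minded: the “two-stage” ROI idea described in that blog post - a cheap detector proposing regions of interest, with the heavier classifier only ever seeing those crops - boils down to something like the sketch below. The functions `detect_rois`, `classify_roi` and `process_frame` are made-up placeholders of my own, not BrainChip's actual API.)

```python
# Illustrative sketch of two-stage, ROI-limited inference (hypothetical models,
# not the real Akida/MetaTF API). Stage 1 finds regions of interest cheaply;
# stage 2 runs the heavier classifier only on those crops.
from typing import List, Tuple
import numpy as np

Box = Tuple[int, int, int, int]  # x, y, width, height

def detect_rois(frame: np.ndarray, score_threshold: float = 0.5) -> List[Box]:
    """Stage 1 (at the sensor): cheap detector returning candidate boxes.
    Placeholder logic: pretend bright frames contain one region of interest."""
    h, w = frame.shape[:2]
    return [(0, 0, w // 2, h // 2)] if frame.mean() > 127 * score_threshold else []

def classify_roi(crop: np.ndarray) -> str:
    """Stage 2 (on the accelerator): heavier classifier on the cropped ROI.
    Placeholder for a real neural-network call."""
    return "vehicle" if crop.mean() > 100 else "background"

def process_frame(frame: np.ndarray) -> List[Tuple[Box, str]]:
    """Only ROI crops ever reach the expensive second stage."""
    results = []
    for (x, y, w, h) in detect_rois(frame):
        label = classify_roi(frame[y:y + h, x:x + w])
        results.append(((x, y, w, h), label))
    return results

if __name__ == "__main__":
    dummy_frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint8)
    print(process_frame(dummy_frame))
```

The whole point of that architecture is that the expensive second stage never touches the full frame, which is where the claimed latency and power savings come from.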




I was actually referring to this BrainChip webpage:


which until at least mid-July still looked like this under “Automotive” (there are also other categories, namely Home/Industrial/Health and Wellness, but it is the Auto/Automotive one you’ll see first):

773D2460-B987-486B-BA71-2BF864565A34.jpeg





What I framed in red is now gone from that page…
 
Last edited:
  • Wow
Reactions: 1 users

sb182

Member

I’m just back to get some ammunition and holy water…. And something else…. If you see Yoda, tell him he should move his ass!

View attachment 71638
😔
Hopefully they clean the ears as well

View attachment 71642

The point is that he might not be up to date on the latest developments - neither regarding BrainChip nor Intel, because he no longer works there. I can easily imagine that internally at Intel he was already acting like a know-it-all. He's just an Intel Loihi fanboy who wants to talk down BrainChip. Additionally, maybe he's a shorter? Who knows. His profile says "investor" - maybe he is investing in another company and trying to down-ramp openly? DH

Look……I don’t think they’ve got a smarter solution than our nice beautiful brains, okay? Huuuuuge difference. But………., never mind, to each their own …right? That’s true …We’re not living in a planned economy..We’re not living in little Rocket Man’s land!
BUT EVERYONE, BELIEVE ME, EVERYONE WILL WANT A PIECE OF OUR BRAIN… they’ll come… because it’s the only real brain… beautiful brain! AND OUR BRAIN WILL GET BIGGER AND STRONGER THAN ANY OTHER BRAIN… remember my words… it’s true!

View attachment 71696
Dog Dont Look At Me GIF by michaelmarczewski
 
  • Haha
  • Like
Reactions: 5 users

Frangipani

Regular
Excellent research !...all I would say is, yes, of course Mercedes Benz are testing every available ground-breaking technology, and yes we are still engaged based on the fact I was told they were a client...if we aren't currently engaged with Mercedes well then, the company needs to be held to account for misrepresentation on our home page...Please email the sales team in the US to clarify the current position.

Speculation can be silenced very quickly.

Regards...Tech.

Hi Tech,

what’s the point of emailing the sales team in the US? They will only be able to tell us that they can’t comment on the progress of that relationship, as it is subject to an NDA. Unless MB has unequivocally told them “Sorry, you’re out…”, they may not even know themselves where exactly they stand with MB.

The points I raised suggest to me that Mercedes-Benz are still weighing their options. This doesn’t necessarily mean we are no longer engaged with them. Yet, it is an indisputable fact we are not the only ones in the neuromorphic space presently engaged with them.

What we can say for sure, though, is that not all of those engagements will ultimately translate into the signing of an IP license.
And so we wait…

Regards
Frangipani
 
  • Like
  • Fire
Reactions: 12 users

Dallas

Regular
  • Like
  • Love
  • Fire
Reactions: 22 users

FiveBucks

Regular
And another week goes by without a contract announcement.

2 months left for Sean to live up to his AGM statement of having deals done by the end of the year.

Tick, tock.
 
  • Like
  • Sad
  • Thinking
Reactions: 14 users

Frangipani

Regular
A video going along with that paper was uploaded to YouTube yesterday:



Both paper and video relate to another paper and video published by the same Uni Tübingen authors earlier this year. At a cursory glance, at least the videos (posted about six months apart) appear to be VERY similar:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416900


View attachment 70372
View attachment 70373


Now compare the slides to those in the video uploaded October 3:

View attachment 70368


View attachment 70369

View attachment 70370

In fact, when I just tried to cursorily compare the new paper to the March 15 paper that @Fullmoonfever had linked at the time (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-416313), I discovered that the link he had posted then now connects directly to this new paper, published on September 16, so it seems to be an updated version of the previous paper.

I did notice the addition of another co-author, though: Sebastian Otte, who used to be a PhD student and postdoc at Uni Tübingen (2013-2023) and became Professor at Uni Lübeck’s Institute for Robotics and Cognitive Systems just over a year ago, where he heads the Adaptive AI research group.

0d00f748-f1ff-44f9-be7c-849d5e0b8583-jpeg.70378



To put into perspective the result that our competitors’ neuromorphic offerings fared worse than Akida in the benchmarking tests:
In all fairness, it should be highlighted that Akida’s superiority was at least partly due to the fact that AKD1000 is available as a PCIe board, whereas SynSense’s DynapCNN was connected to the PC via USB and - as the excerpt Gazzafish already posted shows - the researchers did not have direct access to a Loihi 2 edge device, only access through a virtual machine provided by Intel via their Neuromorphic Research Cloud. The benchmarking would obviously yield more comparable results if the actual hardware used were of a similar form factor:

“Our results show that the better a neuromorphic edge device is connected to the main compute unit, e.g., as a PCIe card, the better the overall run-time.”
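
To get a feel for why that matters, here is a toy back-of-the-envelope sketch (the latency figures are invented for illustration, not taken from the paper) showing how per-call transfer overhead can swamp the actual on-chip compute time once the device sits behind a slower link:

```python
# Toy illustration (made-up latencies, not measurements from the paper):
# total per-inference time = host<->device transfer overhead + on-chip compute.
COMPUTE_MS = 0.3  # assumed on-chip inference time, identical for all setups

TRANSFER_OVERHEAD_MS = {
    "PCIe card (e.g. AKD1000 board)": 0.05,
    "USB-attached dev kit": 1.5,
    "remote VM in a research cloud": 40.0,
}

for setup, overhead in TRANSFER_OVERHEAD_MS.items():
    total = overhead + COMPUTE_MS
    share = 100 * overhead / total
    print(f"{setup:35s} {total:7.2f} ms/inference  ({share:4.1f}% spent on the link)")
```

With identical silicon, the reported end-to-end numbers would still differ simply because of how each device is attached to the host.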


Anyway, Akida undoubtedly impressed the researchers, and as a result they are considering further experiments: “(…) future work could involve evaluating the system with an additional Akida PCIe card.”


View attachment 70374


In an earlier post (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-426404), I had already mentioned that the paper’s first author, Andreas Ziegler, who is doing a PhD in robotics and computer vision at Uni Tübingen, has meanwhile completed his internship at Sony AI in Switzerland (that - as we know - partially funded the paper’s research):

View attachment 70375


Fun fact: One of his co-authors, Karl Vetter, however, is no longer with Uni Tübingen’s Cognitive Systems Lab, but has since moved to France, where he has been working as a research engineer for…

🥁 🥁 🥁 Neurobus for the past three months!
It’s a small world, isn’t it?! 😉

View attachment 70376
View attachment 70377


Three days ago, first author Andreas Ziegler gave a talk on the recent table tennis robot research conducted at Uni Tübingen 👆🏻 during the Neuromorphic Vision Hackathon at ZHAW (Zürcher Hochschule für Angewandte Wissenschaften / Zurich University of Applied Sciences), where robotics and neuromorphic computing expert Yulia Sandamirskaya (ex Intel Labs) heads the Research Centre “Cognitive Computing in Life Sciences” at ZHAW’s Wädenswil campus.

56D0AC50-F2AB-4F1E-9676-49F80A3BB562.jpeg



While the content of his presentation is not new for those of you who already read the paper or saw the video, I thought the way he presented it was quite cool, with all the embedded videos! Have a look yourselves:


Anyway, more exposure for Akida and those favourable benchmarking results (even though it is unclear how much influence the hardware’s form factor had, see my post above).



Here are some of the presentation slides:

D97A42A0-D3DF-43BF-AF67-702C59211B0B.jpeg


79EE8ADC-0203-489E-909B-9E1F7F644F51.jpeg

8288B92E-4FC0-4AD9-AD66-931543D93CDA.jpeg



A5E76D5C-17FF-4C68-A1DD-D2335E8D73BA.jpeg


21A7DD72-185F-4EE5-8F74-BFA1E8E92ABB.jpeg


C2D2DF1C-4131-4D19-B448-EF6F0F244563.jpeg


E308335D-3A07-44BE-BB8C-0224EEE8E101.jpeg


In Andreas Ziegler’s updated CV (https://andreasaziegler.github.io/), we can now see who his supervisors were during his internship at Sony AI (that funded this research): Raphaela Kreiser and Naoya Takahashi:

0DF9B55B-E44D-4587-BAED-8B9F65678FEA.jpeg



B01F373E-1189-4592-B100-7A09796B48C3.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 23 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Screenshot 2024-10-26 at 8.56.51 am.png
 
  • Like
  • Love
  • Thinking
Reactions: 31 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
  • Like
  • Love
  • Thinking
Reactions: 26 users

manny100

Regular
Probably been posted before. Explains PICO quite well.
Those who can 'speak and understand tech' can correct me, but it appears that for mobile phones and wearables ARM is just a popular choice rather than essential.
RISC-V is a less common alternative but, being open source, is free. On the other hand, ARM is optimised for mobiles etc.
How would PICO/AKIDA work with RISC-V?
Is Qualcomm talking RISC-V purely as a threat to ARM??? Or would they really contemplate a RISC-V/PICO combo???
 
  • Like
  • Love
  • Thinking
Reactions: 15 users