Now let's revisit:
AI-generated routines soon to be in the new Mercedes.
"Just like a fine wine," the car's software will get better over time.
So just think about all that for a minute.
I do understand that Mercedes is talking about software, updated over time, to improve and become more intelligent. BUT, to run the software, hardware needs to be in the Mercedes; so we have the main CPU (Nvidia).
But remember this?
"BrainChip already is seeing some success. Its Akida 1000 platform is being used in Mercedes-Benz's Vision EQXX concept car for in-cabin AI, including driver and voice authentication, keyword spotting and contextual understanding."
Source: www.nextplatform.com
BrainChip's Akida technology is about making "sensors smarter". So with the AI-generated routines being implemented in the new Mercedes, I do think AKIDA is processing driver and voice authentication, keyword spotting and contextual understanding (learning driver behaviours) at the sensor (processing only the relevant data). It sends that to the main CPU (Nvidia), and the CPU sends it to the cloud (data center). The data center processes the data and makes improvements to the software, making it more intelligent. Then the software updates in the car. So that's how it "gets better over time."
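To make that loop concrete, here's a minimal Python sketch of the data flow described above. Every name in it (the functions, the keyword labels, the vehicle ID) is made up for illustration — this is not any real BrainChip, Nvidia or Mercedes API, just the edge-filter → central compute → cloud → new-model-version idea.

```python
# Hypothetical sketch of the sensor -> CPU -> cloud -> update loop.
# All names are illustrative, not a real API.

def akida_sensor_filter(audio_frames, keywords):
    """Edge step: the sensor keeps only the relevant frames (keyword spotting)."""
    return [f for f in audio_frames if f["label"] in keywords]

def central_cpu_forward(events):
    """Central compute step: batch the relevant events for upload to the cloud."""
    return {"vehicle_id": "demo-001", "events": events}

def data_center_improve(batch, model_version):
    """Cloud step: stand-in for retraining -- bump the model version if there is new data."""
    return model_version + 1 if batch["events"] else model_version

# One trip's worth of fake sensor frames.
frames = [
    {"label": "hey_mercedes"},   # relevant
    {"label": "road_noise"},     # irrelevant, dropped at the sensor
    {"label": "open_sunroof"},   # relevant
]

relevant = akida_sensor_filter(frames, {"hey_mercedes", "open_sunroof"})
batch = central_cpu_forward(relevant)
new_version = data_center_improve(batch, model_version=1)
print(len(relevant), new_version)  # 2 relevant events forwarded; model bumped to v2
```

The design point is the first step: only relevant events ever leave the sensor, so the CPU and the cloud see a tiny fraction of the raw data — which is the whole "make sensors smarter" pitch.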
Just my intoxicated thoughts (DYOR).
-------
On another note, in the Next Platform article:
"BrainChip also is working with SiFive to integrate the Akida technology with SiFive’s RISC-V processors for edge AI computing workloads and MosChip, running its Akida IP with the vendor’s ASIC platform for smart edge devices."
Good old MosChip...
Learning 🏖
Interesting
MAR 22, 2023
Robocar No Longer Drives Nvidia GTC
At Nvidia’s GPU Technology Conference (GTC) this week, the shiniest new object was ChatGPT. Autonomous driving is already an old story.
What’s at stake:
For years, Nvidia has hung its hat on autonomous driving as the linchpin of its AI technologies. However, that revenue stream is waning, not because AV is a solved problem but because it is just too hard a problem to solve. What’s the next big AI application?
GTC, put together by Nvidia, is one of the world’s premier conferences dedicated to AI developers. Nvidia has used the venue to showcase its AI prowess built on GPU technology.
For several years, AI-enabled autonomous driving has highlighted every GTC. Nvidia presented its AI solutions — deployed in data centers for AI training and its multi-thousand teraflops SoC inside vehicles’ central brains doing AI inference.
However, it was evident in a pre-GTC briefing this week that Nvidia has begun singing a markedly different tune on fully automated driving.
Fully autonomous vehicles are ‘the next decade’s problems.’
Ali Kani
Offering a more measured and somewhat skeptical view on autonomous driving – unprecedented for Nvidia – Ali Kani, vice president and general manager of the automotive team, called fully autonomous vehicles “the next decade’s problems.”
Given that “amazing companies with multibillion dollar investment, like Argo.ai, have gone under,” Kani said that it is “very reasonable” for OEMs to focus on making ADAS better and treating L4 as a long-term goal.
Further, he cautioned that autonomous driving “isn’t like AI development demo – as a kind of a fun little thing I can show.” People’s lives are at stake, and “there is no room for error,” he added.
“Robotaxi companies, trying to make it work in one city, are having challenges … and they are the best tech companies in the world,” he said. To make matters worse, “they’re trying to solve the problem worldwide in a passenger car … You can’t map that world every week. It’s impossible. What they must solve is an entirely different problem an order of magnitude harder.”
In other words, reality bites – even at GTC where AI always seems invincible.
The impressive capabilities of generative AI have created a sense of urgency.
Jensen Huang
ChatGPT, here it comes
Fast forward to the keynote speech Nvidia founder and CEO Jensen Huang delivered Tuesday.
Huang stressed, “the impressive capabilities of generative AI have created a sense of urgency” among companies eager to reimagine products and business models.
With computing advancing at “lightspeed,” Huang discussed a broad set of partnerships crucial to build an ecosystem for everything from training to deployment for cutting-edge AI services. These encompass Google, Microsoft, Oracle and a range of leading businesses that bring new AI, simulation and collaboration capabilities to every industry.
In other words, shelve autonomous driving, think ChatGPT.
Nvidia’s automotive business
Put simply, Nvidia is no longer counting on AVs to sell its automotive SoCs.
Instead, it has thrown an arsenal of technologies at automakers, urging them to use AI to support software development and plan new automotive plants.
The pitch at GTC to the automotive industry was Nvidia’s Omniverse platform and AI. Nvidia is now saying that automakers can leverage both Omniverse and AI to improve their “product lifecycle from concepts and styling, design and engineering, software and electronics, smart factories, autonomous driving and retail.”
According to announcements at GTC, Nvidia is working with the BMW Group – not on the vehicle front, but on vehicle production infrastructure. By building and operating industrial metaverse applications across BMW Group’s production network, Nvidia claims that it is helping BMW on an electric vehicle plant in Debrecen, Hungary, projected to open in 2025.
As for the central compute SoC, Nvidia’s automotive SoCs (such as Orin or Thor) won’t be powering BMW’s next-generation models. Instead, it will be Qualcomm’s automotive SoC platform.
Nonetheless, one big design win Nvidia announced at GTC this week is BYD. China’s leading EV company is building its mainstream vehicles on Nvidia’s Drive Orin SoC.
Other tier ones and OEMs are also adopting Nvidia’s platform for next-gen vehicle development, claimed Nvidia. They include DeepRoute, Pony.ai, Lenovo, Rimac Technology and smart. However, not every customer is embracing both hardware and software designed by Nvidia.
Nvidia said that carmakers are taking advantage of Nvidia’s flexible offerings. Nvidia’s platform is “open,” with parts of it available in “modules.” Some customers are building their own cockpit application software. Others are even developing their own autonomous driving stacks.
All said and done, Nvidia estimated that its automotive design win pipeline has increased to $14 billion over the next six years (2023 – 2028 inclusive). What matters to its automotive revenue isn’t just accumulating design wins inside vehicle models, but the support infrastructure Nvidia’s AI and Omniverse can extend to automakers.
Bottom line:
More automakers are backing off autonomous vehicles and opting for improved ADAS. In this environment, Nvidia’s automotive SoCs could prove to be overkill. Nvidia’s revised AI strategy for the auto industry appears to be cultivating an alternative path – supporting OEMs’ production and product lifecycles. It is hardly clear, however, if this will prove more lucrative than pursuing design wins for Nvidia’s automotive SoC design in every AV model.