Hi Dodge, I believe you are a very accomplished comedian in a parallel universe, and it's coming through in this one; jocularity is the spice of life, in my opinion.
Bloody typical - you couldn't trust Australians to make a tin boat, let alone a submarine, and now we can't even get disparity right.
ISC West is a massive security trade event.
“Get ready to unveil our AI/ML services as they are set to revolutionize the industry. Our range of products and services can make it easy for you to realize your goals.”
View attachment 32904
Very interesting in light of BrainChip’s recent announcement.
BrainChip expands its Ecosystem with Teksun to Bring the Akida Processor to Next-Generation AIoT Devices
Now let's revisit the AI-generated routines soon to be in the new Mercedes.
Thanks latnrollah,
At market launch of the new model series, the following options will be available:
1/Templates: With the help of these templates, customers can experiment with the basic types of support that are possible with the help of artificial intelligence. Examples here include "Cold days" (switching on the seat heating at a certain outside temperature, ambient lighting changes to warm orange) or "Date night" (romantic music is played via Bluetooth audio, ambient lighting turns pink). The templates are mostly stored in the backend. Some are stored locally in the vehicle, so they can be used even if there happens to be no connectivity. The templates are operated via the vehicle's central display and are arranged in carousel-style on the screen.
AI-generated routines[17]:
In the future, the aim is for the E-Class to fully automate recurring routine tasks if desired. Artificial intelligence (AI) will make this possible. The vehicle's ability to learn and evolve with the customer will represent a new level of intelligence. In the first stage of expansion, Mercedes-Benz will offer AI-generated routines for the driver and the seating system (ventilation, heating, massage). Other interior systems are to be successively integrated and further routines made possible.
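A "Cold days" style template is essentially a trigger/action rule. As a rough sketch of that idea (all names, fields and action strings here are illustrative, not Mercedes' actual API):

```python
# Illustrative only: a routine template as a simple trigger/action rule.
# Field names and action strings are made up for this sketch.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Routine:
    name: str
    trigger: Callable[[dict], bool]   # reads vehicle state, returns True/False
    actions: list = field(default_factory=list)

    def run(self, state: dict) -> list:
        # Fire the actions only when the trigger condition holds.
        return list(self.actions) if self.trigger(state) else []

cold_days = Routine(
    name="Cold days",
    trigger=lambda s: s["outside_temp_c"] <= 5.0,
    actions=["seat_heating:on", "ambient_light:warm_orange"],
)

print(cold_days.run({"outside_temp_c": 2.0}))   # both actions fire
print(cold_days.run({"outside_temp_c": 20.0}))  # nothing fires
```

An AI-generated routine, as described below, would amount to the system proposing such rules itself from observed driver behaviour rather than the customer authoring them.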
Then what is [17]???
[17]The functions described represent visions of the future, some of which are not yet available and will depend on the respective vehicle model, the individual configuration and the particular market.
From the Mercedes Feb 22 release, the Routines function seems like an added option with Akida. What is everyone's take on this?
Learning 🏖
Now let's revisit the AI-generated routines soon to be in the new Mercedes.
View attachment 32906
"Just like a fine wine", the car's software will get better over time.
So just think about all that for a minute.
I understand that Mercedes is talking about software, updated over time, to make its software more intelligent. But to run the software, hardware needs to be in the Mercedes; so we have the main CPU (Nvidia).
But remember this?
"BrainChip already is seeing some success. It’s Akida 1000 platform is being used in Mercedes-Benz’s Vision EQXX concept car for in-cabin AI, including driver and voice authentication, keyword spotting and contextual understanding."
Neuromorphic Computing Will Need Partners To Break Into The Datacenter
The emerging field of neuromorphic processing isn’t an easy one to navigate. There are major players in the field that are leveraging their size and ample…
www.nextplatform.com
BrainChip's Akida technology is about making sensors smarter. So with Mercedes' AI-generated routines being implemented in the new Mercedes, I do think Akida is processing driver and voice authentication, keyword spotting and contextual understanding (learning driver behaviours) at the sensor (processing only the relevant data). It sends that to the main CPU (Nvidia), and the CPU sends it to the cloud (data center). The data center processes the data and makes improvements to the software to make it more intelligent. Then the software updates in the car. So that's how it gets better over time.
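The loop described above can be sketched in a few lines, just to make the data flow concrete (everything here is hypothetical, not any vendor's real API):

```python
# Hypothetical sketch of the edge-to-cloud loop described above:
# the sensor-side processor forwards only relevant events, the cloud
# "learns" from them, and an updated model version flows back to the car.

def sensor_filter(samples, threshold=0.5):
    """Edge step: keep only events the in-cabin processor deems relevant."""
    return [s for s in samples if s["score"] >= threshold]

class Cloud:
    """Data-center step: new data yields an improved model version."""
    def __init__(self):
        self.model_version = 1

    def ingest(self, events):
        if events:
            self.model_version += 1
        return self.model_version   # pushed back to the car as an OTA update

cloud = Cloud()
events = sensor_filter([{"score": 0.9}, {"score": 0.1}, {"score": 0.7}])
print(len(events), cloud.ingest(events))  # 2 relevant events, model v2
```

The point of the edge step is exactly the "process relevant data" part: the cloud only ever sees the filtered events, not the raw sensor stream.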
Just my intoxicated thought (DYOR)
-------
On another note, in the Next Platform article:
"BrainChip also is working with SiFive to integrate the Akida technology with SiFive’s RISC-V processors for edge AI computing workloads and MosChip, running its Akida IP with the vendor’s ASIC platform for smart edge devices."
Good old MosChip...
Learning 🏖
This cameo by MOSCHIP comes out of left field. The 1000 Eyes has homework for the weekend.
My opinion only DYOR
FF
AKIDA BALLISTA
Interesting.
Offering a more measured and somewhat skeptical view on autonomous driving – unprecedented for Nvidia – Ali Kani, vice president and general manager of the automotive team, called fully autonomous vehicles “the next decade’s problems.”
“Fully autonomous vehicles are ‘the next decade’s problems.’”
Ali Kani
ChatGPT, here it comes.
“The impressive capabilities of generative AI have created a sense of urgency.”
Jensen Huang
It has probably/certainly already been posted but I can't find it here via the search.
Brainchip Readies Generation 2 Akida AI | Gestalt IT Rundown: March 8, 2023
https://gestaltit.com/rundown/tom/b...n-2-akida-ai-gestalt-it-rundown-march-8-2023/
Hoppy, Growing old is mandatory, Growing up is optional..........
"with SP going down on many companies"
Thank you for your lovely thoughts there, Mr Pope. Even though I'll turn 66 years old later this year, if I survive that long, there still lives within me a sniggering 12-year-old. Is it just me, or do some of us just never grow up?
Someone gave me a T-shirt with:
Hoppy, Growing old is mandatory, Growing up is optional..........
Maybe we can deduce a few things about whether or not Akida IP lurks in the Renesas MCU based on the Cortex-M85 with Helium:
ARM adds custom instructions to M85 controller in RISC-V AI fightback
Technology News | March 13, 2023
By Nick Flaherty
RISC-V | IoT | AI | MPUs/MCUs
ARM has developed a new version of its high end Cortex-M85 microcontroller, allowing chip designers to add custom instructions for AI applications.
This has been one of the key advantages of the competing RISC-V instruction set and signals a fight back by ARM ahead of the Embedded World (EW2023) exhibition in Nuremberg, Germany, this week.
Renesas Electronics will be showing the first live demonstrations of artificial intelligence (AI) and machine learning (ML) implementations on the previous generation M85 design.
This first demonstration showcases a people detection application developed in collaboration with UK ML startup Plumerai that identifies and tracks persons in the camera frame in varying lighting and environmental conditions.
The second demo showcases a motor control predictive maintenance use case with an AI-based unbalanced load detection application using Tensorflow Lite for Microcontrollers with CMSIS-NN.
The latest version of the M85 core, Cortex-M85 r1, includes ARM Custom Instructions (ACI) which allow designers to include custom defined data processing instructions directly on the controller. These were first proposed in 2019 in the ARMv8.1 architecture but are only now being implemented in designs, in part to address the increasing popularity of cores based on the customisable RISC-V instruction set.
“With ACI on Cortex-M85, the user does not have to turn to alternative architectures to implement a desired instruction encoding. Instead, this can now be done on CPUs that are based on the ARM architecture, with Cortex-M85 being the first high-performing microcontroller to provide this option,” said ARM. “Through ACI, the user is given the power to innovate within the proven ARM architecture, while maintaining the ecosystem advantages of the Cortex-M CPUs.”
ACI is suitable for applications using specialized bit-field processing, trigonometric functions, and image pixel manipulations, and can also be applied to accelerate frequently used data processing functions. It is also being implemented on the Cortex-M55 core.
AI demonstrations
In the meantime, Renesas is taking advantage of the Helium technology in the core which will be part of the RA (Renesas Advanced) family of MCUs.
Helium is ARM’s M-Profile Vector Extension, available as part of the Armv8.1M architecture and provides a boost for machine learning (ML) and digital signal processing (DSP) applications, accelerating applications such as endpoint AI.
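To make "Helium accelerates ML/DSP" concrete: the workhorse of such workloads is a fixed-point multiply-accumulate loop over q15 data, which MVE vectorises. A plain-Python reference of that kernel follows (illustrative only; real CMSIS-NN code is C with intrinsics):

```python
# Illustrative reference of the q15 (1.15 fixed-point) dot product that
# DSP/ML inner loops compute; Helium/MVE vectorises exactly this pattern.
Q15 = 1 << 15

def to_q15(x: float) -> int:
    """Convert a float in [-1, 1) to a saturated q15 integer."""
    return max(-32768, min(32767, round(x * Q15)))

def q15_dot(a, b) -> float:
    acc = 0                       # wide accumulator, wider than 16 bits
    for x, y in zip(a, b):
        acc += x * y
    return acc / (Q15 * Q15)      # back to a float for readability

a = [to_q15(v) for v in (0.5, -0.25, 0.125)]
b = [to_q15(0.5)] * 3
print(q15_dot(a, b))  # 0.1875
```

On a Helium-capable core the same loop is done several lanes at a time with a wide hardware accumulator, which is where the advertised ML/DSP uplift comes from.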
“We’re proud to again lead the industry in implementing the powerful new Arm Cortex-M85 processor with Helium technology,” said Roger Wendelken, Senior Vice President in Renesas’ IoT and Infrastructure Business Unit.
“By showcasing the performance of AI on the new processor, we are highlighting technical advantages of the new platform and at the same time demonstrating Renesas’ strengths in providing solutions for emerging applications with our innovative ecosystem partners.”
“We’re excited to take part in this ground-breaking demonstration,” said Roeland Nusselder, CEO of Plumerai. “ARM’s Helium technology supported on the new RA MCUs with the Cortex-M85 core significantly accelerates the Plumerai inference engine.”
“This performance uplift will enable our customers to use larger and more accurate versions of Plumerai’s People Detection AI, add additional product features, and extend battery life,” he said. “Our customers have an insatiable appetite for adding new and more accurate AI features that run on a microcontroller. Together with Renesas, we are the first to fulfil this demand.”
The Cortex-M85 core supports ARM TrustZone alongside a Renesas integrated cryptographic engine, immutable storage, key management and tamper protection against DPA/SPA side-channel attacks. The Armv8-M architecture also brings Pointer Authentication/Branch Target Identification (PAC/BTI) security extension, a new architectural feature that provides enhanced mitigation from software attack threats and helps achieve PSA Certified Level 2 certification.
The new RA MCUs based on the Cortex-M85 core will be supported by Renesas’ Flexible Software Package (FSP). This enables faster application development by providing all the infrastructure software needed, including multiple RTOS, BSP, peripheral drivers, middleware, connectivity, networking, and security stacks as well as reference software to build complex AI, motor control and graphics solutions.
It also allows developers to integrate their own legacy code and choice of RTOS with FSP, providing full flexibility in application development.
It'll be interesting to know whether Mercedes will let its drivers easily take their experienced cockpit with them when they flip their cars.
This also opens up the possibility of a subscription-based model where, for a fee, a driver can “own” his/her trained Akida cockpit and plug it into any other Mercedes they buy in the future, rather than having to retrain from scratch with every new car they buy!
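Purely speculative, but the mechanics of a "portable cockpit" could be as simple as exporting the learned preferences and importing them in the next car. A toy sketch (the JSON format and every field name below are invented for illustration; Akida's actual learned state would be an opaque weight blob, not readable preferences):

```python
# Speculative sketch: export/import a driver's learned profile so it can
# move between vehicles. All names and the format are made up.
import json
import os
import tempfile

def export_profile(profile: dict, path: str) -> None:
    with open(path, "w") as f:
        json.dump(profile, f)

def import_profile(path: str) -> dict:
    with open(path) as f:
        return json.load(f)

old_car = {"seat_heating_pref": 3, "wake_words": ["hey mercedes"]}
path = os.path.join(tempfile.gettempdir(), "driver_profile.json")
export_profile(old_car, path)      # leaving the old car
new_car = import_profile(path)     # "plugging in" to the new one
print(new_car == old_car)  # True
```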
No mention of BrainChip nor Valeo... is Merc going with Luminar and ditching Valeo?
The latest investor presentation of Nviso. Pure dynamite, enjoy: