One of the things I've been wondering about is whether Mercedes' adoption of ChatGPT will preclude the use of the "Hey Mercedes!" function of EQXX fame.
https://media.mercedes-benz.com/article/323212b5-1b56-458a-9324-20b25cc176cb
- MBUX beta programme starts June 16, 2023
Customers can participate via the Mercedes me app or directly from the vehicle using the voice command “Hey Mercedes, I want to join the beta programme”. The rollout of the beta programme will happen over the air. Mercedes-Benz is integrating ChatGPT through Azure OpenAI Service, leveraging the enterprise-grade capabilities of Microsoft’s cloud and AI platform.
...
ChatGPT complements the existing intuitive voice control via Hey Mercedes. While most voice assistants are limited to predefined tasks and responses, ChatGPT leverages a large language model to greatly improve natural language understanding and expand the topics to which it can respond.
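For anyone curious what the plumbing might look like: the press release only says the integration goes through Azure OpenAI Service, so here's a minimal sketch of a backend handoff from a transcribed "Hey Mercedes" utterance to a chat model, using the openai Python package. The endpoint, API version, deployment name and prompts are my own placeholders, not anything Mercedes has published.

```python
# Illustrative sketch only -- not Mercedes' implementation. Shows a transcribed
# utterance being forwarded to an Azure OpenAI chat deployment.
import os
from openai import AzureOpenAI  # pip install openai

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # assumed env var
    api_key=os.environ["AZURE_OPENAI_API_KEY"],          # assumed env var
    api_version="2023-05-15",                            # example API version
)

def answer_free_form_query(transcribed_speech: str) -> str:
    """Send the transcribed utterance to the chat deployment and return its reply."""
    response = client.chat.completions.create(
        model="gpt-35-turbo",  # hypothetical Azure deployment name
        messages=[
            {"role": "system", "content": "You are an in-car voice assistant."},
            {"role": "user", "content": transcribed_speech},
        ],
    )
    return response.choices[0].message.content

print(answer_free_form_query("Hey Mercedes, suggest a good detour for lunch."))
```

The point being: the wake word and the language model are separate stages, so "Hey Mercedes" still has to fire before anything gets sent to the cloud.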
From the original Jan 2022 EQXX report:
https://media.mercedes-benz.com/article/d31bf12a-a2d4-4978-b176-17c6a0fea6dc
Neuromorphic computing – a car that thinks like you
Another key efficiency feature of the VISION EQXX that takes its cue from nature is the way it thinks. It uses an innovative form of information processing called neuromorphic computing. The hardware runs so-called spiking neural networks. Information is coded in discrete spikes and energy is only consumed when a spike occurs, which reduces energy consumption by orders of magnitude.
Working with California-based artificial intelligence experts BrainChip, Mercedes-Benz engineers developed systems based on BrainChip’s Akida hardware and software. The example in the VISION EQXX is the “Hey Mercedes” hot-word detection. Structured along neuromorphic principles, it is five to ten times more efficient than conventional voice control.
Although neuromorphic computing is still in its infancy, systems like these will be available on the market in just a few years. When applied on scale throughout a vehicle, they have the potential to radically reduce the energy needed to run the latest AI technologies.
...
The road trip sidekick in the VISION EQXX is also fun to talk to. The further development of the “Hey Mercedes” voice assistant is emotional and expressive thanks to a collaboration between Mercedes-Benz engineers and the voice synthesis experts from Sonantic. With the help of machine learning, the team have given “Hey Mercedes” its own distinctive character and personality. As well as sounding impressively real, the emotional expression places the conversation between driver and car on a whole new level that is more natural and intuitive, underscoring the progressive feel of the modern luxury conveyed by the UI/UX in the VISION EQXX.
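The spiking-neural-network bit is the genuinely interesting part. As a toy illustration of why event-driven processing is cheap, here is a single leaky integrate-and-fire neuron in plain Python. This has nothing to do with BrainChip's actual Akida hardware or the production hot-word detector; it just shows the principle the press release describes: downstream work only happens when a spike actually occurs.

```python
# Toy leaky integrate-and-fire neuron: integrates input, fires when a threshold
# is crossed, and resets. Only spike events would trigger further processing.

def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Yield the time steps at which the neuron spikes."""
    potential = 0.0
    for t, current in enumerate(inputs):
        potential = potential * leak + current   # integrate with leak
        if potential >= threshold:               # fire only on threshold crossing
            potential = 0.0                      # reset after the spike
            yield t

# Mostly-silent input: only a few samples carry energy, so only a few spikes
# (and only those would wake up the rest of the assistant).
audio_energy = [0.0, 0.1, 0.0, 0.9, 0.8, 0.0, 0.0, 1.2, 0.0, 0.0]
print(list(lif_neuron(audio_energy)))  # prints [4, 7]
```

In a conventional always-on detector, every sample costs roughly the same; here the silent stretches cost almost nothing, which is presumably where the claimed five-to-ten-times efficiency gain comes from.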
... but really, nothing to see here ...