Interesting thought. I wonder if AKIDA can be taught different languages like Chinese, Japanese, Arabic, Spanish etc.? As you don't really chat to a car, there will only be a few instructions, so it won't need a lot of memory to handle that.
That is the monumental advantage of AKIDA personalisation through on-chip learning.
Everyone has an individual voiceprint which includes their accent, dialect, educational attainment, native language and even speech impediments, and the so-called natural language deep learning systems need to have been pre-programmed with huge data sets to have any chance of responding when a user issues a command.
In the last several weeks I posted a paper from a Google-funded research effort that discussed this problem, in particular for people with disabilities, and called for samples of speech from people with speech impediments.
Years ago now the NSW Police introduced a speech recognition telephone call centre, and it took me a week to work out that the way to get it to connect me to Parramatta Police Station was to say "Parramadda".
The personalisation that on-chip learning creates means that for critical functions like voice ID giving you access to your car, house, business or palace, even if you have a speech impediment or speak like Prince Charles, you can train AKIDA with one or several shots to recognise your commands. The language you speak is completely irrelevant.
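To make the idea concrete, here is a minimal sketch of few-shot command enrolment by prototype matching, which is one generic way such personalisation can work. It is not BrainChip's actual API or AKIDA's internal method; the names (FewShotCommandRecogniser, embed_utterance) and the crude spectral front end are my own illustrative assumptions so the example runs end to end.

```python
import numpy as np

def embed_utterance(audio):
    """Hypothetical front end: map a raw audio clip to a fixed-length
    feature vector. A real system would use a proper (e.g. spike-encoded)
    feature extractor; this crude spectral summary is just a stand-in."""
    spectrum = np.abs(np.fft.rfft(audio, n=256))
    return spectrum / (np.linalg.norm(spectrum) + 1e-9)

class FewShotCommandRecogniser:
    """Enrol each spoken command from one or a few example utterances,
    then classify new audio by nearest-prototype cosine similarity."""

    def __init__(self, threshold=0.8):
        self.prototypes = {}      # label -> unit-length prototype vector
        self.threshold = threshold

    def enrol(self, label, examples):
        # Average the embeddings of the few enrolment shots into one prototype.
        vecs = np.stack([embed_utterance(a) for a in examples])
        proto = vecs.mean(axis=0)
        self.prototypes[label] = proto / (np.linalg.norm(proto) + 1e-9)

    def recognise(self, audio):
        v = embed_utterance(audio)
        best_label, best_score = None, -1.0
        for label, proto in self.prototypes.items():
            score = float(np.dot(v, proto))   # cosine similarity (unit vectors)
            if score > best_score:
                best_label, best_score = label, score
        # Reject anything too far from every enrolled command.
        return best_label if best_score >= self.threshold else None

# Usage sketch: enrol "unlock" from two shots in the owner's own voice and
# language, then test on a slightly noisy repeat of the same utterance.
rng = np.random.default_rng(0)
shot1, shot2 = rng.normal(size=4000), rng.normal(size=4000)
rec = FewShotCommandRecogniser()
rec.enrol("unlock", [shot1, shot2])
print(rec.recognise(shot1 + 0.05 * rng.normal(size=4000)))
```

The point of the sketch is only this: the system never needs a huge pre-programmed language data set, just a handful of the user's own utterances and a similarity threshold, which is why the language (or impediment) of the speaker doesn't matter.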
This is one of the multitude of unmatched reasons why this extraordinary technology will become ubiquitous across the world.
My opinion only DYOR
FF
AKIDA BALLISTA