Thank you, Dio, for providing us with some context and an explanation, in plain English, from your understanding.
Indeed, the step up in processing from the recognition of a predetermined wake word to the complexity of NLP in one generation is astounding to me.
And beyond the building blocks of dictionaries and thesauri, surely any system will require access to, and application of, the rules of syntax and grammar in order to produce anything more than a parroting of language?
Is the mere application of rules enough to provide an adequate simulacrum of consciousness?
Although, on reflection, that is somewhat the situation now with ChatGPT, isn't it?
It has, though, been refined and trained enough to furnish a usable tool.
So are you saying that some version of all this evaluation, processing and reconciliation (if provisioned with access to a sufficiently worthy model library held separately in memory) could be carried out in Akida 2000 hardware rather than in the current software emulations?
I would imagine that initially it would be bound within the confines of specific use cases, such as the teaching or translation of a particular language, perhaps, or some other definable subject such as biology.
Perhaps I am getting carried away with the immediate potential of the tech, as is my wont.
I suffer from a somewhat retrofuturistic syndrome brought about by a far too liberal dose of The Jetsons, Lost in Space and Star Trek in my television-irradiated youth.

Anyway, thank you again for all your continuing valuable contributions here.
Well done.