Steve Brightfield provided insight into what BrainChip is doing with earbuds and hearing aids.
"We're using neuromorphics for the input signaling because of the advantages it offers at that stage, but when we get to some of the large language models, we don't use the neuromorphic algorithms, we use state space models. And we're looking at combining those two into a single platform, so that you get enhancement of the signal going into the speech recognition, and the LLM can predict what word is being said next and improve the accuracy of the recognition.

And then you can have a local large language model in your earpiece that might hold only a very limited set of information, but it's what you need, right? One of the interesting use cases is a memory LLM for an elderly person. It would say, 'Oh, your granddaughter's name is Shelley and she's four years old.' So you don't forget this stuff, boom, you have it, right? You just need some cues now and then to get your memory back.

And this is one of those interesting things where, as the National Institutes of Health says, this could be really good, because it helps address the finding that if you can't hear well, you start developing dementia and cognition problems. It's very important to keep your hearing intact to keep your brain healthy, because that's what your brain is doing: it's constantly processing those signals."
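The idea of an LLM predicting the next word to improve recognition accuracy is essentially language-model rescoring: the acoustic front end proposes candidate words, and the language model's next-word probabilities re-rank them. This is a minimal toy sketch of that mechanism, not BrainChip's actual pipeline; the scores, weights, and word lists are invented for illustration.

```python
import math

def rescore(candidates, lm_probs, lm_weight=0.5):
    """Pick the best candidate word by combining scores.

    candidates: dict mapping word -> acoustic log-score from the recognizer
    lm_probs:   dict mapping word -> P(word | context) from the language model
    lm_weight:  how strongly the LM's opinion counts (a tunable assumption)
    """
    best_word, best_score = None, float("-inf")
    for word, acoustic_log_score in candidates.items():
        # Floor unseen words at a tiny probability so log() stays defined.
        lm_log_prob = math.log(lm_probs.get(word, 1e-6))
        score = acoustic_log_score + lm_weight * lm_log_prob
        if score > best_score:
            best_word, best_score = word, score
    return best_word

# In the context "she's ___ years old", the acoustic model slightly prefers
# "for", but the LM strongly favors "four", so rescoring recovers it.
word = rescore(
    candidates={"four": -1.2, "for": -1.0},
    lm_probs={"four": 0.4, "for": 0.001},
)
print(word)  # prints "four"
```

In a real system the same weighting trick is applied per step over whole hypothesis lattices rather than single words, but the trade-off it expresses is the one Brightfield describes: the acoustic signal and the language prior each vote on what was said.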