IndepthDiver
Regular
Hi Stuart, on Saturday they are releasing a commercial, downloadable version of the whole chat "brain" of OpenAI. They had to port some of it to Apache. It is optimized for the M1 chip but runs on everything. Any kid with a laptop can grab this entire stack. He explains and documents it all.
In 10 minutes, many people can download and run this thing. Next is text-to-video.
I thought this information was genuinely informative.
I think this is a great find, though I don't think many others on here have understood its significance.
I just got around to watching the video you linked. As you say, they have essentially trained a chatbot similar to ChatGPT, which they call GPT4All. They took Meta's large language model (similar to GPT-3), which is available for research purposes, then fine-tuned it on ChatGPT prompts. The end result is a smaller chatbot model that can run on someone's laptop and which, notably enough, is a 4-bit model.
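The reason a 4-bit model fits on an ordinary laptop is that each weight is stored in 4 bits instead of the usual 32-bit float, an 8x reduction in memory. As a rough illustration only (this is not GPT4All's actual code, and the function names here are made up for the sketch), simple linear quantization looks something like this:

```python
import numpy as np

def quantize_4bit(weights):
    """Map float weights onto 16 integer levels (4 bits each)."""
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 15  # 2**4 - 1 = 15 steps
    q = np.round((weights - w_min) / scale).astype(np.uint8)
    return q, scale, w_min

def dequantize_4bit(q, scale, w_min):
    """Recover approximate float weights at inference time."""
    return q.astype(np.float32) * scale + w_min

# Demo on a tiny random weight vector.
weights = np.random.randn(8).astype(np.float32)
q, scale, w_min = quantize_4bit(weights)
approx = dequantize_4bit(q, scale, w_min)

# Each entry of q fits in 4 bits (values 0..15); the reconstruction
# error per weight is at most half a quantization step.
print(q, np.max(np.abs(weights - approx)))
```

Real 4-bit schemes quantize per block of weights rather than the whole tensor, which keeps the error much smaller, but the trade-off is the same: a little precision exchanged for a model small enough for consumer hardware.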
Now, the limitation with using Meta's model is that it can't legally be used for commercial purposes. So in a few days' time they'll be releasing another small model trained on a large open-source dataset, which can be used commercially.
Unlike the big companies, they are releasing these models for everyone to use. Not only that, but they're releasing the instructions so that anyone can create their own models that can be used for commercial purposes.
This is just the start of almost anyone being able to make their own chatbot models for business opportunities. And as the guy was saying in the video, the chatbot they produce could potentially run on something as small as a Raspberry Pi. Others in the machine learning community will eventually improve on these models and make them smaller and better, so hobbyists and the like will now have a way of building their own versions.
There's a good chance that big companies are already talking to BrainChip about using Akida for these purposes. But what this open sourcing of chatbots will do is enable more custom and innovative AI applications that will continue to grow. Since it can be done for less than $1000, a lot of people will be able to create their own chatbots; it will no longer be limited to just the companies with millions of dollars.
If you were going to run a small chatbot on something like a Raspberry Pi, you'd ideally want an efficient, high-performance machine learning chip so it wouldn't run too slowly. This is where I think Akida is in the right place at the right time.
Pure speculation, DYOR