Slymeat
Move on, nothing to see.
NVIDIA have sold the sledgehammer approach to AI very well (I call it AI by GRUNT) and have been rewarded fabulously for it, perhaps a hangover from crypto mining. And while excessive energy consumption is still not treated as an issue for data centres and many AI systems, which simply add extra green (and perceived to be free) energy generators (wind, solar, etc.), I doubt NVIDIA will feel any need to consider energy efficiency. They sell the perception of AI by GRUNT and want everyone to think that's the only way to do it. It seems NVIDIA are still determined to ignore neuromorphic computing, with continued mention of air and water cooling.
I call it a perception of AI, because throwing immense resources at a problem in order to do something clever very quickly is not my idea of AI.
The world doesn’t yet comprehend what intelligence is, let alone artificial intelligence, nor AI at the edge. Power consumption at the edge is critical—and this is where Akida will rule.
I view intelligence as reasoning to an acceptably correct result given completely new information, and even with information missing.
The closest thing I’ve ever seen to true AI is the Tiger demonstration by BrainChip. It was trained with a single toy tiger, oriented in one direction, maybe two, and was able to reason that a picture of a real tiger is also a tiger. Now that IS AI, and it should be shouted out to the world as loudly as possible.
One-shot, and even two-shot learning is critical to getting people to accept the true worth of Akida.
AI by GRUNT systems need to be trained with thousands of different pictures, and can then only choose a solution if presented with an image that is contained within that training set, or is so close that it is pretty much indistinguishable. That is simply a database lookup and has zero intelligence. It can only use logic that is coded into it. It cannot deviate from that code!
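The one-shot idea above can be sketched in a few lines: instead of training on thousands of labelled images, store a single example embedding per class and classify a new input by similarity to those lone prototypes. This is a toy illustration only, not BrainChip's actual method; the hand-written vectors stand in for whatever feature extractor a real system would use.

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def one_shot_classify(prototypes, query):
    # prototypes: {label: embedding} with exactly ONE example per class.
    # Returns the label whose lone prototype is most similar to the query.
    return max(prototypes, key=lambda label: cosine(prototypes[label], query))

# Toy embeddings (stand-ins for a real feature extractor's output).
prototypes = {
    "tiger": np.array([0.9, 0.1, 0.0]),
    "zebra": np.array([0.1, 0.9, 0.0]),
}

# A "real tiger" image embeds near, but not identical to, the toy-tiger prototype,
# yet is still recognised from that single training example.
print(one_shot_classify(prototypes, np.array([0.8, 0.3, 0.1])))  # tiger
```

The contrast with AI by GRUNT is the size of `prototypes`: one sample per class here, versus thousands of samples per class in a conventionally trained network.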
Meta AI and ChatGPT have astounded me with the relevance of their answers. But still, not AI IMHO.