Quick question. The original roadmap describes Akida 2000 as "an optimised version of AKD 1500 for LSTM and transformers". We now know that Akida 2000 includes the transformer part, but what happened to the LSTM part? Or have LSTMs been replaced with TENNs? I tried to google information on TNNs but kept getting articles about tennis, which was entertaining but not particularly helpful. But then I discovered something that
@TechGirl posted from Carnegie Mellon University which describes TNNs as follows:
Processor Architecture: Temporal Neural Networks (TNN)
Temporal Neural Networks (TNNs) are a special class of spiking neural networks for implementing a class of functions based on space-time algebra. By exploiting time as a computing resource, TNNs can perform sensory processing with very low system complexity and very high energy efficiency compared to conventional ANNs and DNNs. Furthermore, a key feature of TNNs is the use of spike-timing-dependent plasticity (STDP) to achieve a form of machine learning that is unsupervised, continuous, and emergent.
View attachment 31412
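For anyone curious what the STDP rule mentioned in that quote actually looks like, here is a minimal sketch of classic pair-based STDP. To be clear, the function name, parameter values, and time constants below are all illustrative textbook choices, not anything from BrainChip's or CMU's actual TNN implementation:

```python
import math

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms).

    If the presynaptic spike precedes the postsynaptic spike the
    synapse is potentiated; otherwise it is depressed. The effect
    decays exponentially with the time difference. Parameter values
    are illustrative defaults, not from any real chip.
    """
    dt = t_post - t_pre
    if dt >= 0:
        return a_plus * math.exp(-dt / tau)   # pre before post: strengthen
    return -a_minus * math.exp(dt / tau)      # post before pre: weaken

# Apply the rule over a few observed spike pairs, clipping the weight to [0, 1]
w = 0.5
for t_pre, t_post in [(10.0, 15.0), (40.0, 38.0), (60.0, 61.0)]:
    w = min(1.0, max(0.0, w + stdp_dw(t_pre, t_post)))

print(round(w, 4))  # prints 0.5064
```

The "unsupervised, continuous, and emergent" part of the quote follows from the fact that this update depends only on locally observed spike times, with no labels or global error signal, so learning happens continuously as spikes arrive.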