This paper provides a good summary of the state of the art in SNNs and the challenges the field faces. One of the biggest limitations I can see with Akida 1.0/1000 is that it doesn't let you train natively on the SNN; it's limited to CNN/ANN-to-SNN conversion. I can see why they went that route, but you are then bounded by the performance of the original ANN you are converting from, and, apparently, you still incur all the costs of training the CNN in the first place. It's also not well suited to the academic community researching how to improve SNN learning algorithms and reduce simulation times.

To be a massive market success, my sense is we have to provide a better solution than existing approaches, not just from a power-consumption perspective: ease, speed and cost of training, implementation, accuracy, latency, power consumption, and so on. Akida 2.0 supposedly offers native SNN training, so I'm hoping it's a success and better suited to the researchers who are advancing the state of the art.
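To make the conversion point concrete, here is a minimal, hypothetical sketch of the usual rate-coding idea behind ANN-to-SNN conversion (not BrainChip's actual pipeline, and the function names are my own): a ReLU activation is approximated by the firing rate of an integrate-and-fire neuron over a fixed number of timesteps, so the converted network can at best approach, never exceed, what the original ANN computed.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def if_neuron_rate(x, timesteps=100, threshold=1.0):
    """Integrate-and-fire neuron driven by a constant input x; returns its spike rate."""
    v, spikes = 0.0, 0
    for _ in range(timesteps):
        v += x                   # integrate the (constant) input current
        if v >= threshold:       # fire, then reset by subtracting the threshold
            spikes += 1
            v -= threshold
    return spikes / timesteps    # firing rate approximates relu(x) for x in [0, 1]

for x in np.linspace(-0.5, 1.0, 7):
    print(f"x={x:+.2f}  relu={relu(x):.2f}  snn_rate={if_neuron_rate(x):.2f}")
```

The spike rate only approximates the ReLU output, and the approximation error shrinks as the number of timesteps grows, which is why converted SNNs trade latency (more timesteps) against accuracy while remaining capped by the source ANN.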
I have very little technical competence, so I'm posting this here to see if my observations are right.
www.mdpi.com

Overview of Spiking Neural Network Learning Approaches and Their Computational Complexities
Spiking neural networks (SNNs) are a topic that is gaining more and more interest nowadays. They more closely resemble actual neural networks in the brain than their second-generation counterparts, artificial neural networks (ANNs). SNNs have the potential to be more energy efficient...