Slymeat
Move on, nothing to see.
I also thought this tweet looked promising when I first saw it, but in the referenced article they state that transformers are superior to LSTMs. They go on to give an example of visual processing where the parallel processing of distant pixels gives transformers another advantage.

Hi Chippers, in the context of the confirmation that a new chip (Akida2000) will be taped out this year, and FF's highlighting that the known new features will include LSTM (long short-term memory) and transformers, please access the article mentioned in this tweet by Edge Impulse last night. It highlights the direction neural networks are going, including both LSTM and transformers, which increase speed and efficiency compared to existing CNNs. The article is written by an employee of Synopsys, but I've taken it as a good example of how we remain a step ahead of the competition.
I did not see that article as supporting BrainChip’s LSTM aspirations.
From the referenced article:
“Unlike RNNs and LSTMs that must read a string of text sequentially, transformers are significantly more parallelizable and can read in a complete sequence of words at once, allowing them to better learn contextual relationships between words in a text string.”
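To make the distinction the article is drawing concrete, here is a minimal sketch of my own (in PyTorch, with made-up shapes; not code from the article or from BrainChip): an LSTM cell has to be stepped through the sequence one token at a time because each step depends on the previous hidden state, while self-attention scores every token against every other token in a single batched matrix product.

import torch
import torch.nn as nn

seq_len, d_model = 8, 16
x = torch.randn(1, seq_len, d_model)  # (batch, tokens, features)

# LSTM: each step needs the previous hidden state, so the loop
# over time cannot be parallelised.
cell = nn.LSTMCell(d_model, d_model)
h = torch.zeros(1, d_model)
c = torch.zeros(1, d_model)
for t in range(seq_len):
    h, c = cell(x[:, t, :], (h, c))

# Self-attention: all pairwise token interactions are computed at once,
# so distant positions (words, or pixels in a vision model) interact
# in a single step with no recurrence over time.
q = k = v = x
scores = torch.softmax(q @ k.transpose(-2, -1) / d_model**0.5, dim=-1)
out = scores @ v  # (1, seq_len, d_model)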
and