stuart888
This is from the August 21 report:
Looks like AKD500 will be next
What does the road ahead look like for Akida?
AKD1000 is only the beginning of what the company expects to be a robust product portfolio (Figure 33):
- AKD500: BRN is in the planning stages for AKD500, a low-cost version of the AKD1000 aimed at the consumer products market.
- AKD1500: BRN is also developing and prototyping AKD1500, which will have additional features to execute Long Short-Term Memory (LSTM) and Transformer networks. LSTM is a form of recurrent neural network that can learn order dependence in sequence prediction problems; it finds applications in machine translation and speech recognition. Transformer networks, on the other hand, find their applications in natural language processing (NLP) due to their ability to resolve the vanishing gradient problem, which limits how long memory is retained by the network and is critical for processing lengthy texts.
- AKD2000: The optimised version of AKD1500; the company has already started work on a prototype at its lab in Perth.
We know that adding LSTM functionality is in the works. Very comforting, as the number of use cases skyrockets when dealing with more complex sequence prediction problems.
I would think that whatever comes next is also focused on helping Mercedes/NASA/Renesas/MegaChips too. Since LSTM works well with speech recognition, Mercedes phase II might well be part of their focus.
My humble thought: they would want to add benefits for as many current and future customers as possible, prioritising projects that customers want to start sooner over those with later timeframes.
What is a Sequence Prediction problem?
The sequence prediction problem consists of finding the next element of an ordered sequence by looking only at the sequence's items. This problem covers many applications, such as product recommendation, forecasting, web page prefetching, speech recognition, and machine translation.
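To make the idea concrete, here is a minimal sketch of sequence prediction in plain Python (nothing to do with Akida's actual implementation): a simple bigram model that predicts the next item as the most frequent successor seen in past sequences. All names here are illustrative.

```python
from collections import Counter, defaultdict

def train_bigram_predictor(sequences):
    """Count, for each item, which item most often follows it."""
    follows = defaultdict(Counter)
    for seq in sequences:
        for cur, nxt in zip(seq, seq[1:]):
            follows[cur][nxt] += 1
    return follows

def predict_next(follows, item):
    """Return the most frequent successor of `item`, or None if unseen."""
    if item not in follows:
        return None
    return follows[item].most_common(1)[0][0]

# Toy usage data: three observed sequences of user actions.
history = [
    ["wake", "coffee", "email"],
    ["wake", "coffee", "news"],
    ["wake", "coffee", "email"],
]
model = train_bigram_predictor(history)
print(predict_next(model, "coffee"))  # "email": seen twice after "coffee"
```

An LSTM tackles the same task but can condition on the whole preceding sequence rather than just the last item, which is why it handles longer-range order dependence.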
LSTM networks are indeed an improvement over plain RNNs, as they can achieve whatever RNNs might achieve with much better finesse. LSTMs do provide better results and are truly a big step in deep learning. With them, you can expect more accurate predictions and a better understanding of what choices to make.
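What gives an LSTM its edge over a plain RNN is its gated cell state. As a rough sketch (scalar inputs, toy hand-picked weights; a real network is vector-valued and learns its weights by backpropagation), one LSTM step looks like this:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One scalar LSTM step. `w` maps each gate name to
    (input weight, recurrent weight, bias); all values illustrative."""
    f = sigmoid(w["f"][0] * x + w["f"][1] * h_prev + w["f"][2])   # forget gate
    i = sigmoid(w["i"][0] * x + w["i"][1] * h_prev + w["i"][2])   # input gate
    o = sigmoid(w["o"][0] * x + w["o"][1] * h_prev + w["o"][2])   # output gate
    g = math.tanh(w["g"][0] * x + w["g"][1] * h_prev + w["g"][2]) # candidate memory
    c = f * c_prev + i * g   # cell state: keep some old memory, add some new
    h = o * math.tanh(c)     # hidden state: the part of memory exposed as output
    return h, c

# Toy weights, identical for every gate just to run the step.
w = {k: (0.5, 0.5, 0.0) for k in ("f", "i", "o", "g")}
h, c = 0.0, 0.0
for x in (1.0, -1.0, 0.5):
    h, c = lstm_step(x, h, c, w)
```

The forget gate `f` is the key design choice: because the cell state `c` is updated additively rather than squashed through an activation at every step, gradients survive over many time steps, which is exactly the long-range order dependence a plain RNN struggles with.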
Over the past year, the pace of improvement on all fronts is clear: the website, benchmarks, partners, and IP expansion.
Very happy here, with all the progress team Brainchip is making!