The penny dropped ...
Tony reportedly said that originally TENNs couldn't do recurrence.
So that meant TENNs needed to be tied to the original Akida 1 NPU: "Better with Akida".
But Tony also intimated that TENNs was sufficiently evolved to do recurrence by the time GenAI/Akida 3 were developed. He has also indicated that there is some (minimal) CPU involvement in the operation of TENNs, as distinct from the Akida NPUs, which do not require processor involvement in detection/inference/classification.
It was only after the announcement of GenAI/Akida 3 that we saw Akida 1 being spoken of only in terms of MACs, with no mention of the old NPUs.
So, when TENNs learned recurrence, was this the tipping point which meant that Akida 1 had been superseded?
I wonder what hardware improvements were needed to implement recurrence with TENNs. I imagine memory would be involved. In the Roadmap, JT mentioned that patents for improvements in memory operation were in the pipeline.
This is the patent application implementing recurrence:
US2025209313A1 METHOD AND SYSTEM FOR IMPLEMENTING ENCODER PROJECTION IN NEURAL NETWORKS (filed 20231222; published 20250625)
[0027] FIG. 10 is a flow chart of a method performed by the neural processor for performing, in the recurrent mode, encoder projection, mid-layer processing, and decoding which generates content using one or more neural network layers of the neural network in accordance with some embodiments.
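To picture what that FIG. 10 description is getting at, here is a rough sketch of a generic recurrent-mode step in Python/NumPy: encode the incoming input, fold it into a carried state (the mid-layer processing), and decode an output from that state. This is only my own illustration of the general idea, not the patented method, and every name, shape and weight in it is a placeholder I've invented.

import numpy as np

# Minimal, generic illustration of one "recurrent mode" step:
# project the current input through an encoder, fold it into a carried
# hidden state (the mid-layer processing), then decode an output.
# All names, sizes and weights below are made-up placeholders.
rng = np.random.default_rng(0)
d_in, d_state, d_out = 16, 32, 8
W_enc = 0.1 * rng.standard_normal((d_state, d_in))     # encoder projection
W_mid = 0.1 * rng.standard_normal((d_state, d_state))  # mid-layer (state) weights
W_dec = 0.1 * rng.standard_normal((d_out, d_state))    # decoder weights

def recurrent_step(x_t, state):
    """Process one time step: encode input, update state, decode output."""
    proj = W_enc @ x_t                     # encoder projection of the new input
    state = np.tanh(W_mid @ state + proj)  # carried state holds the history
    y_t = W_dec @ state                    # decode content for this step
    return y_t, state

state = np.zeros(d_state)
for _ in range(5):                         # inputs arrive one step at a time
    x_t = rng.standard_normal(d_in)
    y_t, state = recurrent_step(x_t, state)

The point to note is that each step only touches the current input and the small carried state, not a growing history.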
The patent was filed in December 2023. Edit: The original TENNs patent was filed in mid-2022:
WO2023250093A1 METHOD AND SYSTEM FOR IMPLEMENTING TEMPORAL CONVOLUTION IN SPATIOTEMPORAL NEURAL NETWORKS (filed 20220622)
... so the recurrent feature was the result of a further 18 months of development.
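As I read it (and this is just my interpretation, not anything stated in the filings), the attraction of running a temporal convolution recurrently is that you no longer re-read a whole input window at every step; you carry a small rolling state instead. A toy NumPy sketch of that equivalence, with made-up kernel and sequence sizes:

import numpy as np

# Toy check that a causal temporal convolution can also be evaluated in a
# streaming ("recurrent") fashion that only keeps the last K input samples,
# instead of re-convolving the whole sequence at every step.
# Kernel length K and sequence length T are arbitrary placeholders.
rng = np.random.default_rng(1)
K, T = 4, 20
kernel = rng.standard_normal(K)
x = rng.standard_normal(T)

# Convolutional mode: every output is computed with full access to the past.
conv_out = np.array([
    sum(kernel[j] * (x[t - j] if t - j >= 0 else 0.0) for j in range(K))
    for t in range(T)
])

# Recurrent/streaming mode: the only carried state is a K-sample FIFO.
fifo = np.zeros(K)
stream_out = []
for t in range(T):
    fifo = np.roll(fifo, 1)                  # shift older samples back
    fifo[0] = x[t]                           # newest sample at the front
    stream_out.append(float(kernel @ fifo))  # same dot product, tiny state

assert np.allclose(conv_out, np.array(stream_out))

Both modes give identical outputs; the recurrent form just swaps the full input buffer for a K-deep FIFO, which would be the kind of thing that shows up as memory-handling changes in the hardware.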
The patent application would have preceded the final circuit design. It may have been breadboarded in FPGA before the ASIC design was done for the commercial IP release.