What I loved was this reply to this article on LinkedIn
Hi Quiltman,
A. "proprietary quantization" = "awesome results - dramatic improvements in memory movement"
B. "Pending patents on TENNs-based super fast RAG/LLM at the edge"
C. "Next gen TENNs in the pipeline."
That is the most tightly packed post I've ever seen!
You could write a thesis on each of them, and commercially each one is a revenue multiplier.
On a different track, here is a recently published BRN patent application for GenAI recurrence:
US2025209313A1, "METHOD AND SYSTEM FOR IMPLEMENTING ENCODER PROJECTION IN NEURAL NETWORKS" (filed 2023-12-22, published 2025-06-26)
A neural network system that includes a memory and a processor. The memory stores a plurality of storage buffers corresponding to a current neural network layer, and implements a neural network that includes a plurality of neurons for that layer and a corresponding group among a plurality of groups of basis function values. The processor receives an input data sequence into the storage buffers over a first time sequence and projects it onto the corresponding basis function values: for each connection of a corresponding neuron, it takes the dot product of the input data sequence held in a corresponding storage buffer with the corresponding basis function values, thereby determining a potential value for that neuron. Using these potential values, the processor then generates a plurality of encoded output responses.
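To make the abstract concrete, here is a minimal sketch of what such an encoder projection could look like. This is my own illustration, not BrainChip's implementation: the buffer length, the number of basis functions, and the choice of Legendre polynomials as the basis are all assumptions (the claim only says "basis function values").

```python
import numpy as np

# Illustrative sketch only, not the patented implementation.
# Each neuron keeps a sliding storage buffer of the last T input
# samples and projects that buffer onto fixed basis function values
# via a dot product, producing one potential per basis function.

T = 8        # storage buffer length (time window) - assumed value
N_BASIS = 4  # basis functions per neuron - assumed value

# Basis function values sampled across the window. Legendre
# polynomials are an assumption for illustration.
t = np.linspace(-1.0, 1.0, T)
basis = np.stack(
    [np.polynomial.legendre.Legendre.basis(k)(t) for k in range(N_BASIS)]
)  # shape (N_BASIS, T)

def encode(sequence):
    """Slide a buffer over `sequence`; at each step project the buffer
    onto the basis (dot products) to get potential values, then emit
    an encoded output response per step."""
    buffer = np.zeros(T)
    outputs = []
    for x in sequence:
        buffer = np.roll(buffer, -1)
        buffer[-1] = x                       # newest sample enters buffer
        potentials = basis @ buffer          # one dot product per basis fn
        outputs.append(np.tanh(potentials))  # encoded output response
    return np.array(outputs)                 # shape (len(sequence), N_BASIS)

out = encode(np.sin(np.linspace(0.0, 2.0 * np.pi, 32)))
print(out.shape)  # (32, 4)
```

The appeal of this style of recurrence is that the buffer-plus-projection replaces a trained recurrent state update, which is roughly the kind of memory-movement saving the LinkedIn reply was celebrating.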