Hi IDD, Hi FF,
Personally, I disagree with all the comments made here about this. The word "inference" has different meanings depending on which industry you are referring to. In the context of machine learning, inference refers to data being fed into the machine learning model or algorithm (see link below).
So in the context here, "processing inference" simply refers to processing sensor data. Nothing further to read into here, and definitely no reference to LSTMs.
On top of this, since English isn't Anil's primary language, I think some leniency may be required; there's not much point reading too deeply into how his sentences are structured.
Pure speculation, DYOR
Model inference overview | BigQuery | Google Cloud
cloud.google.com
Machine learning inference is the process of running data points into a machine learning model to calculate an output such as a single numerical score. This process is also referred to as “operationalizing a machine learning model” or “putting a machine learning model into production.”
You've left out the phrase "to calculate an output".
Inference is the estimation of the output from the input data.
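That input-to-output step can be sketched with a toy model. This is purely illustrative: the weights, bias, and input values below are made-up assumptions, not anything from a real trained model.

```python
# Toy "trained model": weights and bias are invented for illustration,
# not the result of any real training run.
weights = [0.4, -0.2, 0.7]
bias = 0.1

def infer(features):
    """Inference: run an input data point through the model to calculate an output score."""
    return sum(w * x for w, x in zip(weights, features)) + bias

score = infer([1.0, 2.0, 0.5])  # input data point -> single numerical score
print(score)
```

The point is simply that inference is not the input data alone: it is the computation that turns that input into an output.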
infer
[ɪnˈfəː]
VERB
- deduce or conclude (something) from evidence and reasoning rather than from explicit statements:
Inference, as I understand it, is the classification of an object from data that is similar to, but not identical with, already known data.
Akida has been processing inference for donkey's years, as exemplified by the use of the word 11 times in this 2019 presentation:
https://brainchip.com/wp-content/uploads/2019/10/BrainChip-Linley-Akida-Presentation_v5.pdf
For example, Akida may determine the classification of an object if it has a sufficient percentage of similarities to a known object, or if a part of the new object is sufficiently similar to a corresponding part of a known object.
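As a toy sketch of that idea (and not Akida's actual algorithm), similarity-threshold classification can be illustrated like this; the feature vectors, the matching-fraction similarity measure, and the 0.8 threshold are all illustrative assumptions:

```python
# Toy similarity-threshold classifier. NOT Akida's real method:
# the similarity measure and threshold here are invented for illustration.

def similarity(a, b):
    """Fraction of positions where two equal-length feature vectors match."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def classify(sample, known, threshold=0.8):
    """Return the label of the most similar known object,
    or None if nothing is similar enough."""
    best_label, best_score = None, 0.0
    for label, features in known.items():
        score = similarity(sample, features)
        if score > best_score:
            best_label, best_score = label, score
    return best_label if best_score >= threshold else None

known = {"cat": [1, 0, 1, 1, 0], "dog": [0, 1, 1, 0, 1]}
print(classify([1, 0, 1, 0, 0], known))  # 4/5 match with "cat" -> "cat"
```

A sample that falls below the threshold for every known object is left unclassified, which mirrors the "sufficient percentage of similarities" idea above.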