Hi FF,
Personally, I disagree with the comments made here about this. The word "inference" has different meanings depending on which industry you are referring to. In the context of machine learning, inference refers to feeding data into a trained model or algorithm to compute an output (see link below).
So in the context here, "processing inference" simply means running the sensor data through the model. There's nothing more to read into it, and certainly no reference to LSTMs.
On top of this, since English isn't Anil's first language, I think some leniency is warranted, so there's not much point reading too deeply into how his sentences are structured.
Pure speculation, DYOR
cloud.google.com
Machine learning inference is the process of running data points into a machine learning model to calculate an output such as a single numerical score. This process is also referred to as “operationalizing a machine learning model” or “putting a machine learning model into production.”
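To make that quoted definition concrete, here's a toy sketch of what "running data points into a model to calculate an output" looks like. The model, weights, and sensor values are all made up for illustration; real inference just does this at scale with a trained model:

```python
# Toy illustration of ML inference: new data goes in, a single
# numerical score comes out. The "trained" parameters below are
# invented for this example, not from any real model.

def predict(weights, bias, features):
    """One inference step: weighted sum of the inputs plus a bias."""
    return sum(w * x for w, x in zip(weights, features)) + bias

# Hypothetical parameters learned earlier, during training.
weights = [0.4, -0.2, 0.1]
bias = 0.05

# A fresh sensor reading fed into the model -> one numerical score,
# which is exactly what the quoted definition describes.
reading = [21.5, 3.0, 0.8]
score = predict(weights, bias, reading)
print(round(score, 3))
```

Training is where the weights come from; inference is everything after that, which is why "processing inference" on a device just means pushing sensor data through an already-built model.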