Hi Fuz,

Didn't realise that state space models were developed by Prof Chris Eliasmith from the University of Waterloo.
The very same person that Mercedes has signed an MoU for research collaboration with.
Mercedes MoU
Now there's a bit of morphic resonance.
It was only 3 days ago that I mentioned Eliasmith, in relation to this:
TENNs-PLEIADES: Building Temporal Kernels with Orthogonal Polynomials
Yan Ru Pei, Olivier Coenen
https://arxiv.org/html/2405.12179v3
...
The seminal work proposing a memory encoding using orthogonal Legendre polynomials in a recurrent state-space model is the Legendre Memory Unit (LMU) [33], where Legendre polynomials (a special case of Jacobi polynomials) are used. The HiPPO formalism [11] then generalized this to other orthogonal functions including Chebyshev polynomials, Laguerre polynomials, and Fourier modes. Later, this sparked a cornucopia of works interfacing with deep state space models including S4 [12], H3 [2], and Mamba [10], achieving impressive results on a wide range of tasks from audio generation to language modeling. There are several common themes among these networks from which PLEIADES differs. First, these models typically only interface with 1D temporal data, and usually try to flatten high-dimensional data into 1D before processing [12, 37], with some exceptions [21]. Second, instead of explicitly performing finite-window temporal convolutions, a running approximation of the effects of such convolutions is performed, essentially yielding a system with infinite impulse responses where the effective polynomial structures are distorted [31, 11]. And in the more recent works, the polynomial structures are tenuously used only for initialization, but then made fully trainable. Finally, these networks mostly use an underlying depthwise structure [14] for long convolutions, which may limit the network capacity, albeit reducing the compute requirement of the network.
[33] Aaron Voelker, Ivana Kajić, and Chris Eliasmith. Legendre Memory Units: Continuous-time representation in recurrent neural networks. Advances in Neural Information Processing Systems, 32, 2019. [Uni of Waterloo]
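To make the distinction in that excerpt concrete, here's a minimal sketch (my own illustration, not code from the PLEIADES paper or the LMU) of the "explicit finite-window" side: build a small bank of temporal kernels by sampling Legendre polynomials over a fixed window, then apply them as ordinary FIR convolutions. The function names and parameters are hypothetical; an LMU/HiPPO-style model would instead maintain a running (IIR) state-space approximation of these projections.

```python
import numpy as np
from numpy.polynomial import legendre as L

def legendre_kernels(order: int, window: int) -> np.ndarray:
    """Return an (order, window) kernel bank; row k samples P_k on [-1, 1]."""
    t = np.linspace(-1.0, 1.0, window)
    # np.eye(order)[k] is the coefficient vector selecting the k-th Legendre polynomial.
    kernels = np.stack([L.legval(t, np.eye(order)[k]) for k in range(order)])
    # Normalize each kernel so the basis rows have comparable scale.
    return kernels / np.linalg.norm(kernels, axis=1, keepdims=True)

def encode(signal: np.ndarray, kernels: np.ndarray) -> np.ndarray:
    """Explicit finite-window (FIR) convolution of a 1D signal with each kernel."""
    return np.stack([np.convolve(signal, k[::-1], mode="valid") for k in kernels])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(256)
    K = legendre_kernels(order=4, window=32)
    features = encode(x, K)
    print(K.shape, features.shape)  # (4, 32) (4, 225)
```

Because the window is finite, the polynomial structure of each kernel is exact rather than the distorted effective shape that the quoted paragraph attributes to running IIR approximations.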
Our PLEIADES paper differentiates our SSM from the Legendre-polynomial approach (the LMU) that Eliasmith and his co-authors proposed.