This study from Carnegie Mellon University, released 4/4/24, appears to be focused on Akida. Enjoy
Here is another study. It's too long, so follow the link.
"arXiv:2404.15312v1 [eess.SP] 02 Apr 2024
Realtime Person Identification via Gait Analysis
Shanmuga Venkatachalam, Harideep Nair, Prabhu Vellaisamy, Yongqi Zhou, Ziad Youssfi, John Paul Shen
ECE Department, Carnegie Mellon University
{shanmugv, hpnair, pvellais, yongqiz2, zyoussfi, jpshen}@andrew.cmu.edu
Abstract
Each person has a unique gait, i.e., walking style, that can be used as a biometric for personal identification. Recent works have demonstrated effective gait recognition using deep neural networks; however, most of these works predominantly focus on classification accuracy rather than model efficiency. In order to perform gait recognition using wearable devices on the edge, it is imperative to develop highly efficient low-power models that can be deployed onto small form-factor devices such as microcontrollers. In this paper, we propose a small CNN model with 4 layers that is very amenable to edge AI deployment and realtime gait recognition. This model was trained on a public gait dataset with 20 classes, augmented with data collected by the authors, for a total of 24 classes. Our model achieves 96.7% accuracy and consumes only 5 KB of RAM, with an inference time of 70 ms and 125 mW of power, while running continuous inference on an Arduino Nano 33 BLE Sense. We successfully demonstrated realtime identification of the authors with the model running on the Arduino, underscoring its efficacy and providing a proof of feasibility for deployment in practical systems in the near future.
Index Terms:
Human Gait, Biometric Identification, Inertial Sensors, Arduino, Neuromorphic Akida, Edge AI
I Introduction
In the field of biometric identification, traditional methods such as fingerprint and facial recognition dominate. However, gait analysis is fast emerging as a unique and promising approach for identifying a person. Gait, the distinctive way an individual walks, carries inherent characteristics that can be leveraged for accurate, non-intrusive person identification [1, 2]. Unlike static biometrics, such as fingerprints and facial features, gait analysis taps into the dynamic and behavioral aspects of an individual’s movement. Every person has a distinct gait, influenced by factors like anatomy, musculoskeletal structure, and personal habits. This distinctiveness makes gait analysis an intriguing and effective tool for identifying individuals in diverse settings, ranging from surveillance and security to healthcare and rehabilitation.
For this course project (Figure 1), we experimented with using lightweight convolutional neural network (CNN) models for edge-based gait detection for person identification. We perform pre-processing of the raw gait signals and model the CNN on the Edge Impulse framework. We use a popular gait dataset and further augment it with raw data collected from our team to train and test our model. We deploy our model on an Arduino Nano 33 BLE board for live inference and demonstration. Our results demonstrate highly accurate gait detection, and we performed a live demonstration to show its efficacy.
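As a rough illustration of what a model at this scale could look like, here is a minimal Keras sketch of a compact 4-layer 1D CNN over 6-axis IMU windows. The window length, filter counts, and layer widths are assumptions for illustration only; the paper states a 4-layer CNN with 24 output classes but does not list the exact architecture.

```python
# Minimal sketch, NOT the authors' architecture: only "4 layers" and
# "24 classes" come from the paper; everything else is assumed.
import tensorflow as tf

WINDOW_LEN = 128   # assumed window length in samples (~2.56 s at 50 Hz)
NUM_CLASSES = 24   # 20 public classes + authors' data = 24, per the paper

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW_LEN, 6)),              # accel xyz + gyro xyz
    tf.keras.layers.Conv1D(8, 5, activation="relu"),           # layer 1: temporal features
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Conv1D(16, 5, activation="relu"),          # layer 2
    tf.keras.layers.MaxPooling1D(2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(32, activation="relu"),              # layer 3
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),  # layer 4: 24-way classifier
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

A model of roughly this size is what makes the reported 5 KB RAM / 70 ms inference budget on a microcontroller plausible.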
Further, we also deploy the model on a smartphone.
Finally, we convert our CNN model to its event-based spiking neural network (SNN) equivalent via the BrainChip MetaTF framework and deploy the SNN to the BrainChip Akida processor [3]. We obtain real-time power and latency measurements.
Figure 1: Overall Framework
II Methods
This section details the methods and experimental setup undertaken for this work. First, the dataset including our custom data collection procedure is described, followed by the edge inferencing pipeline, and finally our training methodology.
II-A Dataset
We use the whuGAIT dataset [4]. A total of 118 subjects participated in the data collection process. Within this group, 20 subjects gathered data over a span of two days, generating thousands of samples each. Meanwhile, 98 subjects undertook a more concise data collection, spanning one day and resulting in hundreds of samples each. Each data sample comprises both 3-axis accelerometer and 3-axis gyroscope data, all recorded at a uniform sampling rate of 50 Hz.
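As a concrete reference for how such 50 Hz, 6-axis recordings are typically segmented before training, the sketch below shows a simple sliding-window segmentation. The window length and stride are illustrative assumptions and are not taken from the paper.

```python
# Sliding-window segmentation sketch; window_len and step are assumed values.
import numpy as np

def segment(signal: np.ndarray, window_len: int = 128, step: int = 64) -> np.ndarray:
    """Split a (num_samples, 6) accel+gyro recording into overlapping windows."""
    windows = [signal[start:start + window_len]
               for start in range(0, len(signal) - window_len + 1, step)]
    return np.stack(windows) if windows else np.empty((0, window_len, 6))

# Example: 10 s of dummy 50 Hz data -> (num_windows, 128, 6)
dummy = np.random.randn(500, 6)
print(segment(dummy).shape)   # (6, 128, 6) with these assumed settings
```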
Figure 2: Raw data example from each of the 20 classes
...
III-C Deploy on BrainChip Akida
As a third alternative, we convert the trained CNN to an equivalent SNN using BrainChip MetaTF and assess inference metrics on remotely accessible BrainChip Akida processors (physically located at CMU’s Silicon Valley campus). As shown in Figure 15, the converted SNN mapped to the BrainChip Akida processor consumes about 880 mW at an average frame rate of 22.73 fps. The inference energy consumed is 45.92 mJ/frame. This is only a preliminary analysis that needs deeper investigation for power and energy optimization.
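For readers unfamiliar with the MetaTF flow, a hedged sketch of the typical quantize-convert-map sequence with the cnn2snn and akida Python packages is shown below. It assumes the Keras model from the earlier sketch, the bit-width arguments are illustrative, and the exact quantization API varies across MetaTF versions; this is a sketch of the general flow, not the authors' scripts.

```python
# Hedged sketch of a CNN-to-SNN conversion flow with BrainChip MetaTF.
import akida
from cnn2snn import convert, quantize

# Quantize the trained Keras model (variable `model` from the earlier sketch);
# bit-widths are illustrative and the quantization API differs between MetaTF versions.
quantized = quantize(model, weight_quantization=4, activ_quantization=4)
# (A short quantization-aware fine-tuning pass would normally follow here.)

akida_model = convert(quantized)        # event-based SNN equivalent of the CNN

devices = akida.devices()               # e.g. the remotely hosted Akida boards
if devices:
    akida_model.map(devices[0])         # place the SNN onto the Akida processor
    # Calling akida_model.predict(...) on uint8 input windows then returns class
    # potentials; frame-rate and power figures are read back via the SDK's
    # device/model statistics.
```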
...
V Conclusion
Our work serves as a proof of feasibility for deploying highly efficient yet effective lightweight models onto small form-factor edge devices, such as an Arduino or a smartphone, to identify a person by their gait. Our model, trained on a standard dataset augmented with the team members’ gait data, achieves 96% accuracy on 24 classes while requiring only 70 ms of inference time, 5 KB of RAM, and 125 mW of power. Further, our work serves as a first step towards deploying a gait recognition model on a neuromorphic device such as BrainChip Akida. Future investigation will focus heavily on optimizing the model, adding data from physically challenged individuals as well as additional subjects for diversification and bias reduction, and optimizing the neuromorphic model."
https://arxiv.org/html/2404.15312v1
___________
This use case is absolutely new to me. Exciting stuff!