
A Computational Event-Driven Sensor with In-Sensor Spiking Neural Network for Motion Recognition
This article introduces a transformative computational event-driven vision sensor featuring a WSe2-based photodiode. The sensor directly converts dynamic motion into programmable, sparse spiking signals, overcoming the limitations of conventional frame-based sensors and enabling motion recognition to be performed directly at the sensory terminal.
In-Sensor Spiking Neural Network Simulations: Simulations employed PyTorch v1.7.1 to build a single-layer SNN with 128×128 input neurons and three output neurons. The dataset comprised 300 instances of three gestures, and the weights were trained by gradient descent. An ANN with the same number of input and output neurons was constructed and then converted to the SNN.
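The single-layer architecture described above can be sketched with leaky integrate-and-fire output neurons. The layer dimensions (128×128 inputs, three outputs) follow the article; the neuron dynamics, threshold, leak factor, and random data below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def lif_layer(spike_frames, weights, threshold=1.0, leak=0.9):
    """Single-layer SNN sketch: weighted input events charge the membrane
    potentials of the output neurons; a neuron spikes and resets when its
    potential crosses the threshold. Dynamics/threshold are assumptions."""
    n_out = weights.shape[1]
    v = np.zeros(n_out)                        # membrane potentials
    spike_counts = np.zeros(n_out, dtype=int)  # output spikes per neuron
    for frame in spike_frames:                 # frame: flattened 128*128 event vector
        v = leak * v + frame @ weights         # leaky integration of weighted events
        fired = v >= threshold
        spike_counts += fired
        v[fired] = 0.0                         # reset after spiking
    return spike_counts

# Hypothetical usage with random sparse event frames (stand-in for gesture data).
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.05, size=(128 * 128, 3))
frames = (rng.random((10, 128 * 128)) < 0.01).astype(float)  # ~1% active pixels
counts = lif_layer(frames, weights)
pred = int(np.argmax(counts))  # classify by the most active output neuron
```

In this rate-coded readout, the gesture class is taken to be the output neuron that fires most often over the event stream, mirroring how a converted ANN's largest activation maps to the most active spiking neuron.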
Conclusion
In conclusion, the study introduced a groundbreaking computational event-driven vision sensor that generates adjustable current spikes exclusively in response to changes in light intensity, achieving a temporal resolution of 5 μs. Leveraging WSe2 photodiodes, the researchers demonstrated non-volatile, programmable photoresponsivity that varies linearly with light intensity. This property allowed synaptic weights to be emulated directly within the sensor, contributing to its versatility and adaptability.

Furthermore, integrating output neurons enabled an in-sensor SNN that achieved 92% accuracy in motion recognition tasks. Executing motion recognition directly within event-driven sensory terminals represents a significant advance, opening the way to real-time edge-computing vision chips. This innovation holds promise for a range of applications and points toward efficient, high-performance vision systems with edge computing capabilities.