[BREAKING] Today, we proudly unveil the GenX320 Metavision® sensor - the smallest
and most power-efficient event-based vision sensor in the world!
https://bit.ly/3QgQoRY
Built for the next generation of smart consumer devices, GenX320 delivers new levels of intelligence, autonomy, and privacy to a vast range of fast-growing Edge market segments, including AR/VR headsets, wearables, smart camera and monitoring systems, IoT devices, and many more.
The sensor has been developed with a specific focus on the unique energy, compute, and size requirements of Edge AI vision systems. It enables robust, high-speed vision at ultra-low power, even in challenging operating and lighting conditions.
GenX320 key benefits include:
Ultra-fast event timestamping (1 µs) with flexible data formatting
Smart power management for ultra-low power consumption and wake-on-events (as low as 36 µW)
Seamless integration with standard SoCs, reducing external processing
Low-latency connectivity through MIPI or CPI data interfaces
AI-ready with on-chip histogram output for AI accelerators
Sensor-level privacy due to inherently sparse event data and static scene removal
Compatibility with Prophesee Metavision Intelligence software suite
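The on-chip histogram output mentioned above can be illustrated with a small sketch. This is a toy model under stated assumptions, not Prophesee's actual data format: each event is assumed to carry a pixel location, a polarity, and a microsecond timestamp, and events are binned into a dense two-channel count image of the kind a standard SoC's AI accelerator can consume (the 320×320 default mirrors what the sensor's name suggests, but is an assumption here).

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Event:
    """One event: which pixel changed, in which direction, and when.
    Field names are illustrative, not the sensor's actual output format."""
    x: int         # pixel column
    y: int         # pixel row
    polarity: int  # +1 = brightness increase, -1 = decrease
    t_us: int      # timestamp in microseconds (1 µs resolution)

def event_histogram(events, width=320, height=320):
    """Accumulate sparse events into a dense two-channel count image
    (one channel per polarity) - the frame-like tensor a conventional
    AI accelerator expects as input."""
    hist = np.zeros((2, height, width), dtype=np.uint16)
    for ev in events:
        channel = 0 if ev.polarity > 0 else 1
        hist[channel, ev.y, ev.x] += 1
    return hist
```

Because the event stream is sparse, only pixels that actually changed contribute to the histogram, which is also why static scenes produce almost no data (the privacy point above).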
Learn more about how the GenX320 overcomes current vision sensing limitations in Edge applications:
https://bit.ly/3QgQoRY
Event-Based Metavision® Sensor GENX320 | PROPHESEE
Prophesee launches GENX320, the world’s smallest and most power-efficient event-based vision sensor, bringing more intelligence, privacy and safety than ever to consumer Edge-AI devices.
The link leads to Prophesee's early adopters:
Zinn Labs,
ultraleap,
Xperi.
Zinn Labs' patent application for eye-tracking glasses:
WO2023081297A1 EYE TRACKING SYSTEM FOR DETERMINING USER ACTIVITY
Embodiments relate to an eye tracking system. A headset of the system includes an eye tracking sensor that captures eye tracking data indicating positions and movements of a user's eye. A controller (e.g., in the headset) of the tracking system analyzes eye tracking data from the sensors to determine eye tracking feature values of the eye during a time period. The controller determines an activity of the user during the time period based on the eye tracking feature values. The controller updates an activity history of the user with the determined activity.
A method comprising: analyzing eye tracking data to determine eye tracking feature values of an eye of a user of a headset during a time period, wherein the eye tracking data is determined from an eye tracking system on the headset; determining an activity of the user during the time period based on the determined eye tracking feature values; and updating an activity history of the user with the determined activity, wherein the feature values include movements of the eye, and determining the activity comprises identifying movements of the eye that correspond to the activity.
In some embodiments, a machine learned model of the activity module 310 is a recurrent neural network (e.g., using a long short-term memory neural network or gated recurrent units) that considers the time-based component of the eye tracking feature values.
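The recurrent model described in that embodiment can be sketched minimally. This is an illustrative toy, not the patent's actual activity module: a single GRU cell (a close relative of the LSTM the patent also mentions) runs over a sequence of per-interval eye-tracking feature vectors (e.g. gaze angle, saccade velocity, fixation duration - feature names are assumptions) and the final hidden state is mapped to activity-class probabilities. Weights here are random; a real system would train them on labelled eye-tracking data.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyGRUClassifier:
    """Minimal GRU-based sequence classifier (illustrative only):
    maps a sequence of eye-tracking feature vectors to an activity
    label such as 'reading' or 'conversation'."""

    def __init__(self, n_features, n_hidden, n_classes, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1  # small random init; untrained weights
        self.Wz = rng.normal(0, s, (n_hidden, n_features))  # update gate
        self.Uz = rng.normal(0, s, (n_hidden, n_hidden))
        self.Wr = rng.normal(0, s, (n_hidden, n_features))  # reset gate
        self.Ur = rng.normal(0, s, (n_hidden, n_hidden))
        self.Wh = rng.normal(0, s, (n_hidden, n_features))  # candidate state
        self.Uh = rng.normal(0, s, (n_hidden, n_hidden))
        self.Wo = rng.normal(0, s, (n_classes, n_hidden))   # output layer
        self.n_hidden = n_hidden

    def forward(self, sequence):
        """sequence: iterable of (n_features,) arrays, one per time step.
        Returns softmax class probabilities from the final hidden state."""
        h = np.zeros(self.n_hidden)
        for x in sequence:
            z = sigmoid(self.Wz @ x + self.Uz @ h)          # how much to update
            r = sigmoid(self.Wr @ x + self.Ur @ h)          # how much history to keep
            h_cand = np.tanh(self.Wh @ x + self.Uh @ (r * h))
            h = (1 - z) * h + z * h_cand                    # blended new state
        logits = self.Wo @ h
        p = np.exp(logits - logits.max())                   # stable softmax
        return p / p.sum()
```

The recurrence is what gives the model the "time-based component" the patent text refers to: each hidden state depends on the whole history of feature values, so patterns like the regular left-to-right saccades of reading can be distinguished from the smoother pursuit of watching a moving object.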