Prophesee releases industrial-grade neuromorphic sensor
Prophesee has released what it says is the first industrial event-based vision sensor in a commercially viable, industry standard package.
Event-based vision technology takes a different imaging approach from that of traditional frame-based sensors. Each pixel in Prophesee’s Metavision sensor activates only when it detects a change in the scene – an event – which means lower power consumption, latency and data processing requirements than frame-based systems.
The third-generation VGA sensor is aimed at developers of cameras for industrial automation and IoT systems such as robots, inspection equipment, monitoring, and surveillance devices.
Prophesee has been working with camera maker Imago Technologies, which has integrated Prophesee’s sensor inside its VisionCam smart camera. This Metavision sensor release opens up event-based imaging to many more camera makers and the wider industrial imaging market.
‘A lot has been said about the performance [of event-based sensors]; a lot of claims have been made, but most of this comes from academia,’ Luca Verre, co-founder and CEO of Prophesee, told Imaging and Machine Vision Europe. ‘When we started the company we worked to prove the technology, and it took effort to industrialise it. Now it’s ready to use.’
Prophesee was formed five years ago, originally under the name Chronocam. It has raised $68m in investment to date and now has more than 100 employees, most of them based in Paris, France, with further offices in Grenoble, Silicon Valley, Shanghai and Tokyo. Its neuromorphic vision technology, which mimics the function of biological sight, can run at an equivalent of 10,000 images per second, with high dynamic range and excellent power efficiency.
Each pixel is independent and asynchronous, adjusting its own sampling according to the dynamics and lighting of the scene. If part of the scene is static – like the floor in an industrial setting – no information is recorded. Slow changes in the scene are sampled slowly; if something happens quickly, the pixel reacts quickly.
Because the sensor is only recording dynamic events, the data volume is relatively low. ‘The amount of data we produce is orders of magnitude lower than what you would get with a frame camera,’ Verre said.
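As a rough, purely frame-based analogy of that principle (not Prophesee’s pixel circuit, which works asynchronously and in continuous time; the array size, contrast threshold and test scene below are arbitrary), the following sketch reports a pixel only when its value changes by more than a threshold, so a mostly static scene produces almost no data:

```python
import numpy as np

def events_from_frames(prev, curr, threshold=15):
    """Emulate event output by thresholding per-pixel change.

    Returns an (N, 3) array of (x, y, polarity) for pixels whose
    intensity changed by more than `threshold` between two frames.
    A real event sensor does this per pixel, asynchronously and in
    continuous time; this is only a frame-based analogy.
    """
    diff = curr.astype(np.int16) - prev.astype(np.int16)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    polarity = (diff[ys, xs] > 0).astype(np.int8)   # 1 = brighter, 0 = darker
    return np.column_stack([xs, ys, polarity])

# A mostly static 640 x 480 scene with one small moving object
rng = np.random.default_rng(0)
prev = rng.integers(100, 110, size=(480, 640)).astype(np.uint8)
curr = prev.copy()
curr[200:210, 300:310] += 80                        # the only change

events = events_from_frames(prev, curr)
print(f"{len(events)} events vs {prev.size} pixels per full frame")
# -> 100 events vs 307200 pixels: only the change is reported
```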
Verre said the sensor’s efficiency means high-speed counting, at an equivalent of thousands of frames per second, can run on a mobile system-on-chip such as a Snapdragon 845. Imago’s VisionCam carries out image processing internally on a computer board.
Prophesee is working on a project counting droplets being deposited very quickly on a target object. ‘We can do this at high precision and with low compute,’ Verre said.
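One generic way such counting could be done on an event stream – a sketch only, not Prophesee’s turnkey counting module, and the event field names (‘x’, ‘y’, ‘t’ in microseconds) are assumptions – is to slice the stream into short time windows, render each slice as a sparse binary image and count the connected blobs:

```python
import numpy as np
from scipy import ndimage

def count_blobs_per_slice(events, width=640, height=480,
                          slice_us=1_000, min_events=5):
    """Count distinct event blobs (e.g. droplets) in each time slice.

    `events` is assumed to be a structured array with fields
    'x', 'y' and 't' (microsecond timestamps), sorted by 't'.
    Counting in 1ms slices corresponds to an equivalent rate of
    1,000 frames per second while only touching active pixels.
    """
    counts = []
    t_start, t_end = int(events['t'][0]), int(events['t'][-1])
    for start in range(t_start, t_end + 1, slice_us):
        in_slice = (events['t'] >= start) & (events['t'] < start + slice_us)
        sl = events[in_slice]
        if sl.size < min_events:          # too few events to form a blob
            counts.append(0)
            continue
        img = np.zeros((height, width), dtype=bool)
        img[sl['y'], sl['x']] = True      # render the slice as a sparse image
        _, n_blobs = ndimage.label(img)   # count connected components
        counts.append(n_blobs)
    return counts
```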
Monitoring vibration of machines for predictive maintenance is another application Prophesee is involved in. The sensor can measure vibration frequency and amplitude in real time to give an indication of when a machine might be about to fail.
A more traditional approach would be to use an accelerometer, but an image sensor is non-intrusive and can monitor inaccessible parts of the machine or areas that get very hot.
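A plausible sketch of how a vibration frequency could be recovered from events – assuming the input is simply a list of microsecond timestamps from a region of interest covering the vibrating part; Prophesee’s own vibration module may work quite differently – is to bin the timestamps into a rate signal and take the strongest peak of its spectrum:

```python
import numpy as np

def dominant_vibration_frequency(timestamps_us, bin_us=100):
    """Estimate the dominant vibration frequency from event timestamps.

    A vibrating edge sweeps back and forth across pixels, so the event
    rate in a region of interest oscillates with the vibration. Binning
    the timestamps into a rate signal and taking the strongest FFT peak
    gives an estimate of the frequency.
    """
    t = np.asarray(timestamps_us, dtype=np.int64)
    t = t - t.min()
    n_bins = int(t.max() // bin_us) + 1
    rate, _ = np.histogram(t, bins=n_bins)             # events per 0.1ms bin
    rate = rate - rate.mean()                          # remove the DC term
    spectrum = np.abs(np.fft.rfft(rate))
    freqs = np.fft.rfftfreq(n_bins, d=bin_us * 1e-6)
    return freqs[np.argmax(spectrum)]

# Synthetic check: an event rate modulated at 120Hz over one second
rng = np.random.default_rng(1)
t = np.sort(rng.uniform(0, 1e6, 200_000))              # microseconds
keep = rng.uniform(size=t.size) < 0.5 * (1 + np.sin(2 * np.pi * 120 * t * 1e-6))
print(dominant_vibration_frequency(t[keep]))           # ~120Hz
```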
Prophesee has also deployed its sensor with a large machine tool manufacturer to monitor laser welding. The exposure of each pixel in the Metavision sensor can be adjusted independently, giving a wide dynamic range that makes it useful for imaging a bright weld spot.
The customer is looking to track the laser spot as it moves along the seam, as well as to monitor the spatter, or debris, produced by the weld – the size of the debris particles can give an indication of the quality of the weld. The aim is for the system to provide a closed feedback loop, so that the laser parameters can be controlled in real time for a higher-quality weld.
Prophesee is also exploring ways of inspecting mobile phone screens for surface damage by vibrating the phone and measuring how light is scattered from the surface – light reflects differently depending on whether the screen is scratched or completely flat and undamaged.
It has also shown that the sensor can be used for volume measurements by combining it with structured light.
The sensor is supported by a software development kit (SDK), a full set of drivers, the Prophesee Player tool for recording sequences and visualising data, and an online knowledge centre containing useful resources for developers.
The chip is available in a 13 x 15mm mini PBGA package. The sensor has a 640 x 480-pixel resolution with 15μm pixels in a 3/4-inch optical format. It is manufactured in a 0.18μm specialised CIS process.
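Those specifications are self-consistent: 640 × 15μm by 480 × 15μm gives a 9.6 × 7.2mm active area, whose 12mm diagonal corresponds to the 3/4-inch optical format.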
Verre said that the company’s technology follows the same rules as conventional CMOS sensors in terms of geometry and optics; the difference comes at the level of processing, he added. The firm’s SDK includes turnkey solutions for counting, tracking, optical flow, 3D measurement and vibration measurement, but the company is also working with and training distributors and system integrators.
Engineering samples of the fourth generation of the sensor are already available.