Forbes article from 4 Oct 2022. It mentions a range of companies; below is the section write-up about Prophesee. No mention of us, but it's about the event camera and neuromorphic space, so it's incredibly relevant because we all know we complete Prophesee.
Vision is a powerful human sensory input. It enables complex tasks and processes that we take for granted. With the increase in AoT™ (Autonomy of Things) across diverse applications, the role of cameras, computing and machine learning in providing human-like vision and cognition is becoming significant.
Prophesee (“predicting and seeing where the action is”), based in France, uses its event-based cameras for AVs, Advanced Driver Assistance Systems (ADAS), industrial automation, consumer applications and healthcare. Founded in 2014, the company recently closed its Series C funding round of $50M, bringing the total raised to date to $127M. Xiaomi, a leading manufacturer of mobile phones, is one of the investors.

Prophesee’s goal is to emulate human vision, in which the receptors in the retina react to dynamic information and the brain focuses on processing changes in the scene (especially important for driving). The basic idea is to use camera and pixel architectures that detect changes in light intensity above a threshold (an event) and pass only this data to the compute stack for further processing. The pixels work asynchronously (not framed as in regular CMOS cameras) and at much higher speeds, since they do not have to integrate photons and wait for the entire frame to finish before the data is read out, as a conventional frame-based camera does. The advantages are significant: lower data bandwidth, decision latency, storage and power consumption. The company’s first commercial-grade VGA event-based vision sensor featured high dynamic range (>120 dB) and low power consumption (26 mW at the sensor level, or 3 nW per event). An HD (High Definition) version, jointly developed with Sony and with an industry-leading pixel size (<5 μm), has also been launched.
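To make the event-generation principle concrete, here is a minimal Python/NumPy sketch that emulates it by thresholding per-pixel log-intensity changes in ordinary video frames. It is an illustration of the concept only: the threshold value, function names and synthetic data are assumptions, and a real event sensor performs this detection asynchronously in circuitry inside each pixel rather than from captured frames.

```python
# Illustrative sketch only: emulates the event-generation principle described
# above (per-pixel change detection against a log-intensity threshold) using
# ordinary video frames. A real event sensor does this asynchronously in the
# pixel itself; the threshold value and names here are assumptions.
import numpy as np

CONTRAST_THRESHOLD = 0.15  # assumed log-intensity change needed to fire an event


def frames_to_events(frames, timestamps, threshold=CONTRAST_THRESHOLD):
    """Convert a stack of grayscale frames into (t, x, y, polarity) events.

    frames:     array of shape (N, H, W), pixel values in [0, 255]
    timestamps: array of shape (N,), frame times in seconds
    """
    log_frames = np.log(frames.astype(np.float64) + 1.0)  # +1 avoids log(0)
    reference = log_frames[0].copy()  # last log intensity that triggered an event
    events = []

    for k in range(1, len(log_frames)):
        diff = log_frames[k] - reference
        # Pixels whose log intensity rose/fell by more than the threshold fire
        # positive/negative events; only these pixels produce output data.
        for polarity, mask in ((+1, diff >= threshold), (-1, diff <= -threshold)):
            ys, xs = np.nonzero(mask)
            events.extend((timestamps[k], x, y, polarity) for x, y in zip(xs, ys))
            reference[mask] = log_frames[k][mask]  # reset reference at fired pixels

    return events


if __name__ == "__main__":
    # Tiny synthetic example: a bright spot moving across an otherwise static scene.
    H, W, N = 32, 32, 10
    frames = np.full((N, H, W), 50, dtype=np.uint8)
    for k in range(N):
        frames[k, 16, 3 * k] = 200  # moving bright pixel
    ts = np.arange(N) / 1000.0  # 1 kHz frame rate
    evs = frames_to_events(frames, ts)
    print(f"{len(evs)} events from {N} frames of {H * W} pixels each")
```

The toy example shows why the data volume drops so sharply: static pixels never fire, so only the handful of pixels touched by the moving spot ever reach the compute stack.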
Figure 3: High-definition-format event-based imaging sensor with 5 μm pixel pitch, jointly developed ... (Image: PROPHESEE)
These sensors form the core of the Metavision® sensing platform, which uses AI to provide smart and efficient perception for autonomy applications and is under evaluation by multiple companies in the transportation space. Apart from forward-facing perception for AVs and ADAS, Prophesee is actively engaged with customers on in-cabin driver monitoring for L2 and L3 applications (see Figure 4).
Figure 4: XPERI in-cabin driver monitoring based on human-inspired neuromorphic vision (Image: PROPHESEE)
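Metavision is Prophesee's own platform and the article does not describe its internals, but a common, generic way to feed an asynchronous event stream into conventional AI perception models is to accumulate the events into fixed-interval two-channel histograms ("event frames"). The sketch below shows that generic step only, reusing the (t, x, y, polarity) tuples from the earlier sketch; the window length and names are assumptions, not Prophesee's pipeline.

```python
# Illustrative sketch only: accumulates asynchronous (t, x, y, polarity) events
# into fixed-interval 2-channel histograms ("event frames") that a conventional
# CNN-based perception model could consume. This is a generic technique, not a
# description of Prophesee's Metavision pipeline.
import numpy as np


def events_to_histograms(events, height, width, window_s=0.010):
    """Bin time-ordered events into per-window arrays of shape (2, H, W).

    Channel 0 counts positive-polarity events, channel 1 negative-polarity ones.
    """
    if not events:
        return []
    histograms = []
    current = np.zeros((2, height, width), dtype=np.uint16)
    window_end = events[0][0] + window_s

    for t, x, y, polarity in events:
        while t >= window_end:  # close finished windows, including empty ones
            histograms.append(current)
            current = np.zeros((2, height, width), dtype=np.uint16)
            window_end += window_s
        current[0 if polarity > 0 else 1, y, x] += 1

    histograms.append(current)
    return histograms
```

Each histogram is a dense tensor at a fixed rate, so standard detection or driver-monitoring networks can run on it while the upstream sensor remains sparse and asynchronous.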
Automotive opportunities are lucrative, but the design-in cycles are long. Over the past two years, Prophesee has seen significant interest and traction in the machine vision space for industrial applications. These include high-speed counting, surface inspection and vibration monitoring.
Figure 5: High-speed counting using event-based cameras (Image: PROPHESEE)
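As an illustration of why event cameras suit vibration monitoring: each event carries a precise timestamp, so a dominant vibration frequency can be estimated directly from the temporal statistics of the event stream. The toy sketch below does this for an aggregate event-rate signal; it is a generic illustration under stated assumptions (bin size, single region of interest, names), not Prophesee's product pipeline.

```python
# Illustrative sketch only: estimates a dominant vibration frequency from the
# aggregate event rate of a region of interest, by binning event timestamps and
# taking an FFT. Real industrial pipelines (e.g. per-pixel frequency maps) are
# more involved; the bin size and the synthetic data are assumptions.
import numpy as np


def dominant_frequency(event_times_s, bin_s=0.0005):
    """Return the strongest non-DC frequency (Hz) in the event-rate signal."""
    event_times_s = np.asarray(event_times_s, dtype=np.float64)
    duration = event_times_s.max() - event_times_s.min()
    n_bins = max(int(np.ceil(duration / bin_s)), 2)
    rate, _ = np.histogram(event_times_s, bins=n_bins)  # events per time bin

    spectrum = np.abs(np.fft.rfft(rate - rate.mean()))
    freqs = np.fft.rfftfreq(n_bins, d=bin_s)
    return freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC component


if __name__ == "__main__":
    # Synthetic check: an event stream whose rate is modulated at 120 Hz.
    rng = np.random.default_rng(0)
    t = np.sort(rng.uniform(0.0, 1.0, 200_000))
    keep = rng.uniform(size=t.size) < 0.5 * (1 + np.sin(2 * np.pi * 120 * t))
    print(f"estimated vibration frequency: {dominant_frequency(t[keep]):.1f} Hz")
```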
Prophesee recently announced collaborations with leading developers of machine vision systems to exploit opportunities in industrial automation, robotics, automotive and IoT (Internet of Things). Other immediate opportunities are image-blur correction for mobile phones and AR/VR applications. These use smaller-format sensors than the longer-term ADAS/AV opportunities, consume even less power, and operate with significantly lower latency.