Hi Dhm, @Steve10 — when Qualcomm revealed their Snapdragon with Prophesee at CES, the blur-free photography contribution was not from BrainChip, even though many of us thought it was. Therefore it must have been SynSense. Round one to SynSense, although there are many rounds to play out over the coming years.
SynSense may have beaten us to the punch, but we have the knockout punch.
SynSense boasts a response time of 5ms (200 fps).
Prophesee is capable of 10k fps equivalent.
NVISO achieved >1000 fps with Akida.
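To see how these headline figures line up, note that a latency converts directly to a frame-rate equivalent (fps ≈ 1/latency). A quick sketch of the arithmetic behind the numbers above (the vendor figures themselves are as quoted in the articles):

```python
# Rough latency <-> frame-rate-equivalent conversion for the figures above.
# Illustrative arithmetic only; the vendor numbers are as quoted.

def fps_equivalent(latency_s: float) -> float:
    """Frame-rate equivalent of a given end-to-end latency."""
    return 1.0 / latency_s

print(fps_equivalent(5e-3))   # SynSense: 5 ms response time -> 200 fps
print(fps_equivalent(1e-4))   # Prophesee: 0.1 ms -> 10k fps equivalent
print(fps_equivalent(1e-3))   # NVISO on Akida: <1 ms -> >1000 fps
```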
DYNAP-CNN — the World’s First 1M Neuron, Event-Driven Neuromorphic AI Processor for Vision Processing | SynSense

Today we are announcing our new fully-asynchronous event-driven neuromorphic AI processor for ultra-low power, always-on, real-time applications. DYNAP-CNN opens brand-new possibilities for dynamic vision processing, bringing event-based vision applications to power-constrained devices for the...

Computation in DYNAP-CNN is triggered directly by changes in the visual scene, without using a high-speed clock. Moving objects give rise to sequences of events, which are processed immediately by the processor. Since there is no notion of frames, DYNAP-CNN’s continuous computation enables ultra-low-latency of below 5ms. This represents at least a 10x improvement from the current deep learning solutions available in the market for real-time vision processing.
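The "triggered by changes, no notion of frames" idea can be illustrated with a toy model of an event sensor. This is a minimal sketch of how change-triggered events arise, not SynSense's or Prophesee's actual pipeline; the threshold and log-intensity model are my assumptions:

```python
import numpy as np

def events_from_frames(prev, curr, threshold=0.2):
    """Toy event-camera model: emit an (x, y, polarity) event wherever
    the log-intensity change between two snapshots exceeds a threshold.
    Static pixels produce no events at all, so a downstream event-driven
    processor only computes when something in the scene actually moves."""
    diff = np.log1p(curr.astype(float)) - np.log1p(prev.astype(float))
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return [(x, y, 1 if diff[y, x] > 0 else -1) for x, y in zip(xs, ys)]

# A static scene generates zero events...
scene = np.full((4, 4), 100, dtype=np.uint8)
print(len(events_from_frames(scene, scene)))   # 0
# ...while a single brightening pixel generates exactly one.
moved = scene.copy()
moved[2, 3] = 200
print(len(events_from_frames(scene, moved)))   # 1
```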
SynSense and Prophesee develop one-chip event-based smart sensing solution
Partnership leverages respective leadership in neuromorphic computing and sensing to realize on-sensor processing integration leading to small form-factor, low-power and cost-effective IoT solutions
Oct 15, 2021 – SynSense and Prophesee, two leading neuromorphic technology companies, today announced a partnership that will see the two companies leverage their respective expertise in sensing and processing to develop ultra-low-power solutions for implementing intelligence on the edge for event-based vision applications.
… The sensors facilitate machine vision by recording changes in the scene rather than recording the entire scene at regular intervals. Specific advantages over frame-based approaches include better low light response (<1lux) and dynamic range (>120dB), reduced data generation (10x-1000x less than conventional approaches) leading to lower transfer/processing requirements, and higher temporal resolution (microsecond time resolution, i.e. >10k images per second time resolution equivalent).
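The 10x–1000x data-reduction claim is easy to sanity-check with back-of-the-envelope numbers. The sensor resolution, pixel activity rate, and 8-byte event size below are my illustrative assumptions, not figures from the release:

```python
# Back-of-the-envelope data-rate comparison: frame-based vs event-based.
# Assumptions (mine, not Prophesee's spec): VGA sensor, 8-bit mono pixels,
# 30 fps, ~1% of pixels changing per frame interval, 8 bytes per event.

WIDTH, HEIGHT, FPS = 640, 480, 30
frame_bytes_per_s = WIDTH * HEIGHT * FPS              # full frames every tick

events_per_s = int(WIDTH * HEIGHT * 0.01 * FPS)       # only changed pixels
event_bytes_per_s = events_per_s * 8                  # (x, y, time, polarity)

print(frame_bytes_per_s // event_bytes_per_s)         # ~12x at 1% activity
```

At 1% scene activity the reduction lands at the low end of the quoted 10x–1000x range; a mostly static scene (far fewer active pixels) pushes it toward the high end.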
BrainChip Partners with Prophesee Optimizing Computer Vision AI Performance and Efficiency - BrainChip

BrainChip and Prophesee partner to optimize computer vision AI, delivering enhanced performance and efficiency.

Laguna Hills, Calif. – June 19, 2022 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of neuromorphic AI IP, and Prophesee, the inventor of the world’s most advanced neuromorphic vision systems, today announced a technology partnership that delivers next-generation platforms for OEMs looking to integrate event-based vision systems with high levels of AI performance coupled with ultra-low power technologies.
… “By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee.
NVISO announces it has reached a key interoperability milestone with BrainChip Akida Neuromorphic IP | NVISO
NVISO announces it has reached a key interoperability milestone with BrainChip Akida Neuromorphic IP achieving ultra-low latency (<1ms) processing at the edge
18th July, 2022
NVISO has successfully completed full interoperability of four of its AI Apps from its Human Behavioural AI catalogue to the BrainChip Akida neuromorphic processor achieving a blazingly fast average model throughput of more than 1000 FPS and average model storage of less than 140 KB.
The relevance of NVISO is that it uses a frame-based camera, yet achieves a latency of less than 1 ms:
NVISO’s latest Neuro SDK to be demonstrated running on the BrainChip Akida fully digital neuromorphic processor platform at CES 2023 | NVISO
https://www.nviso.ai/en/news/nviso-...l-neuromorphic-processor-platform-at-ces-2023
2nd January, 2023
Benchmark performance data for the AI Apps running on the BrainChip neuromorphic AI processor