I know it's an iniVation camera, and from memory SynSense is involved with them.
Haven't dug too deep yet into the THOR project, some recent details of which are below.
I did find that what appears to be the same camera is available, with the details at the end of the post. Found it interesting that it's designed to connect directly to neuromorphic hardware such as FPGAs or Loihi.
Would love to see how we'd go connected to it, if not attempted already.
Edit: Forgot to add that, from memory, we already know TCS has also been working with Akida for weather / cloud detection via cubesats.
HERE
U23-15873, updated on 31 May 2023
EGU General Assembly 2023
© Author(s) 2023. This work is distributed under the Creative Commons Attribution 4.0 License.
THOR-DAVIS: A neuromorphic camera to observe thunderstorms from onboard the ISS.
Olivier Chanrion, Nicolas Pedersen, Andreas Stokholm, Benjamin Hauptmann, and Torsten Neubert
Technical University of Denmark, DTU Space, Denmark (olivier.chanrion@gmail.com)
The technical purpose of THOR-DAVIS is to test a new camera concept in space for observations of thunderclouds and their electrical activity at a temporal resolution of up to 10 µs. The scientific purpose is to conduct video camera observations of thunderclouds and their electrical activity.
The focus is on altitude-resolved measurements of activity at the top of the clouds and the stratosphere above.
The camera type is a so-called neuromorphic camera (or event camera), where pixels are read out asynchronously when the pixel illumination changes. The goal is to understand, under realistic conditions, the use of such a camera for future observations from space of processes in severe electrical storms. The camera has a high temporal resolution of 100,000 equivalent frames per second and a huge dynamic range of about 120 dB, and is particularly well suited for this kind of observation. The camera weighs about 200 g and consumes about 1.5 A in operation, making it particularly well suited for space applications.
In this presentation we will give the status of the development of the THOR-DAVIS experiment, to be conducted by the Danish astronaut Andreas Mogensen during his upcoming ESA mission Huginn onboard the International Space Station (ISS). We'll present the design of the payload, based on a DAVIS 346 neuromorphic camera mounted on top of a Nikon D5 camera for handheld operation. The two cameras are controlled by an AstroPi unit based on a Raspberry Pi computer board.
Finally, we’ll give preliminary results of laboratory measurements made with the flight model.
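For anyone curious what the asynchronous readout described in the abstract actually means in data terms: each pixel independently emits ON/OFF events whenever its log intensity crosses a contrast threshold, instead of being sampled on a fixed frame clock. Here's a minimal single-pixel sketch of that principle (my own illustration, not iniVation code; the threshold value is an assumption):

```python
import math

def events_from_intensity(samples, threshold=0.2):
    """Emit (t, polarity) events whenever log intensity moves by +/- threshold.

    Sketch of the event-camera pixel principle; real DVS pixels do this
    continuously in analog circuitry, not on discrete samples.
    """
    events = []
    ref = math.log(samples[0][1])       # reference log-intensity level
    for t, intensity in samples[1:]:
        level = math.log(intensity)
        while level - ref >= threshold:  # brightness rose past threshold
            ref += threshold
            events.append((t, 1))        # ON event
        while ref - level >= threshold:  # brightness fell past threshold
            ref -= threshold
            events.append((t, -1))       # OFF event
    return events

# A brief flash: intensity doubles, then returns to baseline.
samples = [(0, 1.0), (10, 2.0), (20, 1.0)]
print(events_from_intensity(samples))
```

Note there are no "frames" at all: a static scene produces zero output, which is why the abstract can claim both microsecond-scale resolution and low power.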
Our overseas product procurement service Unipos handles the dynamic vision sensor series DAVIS / DVXplorer C from iniVation, Switzerland.
www.tegakari.net
[New product] DAVIS346 AER
A model equipped with a connector for the AER (Address Event Representation) protocol in the popular DAVIS346.
Designed specifically for advanced neuromorphic hardware research, it allows direct interface to custom neuromorphic hardware such as FPGAs and Intel Loihi.
DAVIS346 AER Features
- 180nm CIS technology DAVIS sensor prototype
- Simultaneous output of QVGA or higher (346 x 260) resolution events and frames from a single sensor (via USB)
- Event-only output via AER connector
- Event output provides up to 120dB dynamic range, <1us latency, 1us time resolution, and up to 12 million events/second throughput
- 6-axis IMU, up to 8kHz sampling rate
- Less than 180mA power consumption with 5V supply
- Anodized aluminum case with CS lens mount, 4-sided mounting options, screw-lock USB port
About the AER protocol
The AER protocol uses a variable number of data lines (a bus) to transmit the event address, plus two lines (REQ and ACK) to synchronize sender and receiver asynchronously via a four-phase handshake. It is a simple asynchronous protocol.
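To make the four-phase handshake concrete, here's a rough software model of one event transfer. This is my own sketch of the sequence (drive address, raise REQ; receiver latches and raises ACK; sender drops REQ; receiver drops ACK), not vendor code; real AER is asynchronous hardware signalling and all names here are illustrative:

```python
class AERBus:
    """Models the shared wires: a parallel address bus plus REQ and ACK lines."""
    def __init__(self):
        self.address = None
        self.req = False
        self.ack = False

def send_event(bus, address, log):
    # Phase 1: sender drives the event address onto the bus and raises REQ.
    bus.address = address
    bus.req = True
    log.append(f"REQ up, address={address}")
    # Phase 2: receiver sees REQ, latches the address, and raises ACK.
    latched = bus.address
    bus.ack = True
    log.append("ACK up")
    # Phase 3: sender sees ACK, releases the bus, and drops REQ.
    bus.address = None
    bus.req = False
    log.append("REQ down")
    # Phase 4: receiver sees REQ low and drops ACK, completing the handshake.
    bus.ack = False
    log.append("ACK down")
    return latched

bus = AERBus()
log = []
received = send_event(bus, address=(120, 45), log=log)  # e.g. a pixel (x, y)
print(received, log)
```

Because each transfer is self-timed by REQ/ACK rather than clocked, events flow only when pixels actually fire, which is what lets the AER connector feed FPGAs or Loihi directly without a frame clock.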
DAVIS346 AER User Guide/Datasheet
You can download the user guide (PDF) for DAVIS346 AER from the manufacturer page below.