Interesting group and workshop.
Haven't checked if this has been posted already. Apologies if it has.
We're involved in the workshop as well, as are Prophesee, iniVation and some others.
Equipment is apparently being provided by Neurobus and WSU as organisers. They obviously have access to Akida... makes you wonder what else they've been doing with it outside of the workshop.
SPA24: Neuromorphic systems for space applications
Topic leaders
- Gregory Cohen | WSU
- Gregor Lenz | Neurobus
Co-organizers
- Alexandre Marcireau | WSU
- Giulia D’Angelo | fortiss
- Jens Egholm | KTH
Invited Speakers
- Prof Matthew McHarg - US Air Force Academy
- Dr Paul Kirkland - University of Strathclyde
- Dr Andrew Wabnitz - DSTG
Goals
The use of neuromorphic engineering for applications in space is one of the most promising avenues to successful commercialisation of this technology. Our topic area focuses on using neuromorphic technologies to acquire and process space data captured from the ground and from space, and on exploring and exploiting neuromorphic algorithms for space situational awareness and navigation. The project combines the academic expertise of Western Sydney University in Australia (Misha Mahowald Prize 2019) with the industry knowledge of Neurobus, a European company specialising in neuromorphic space applications. Neuromorphic computing is a particularly good fit for this domain thanks to its novel sensing capabilities, low energy consumption, potential for online adaptation, and algorithmic resilience. Successful demonstrations and projects will substantially boost the visibility of the neuromorphic community, as the domain is connected to prestigious projects around satellites, off-Earth rovers, and space stations. Our goal is to expose participants to a range of challenging real-world applications and give them the tools and knowledge to apply their techniques where neuromorphic solutions can shine.
Projects
- Algorithms for processing space-based data: The organisers will make available a real-world space dataset that was recorded from the ISS specifically for the purpose of this workshop. In addition, data can be recorded with WSU's remote-controlled observatory network. There are exciting discoveries to be made in this unexplored data, especially when combined with neuromorphic algorithmic approaches.
- Processing space-based data using neuromorphic computing hardware: Using real-world data from both space and terrestrial sensors, we will explore algorithms for star tracking, stabilisation, feature detection, and motion compensation on neuromorphic platforms such as Loihi, SpiNNaker, BrainChip, and Dynap. Given that the organisers are involved in multiple upcoming space missions, the outcomes of this project may have a significant impact on them (and could even be flown to space!).
- Orbital collision simulator: The growing number of satellites in Earth's orbit (around 25,000 large objects) makes it increasingly likely that objects will collide, despite our growing Space Situational Awareness capabilities for monitoring artificial satellites. The fast, chaotic cloud of debris created in such collisions can threaten more satellites and is very hard to track with conventional sensors. By smashing Lego "cubesats" in front of a neuromorphic camera, we can emulate satellite collisions and assess the sensor's ability to track the pieces and predict where they will land. We will bring Lego kits to the workshop. Participants will need to design a "collider", build cubesats from Lego, record data, measure the positions of fallen pieces, and write algorithms to process the data.
- High-speed closed-loop tracking with neuromorphic sensors: Our motorised telescope can move at up to 50° per second and can be controlled through a simple API. The low latency of event cameras makes it possible to dynamically control the motors using visual feedback to keep a moving object (bird, drone, plane, satellite...) in the centre of the field of view. Algorithms can be tested remotely with the Astrosite observatory (located in South Australia) or with the telescope that we will bring to the workshop.
- Navigation and landing: Prophesee's GenX320 can be attached to a portable ST board and powered by a small battery. To simulate the landing of a probe on an extraterrestrial body, we will attach the camera to an off-the-shelf drone to explore ventral landing, optical flow, and feature-tracking scenarios, as well as predicting the distance to the ground to avoid dangerous landmarks.
- High-speed object avoidance: The goal is to build an ultra-low-latency vision pipeline that avoids incoming objects in real time, simulating threats in the form of orbital debris. This adds a robotic element to the orbital collision simulator.
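For anyone wondering what "processing space-based data" from an event camera might look like in practice, here's a minimal sketch of a spatio-temporal noise filter, a common first step on event streams before star tracking or feature detection. The event format and all names here are my own assumptions for illustration, not workshop code:

```python
import numpy as np

def denoise_events(events, width=640, height=480, dt=10_000):
    """Spatio-temporal noise filter for an event stream.

    `events` is assumed to be an (N, 4) array of (x, y, t, p) rows with t in
    microseconds (an illustrative format, not any particular camera's API).
    An event is kept only if some pixel in its 3x3 neighbourhood (including
    the pixel itself) fired within the last `dt` microseconds; isolated
    sensor noise rarely satisfies this, while stars and moving objects do.
    """
    last_ts = np.full((height, width), -np.inf)  # last event time per pixel
    keep = np.zeros(len(events), dtype=bool)
    for i, (x, y, t, p) in enumerate(events):
        x, y = int(x), int(y)
        x0, x1 = max(x - 1, 0), min(x + 2, width)
        y0, y1 = max(y - 1, 0), min(y + 2, height)
        keep[i] = np.any(t - last_ts[y0:y1, x0:x1] <= dt)
        last_ts[y, x] = t
    return events[keep]
```

Real pipelines vectorise this, but the per-event loop makes the logic easy to follow.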
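The collision-simulator project asks participants to predict where fallen pieces land; one simple approach from a short camera track is a ballistic fit. A rough sketch, assuming a hypothetical (t, x, y) track format in seconds and metres:

```python
import numpy as np

def predict_landing(track, g=9.81, floor=0.0):
    """Predict the landing x-position of a ballistic fragment.

    `track` is assumed to be an (N, 3) array of (t, x, y) positions from a
    tracker (a hypothetical format). Horizontal motion is fitted as linear,
    vertical motion as free fall, and the landing point is extrapolated to
    the instant y reaches `floor`.
    """
    track = np.asarray(track, dtype=float)
    t, x, y = track[:, 0], track[:, 1], track[:, 2]
    vx, x0 = np.polyfit(t, x, 1)                    # x(t) = x0 + vx*t
    vy, y0 = np.polyfit(t, y + 0.5 * g * t**2, 1)   # y(t) = y0 + vy*t - g*t^2/2
    # positive root of 0.5*g*t^2 - vy*t - (y0 - floor) = 0
    t_land = (vy + np.sqrt(vy**2 + 2.0 * g * (y0 - floor))) / g
    return x0 + vx * t_land
```

Fitting y plus the known gravity term reduces the vertical fit to a line, which keeps the estimate stable even on very short tracks.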
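The closed-loop tracking project essentially amounts to feeding the pixel offset of a tracked object into a rate controller for the telescope motors. A hedged sketch of a PD controller; the gains, slew limit, and interface are illustrative guesses, not the actual telescope API:

```python
from dataclasses import dataclass

@dataclass
class PDTracker:
    """Proportional-derivative controller turning the pixel offset of a
    tracked object into pan/tilt rate commands (deg/s).

    All parameters are illustrative: kp/kd would need tuning against the
    real optics and mount, and max_rate reflects the stated 50 deg/s limit.
    """
    kp: float = 0.05
    kd: float = 0.01
    max_rate: float = 50.0          # telescope slew limit, deg/s
    _prev: tuple = (0.0, 0.0)       # previous (dx, dy) error, pixels

    def step(self, err_px, dt):
        # err_px: (dx, dy) offset of the object centroid from image centre
        rates = []
        for e, pe in zip(err_px, self._prev):
            r = self.kp * e + self.kd * (e - pe) / dt
            rates.append(max(-self.max_rate, min(self.max_rate, r)))
        self._prev = tuple(err_px)
        return tuple(rates)
```

With event cameras the error can be updated at kilohertz rates, which is exactly what makes this kind of tight visual servoing feasible.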
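For the ventral-landing scenario, "distance to the ground" typically shows up as time to contact, which can be read off the divergence of the optical-flow field: during a pure descent at speed v from height h, the flow is radial about the focus of expansion, flow = k * point with k = v/h, so tau = h/v = 1/k. A small sketch under those assumptions (interfaces are mine, not any particular SDK's):

```python
import numpy as np

def time_to_contact(points, flows):
    """Estimate time to contact (s) from a ventral optical-flow field.

    `points` are pixel coordinates relative to the focus of expansion and
    `flows` are the matching flow vectors (px/s), both assumed formats.
    For a pure descent, flow = k * point with k = v/h; a least-squares fit
    of k over all vectors gives tau = 1/k.
    """
    points = np.asarray(points, dtype=float)
    flows = np.asarray(flows, dtype=float)
    # closed-form least-squares solution for scalar k in flows ~= k * points
    k = np.sum(points * flows) / np.sum(points * points)
    return 1.0 / k
```

A real pipeline would also have to estimate the focus of expansion and reject outlier flow vectors, but the core relationship is this simple.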
Materials, Equipment, and Tutorials:
We will make available several pieces of equipment, including telescopes to record the night sky, various event cameras from Prophesee and iniVation, a Phantom high-frame-rate camera for comparison, and neuromorphic hardware such as BrainChip's Akida and the SynSense Speck. ICNS will also provide access to its Astrosite network of remote telescopes, as well as its new DeepSouth cluster.
We will run hands-on sessions on neuromorphic sensing and processing in space, building on successful tutorials from space conferences, with code and examples for the projects and training on neuromorphic hardware. Experts in space imaging, lightning, and high-speed phenomena detection will give talks focusing on neuromorphic hardware's potential to address current shortcomings. The workshop will feature unique data from the International Space Station, provided by WSU and USAFA in its first public release, allowing participants to develop new algorithms for space applications and explore how effectively neuromorphic hardware processes this data for future space missions. Various data-collection systems will also be available, including telescope observation equipment, long-range lenses, tripods, a Phantom high-speed camera, and WSU's Astrosite system for space observations. Neurobus will make neuromorphic hardware available on site. This setup facilitates experiments, targeted data collection for specific applications, experimenting with closed-loop neuromorphic systems, and, thanks to the time difference, online participation in the space observation topics.