https://www.navalnews.com/naval-new...cy-new-concept-to-counter-hypersonic-threats/
U.S. Missile Defense Agency New Concept to Counter Hypersonic Threats
The U.S. Missile Defense Agency is developing a concept to protect the US, its deployed forces, and allies against regional hypersonic threats using a multi-layered solution to defend against the next generation of hypersonic glide vehicles.
“The department is engaging and working with our allies and partners to enhance our collective missile defense efforts.” - Leonor Tomero – Deputy Assistant Secretary of Defense for Nuclear and Missile Defense Policy in the Office of the Secretary of Defense.
She made the remark while mentioning Japan, South Korea, Australia and NATO allies, along with Israel and Gulf Cooperation Council nations.
https://patentscope.wipo.int/search/en/detail.jsf?docId=US321528812
US20210105421 - NEUROMORPHIC VISION WITH FRAME-RATE IMAGING FOR TARGET DETECTION AND TRACKING
Abstract
An imaging system and a method of imaging are provided. The imaging system includes a single optics module configured for focusing light reflected or emanated from a dynamic scene in the infrared spectrum and a synchronous focal plane array for receiving the focused light and acquiring infrared images having a high spatial resolution and a low temporal resolution from the received focused light. The imaging system further includes an asynchronous neuromorphic vision system configured for receiving the focused light and acquiring neuromorphic event data having a high temporal resolution, and a read-out integrated circuit (ROIC) configured to read out both the infrared images and the event data.
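As a rough illustration of the dual-path architecture the abstract describes, the sketch below combines a low-rate synchronous frame with high-rate asynchronous events that share the same optics. The data structures and the fusion step are invented for illustration; this is not the patent's claimed implementation.

```python
# Hypothetical sketch (not the patent's actual method): fusing low-rate infrared
# frames from a synchronous FPA with high-rate events from a neuromorphic sensor.
import numpy as np

def accumulate_events(events, shape):
    """Sum event polarities into a 2-D map; events are (x, y, t_us, polarity)."""
    event_map = np.zeros(shape, dtype=np.int32)
    for x, y, _t, p in events:
        event_map[y, x] += p
    return event_map

def fuse(frame, events, gain=0.1):
    """Add motion cues from fast events onto a slow frame (illustrative only)."""
    motion = accumulate_events(events, frame.shape).astype(frame.dtype)
    return frame + gain * motion

frame = np.zeros((480, 640), dtype=np.float32)     # synchronous infrared image
events = [(100, 200, 15, +1), (101, 200, 42, -1)]  # asynchronous event data
fused = fuse(frame, events)
```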
https://nationalinterest.org/blog/b...estroy-unstoppable-hypersonic-missiles-192388
There May Be a Way to Destroy "Unstoppable" Hypersonic Missiles
There may actually be a way to destroy "unstoppable hypersonic" missiles. As a matter of fact, there may be several emerging ways to track and destroy hypersonic weapons. But are they realistic, knowing the speed, maneuverability, and potential destruction associated with hypersonics?
Responding to the seriousness of the existing Russian and Chinese hypersonic threat, the U.S. Missile Defense Agency has presented a challenge to the industry to develop a multi-layered defensive concept.
https://www.saffm.hq.af.mil/Portals...TE_/FY22 DAF J-Book - 3620 - SF RDT and E.pdf
Department of Defense
Fiscal Year (FY) 2022 Budget Estimates
624846: Spacecraft Payload Technologies
Title: Space-Based Detector Technologies
Description: Develop advanced infrared device technologies that enable hardened space detector arrays with improved detection to perform acquisition, tracking, and discrimination of space objects and missile warning.
FY 2021 Plans:
Begin design, development, and assessment of low-cost, high-volume infrared detectors and focal plane arrays for proliferated space architecture layers. Begin development of focal plane array optical data outputs for higher speed and data throughput and begin radiation tolerance characterization of photonic devices. Begin development of alternative infrared focal plane array materials and device architectures. Continue development of resilient scanning and staring digital focal plane arrays. Complete development of 8192 x 8192 pixels, 10 micron pixel pitch focal plane arrays hardened to the natural space environment and focused photons to enable whole-earth staring for Launch Detection and Missile Warning missions.
FY 2022 Plans:
Continue design, development, and assessment of low-cost, high-volume infrared detectors and focal plane arrays for proliferated space architecture layers. Continue development of focal plane array optical data outputs for higher speed and data throughput and continue radiation tolerance characterization of photonic devices. Continue development of alternative infrared focal plane array materials and device architectures. Complete development of resilient scanning and staring digital focal plane arrays. Initiate development and assessment of event based sensing concepts and hardware. Initiate development of high dynamic range, laser hardened 8192 x 8192 pixels, 10 micron pixel pitch focal plane arrays.
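For a sense of scale, the back-of-envelope calculation below uses the 8192 x 8192 format and 10 micron pixel pitch quoted in the plans; the bit depth and frame rate are assumptions added only to show the raw data volume such arrays imply, and hence why higher-speed optical data outputs and event-based sensing are of interest.

```python
# Scale of the focal plane arrays described above. The 8192 x 8192 format and
# 10 micron pitch come from the budget text; bit depth and frame rate are assumed.
pixels_per_side = 8192
pitch_um = 10
bit_depth = 14          # assumed
frame_rate_hz = 30      # assumed

total_pixels = pixels_per_side ** 2                              # ~67.1 Mpixel
array_side_mm = pixels_per_side * pitch_um / 1000                # ~81.9 mm active area
raw_gbit_per_s = total_pixels * bit_depth * frame_rate_hz / 1e9  # ~28 Gbit/s raw

print(f"{total_pixels/1e6:.1f} Mpixel, {array_side_mm:.1f} mm side, "
      f"{raw_gbit_per_s:.1f} Gbit/s raw")
```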
Title: Space Electronics Research
Description: Develop technologies for space-based payload components such as radiation-hardened electronic devices, microelectro-mechanical system devices, and advanced electronics packaging.
FY 2021 Plans:
Continue leadership role in Deputy Assistant Secretary of Defense Systems Engineering trusted and assured microelectronics strategy efforts through the development of trusted manufacturing techniques that reduce risk to National Security Space systems. Improve benchmarking capabilities on state-of-the-art electronics using the latest spacecraft algorithms and transition results to the acquisition community to enable data-informed payload architecture design decisions. Initiate complete space qualification planning for the next generation space processor and begin implementing the plan. Continue development of alternative memory approaches for the high-density memory needed for next-generation space systems. Continue research and development of ultra-low power and neuromorphic/cortical processing architectures to enable game-changing capabilities in future National Security Space systems. Continue advanced transistor research and development, and transition techniques to mainstream manufacturing.
FY 2022 Plans:
Continue leadership role in Deputy Assistant Secretary of Defense Systems Engineering trusted and assured microelectronics strategy efforts to develop trusted manufacturing techniques that reduce risk to National Security Space systems. Continue adapting bench-marking capabilities on new electronics using the latest spacecraft algorithms and transitioning bench-marking capabilities and results to the acquisition community to enable data-informed payload architecture design decisions. Complete space qualification planning for next generation space processor. Complete development of alternative memory approaches. Continue research and development of ultra-low power and neuromorphic/cortical processing architectures and advanced transistor research to enable game-changing capabilities in future National Security Space systems. Initiate small satellite, high-performance processing to enable on-orbit autonomy, data fusion, and machine learning.
https://www.sbir.gov/node/1713521
Autonomous Missile Detection using Bio-Inspired Sensors
OBJECTIVE:
Develop innovative designs for a bio-inspired sensor that is optimized for autonomously detecting, identifying, tracking, and reporting dim missile threats in cluttered and noisy scenes.
DESCRIPTION:
This topic seeks innovative solutions for autonomously (i.e. without a cue from another sensor) detecting dim missile threats in cluttered and noisy scenes using passive sensors. An example application could be detection of a distant (e.g. 100 kilometers away) re-entering missile using a ground-based infrared search and track sensor. In addition to the background and sensor noise, the scene might be cluttered by moving sources to include (but not limited to) clouds, dust, precipitation, weapon effects, the sun, the moon, stars, meteors, satellite flares, auroras, birds, insects, and aircraft. Such a scene could be challenging for conventional detection approaches, and would require increased size, weight, and power (SWaP) in order to reject noise and clutter while increasing target sensitivity.
Biological vision systems are SWaP-efficient and well adapted for ignoring clutter and noise, detecting motion, and compressing visual information. A sensor that artificially emulates all or part of a biological vision system might outperform conventional sensors for detecting, identifying, tracking, and reporting dim missile threats in cluttered and noisy scenes.
This topic seeks innovative sensor designs that artificially mimic biological vision systems wherever feasible and are capable of overcoming the challenges described above. Offerors should propose complete designs, to include everything from the optics taking in the scene to the final processor outputting target reports. These designs should incorporate technologies that are projected to mature (preferably driven by commercial investments) within the next 10 years, and that would be available (as early prototypes) for experiments during Phase II.
https://ui.adsabs.harvard.edu/abs/2020AGUFMAE0120002M/abstract
Falcon Neuro—Neuromorphic Cameras for sprite and lightning detection on the International Space Station
Falcon Neuro is the latest payload to be built and flown on the International Space Station (ISS) by cadets and faculty at the Air Force Academy. The objective of the mission is to use neuromorphic cameras to detect lightning and sprites from low earth orbit. Neuromorphic cameras are a class of non-traditional imaging devices inspired by biological retinas. These devices operate in a fundamentally different imaging paradigm from conventional cameras, producing a spatio-temporal output instead of conventional frames. These cameras, also known as silicon retinas, contain specialized in-pixel circuitry to detect and generate data in response to changes in log-illumination at each pixel. Each pixel therefore operates independently and asynchronously, generating events in response to changes around its own individual setpoint. This results in a dramatic reduction in the power consumption and data output of the sensor. The asynchronous nature of these devices also allows for very high temporal resolution imaging, with events generated on the camera at microsecond resolution. These devices therefore offer the potential to perform low-power, continuous, high-speed imaging without the high data rates associated with high-speed cameras. Falcon Neuro consists of two neuromorphic cameras, one directed nadir and one directed along the ram direction. Both cameras work in the visible spectrum and have a 10 degree field of view. The nadir camera will be used to detect lightning, while the ram camera is designed to detect sprites. Falcon Neuro is manifested to fly to the ISS in February of 2022. Details of the design and the expected temporal and spatial performance will be discussed.
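A minimal sketch of the per-pixel behaviour described above: each pixel tracks its own log-illuminance setpoint and emits a timestamped event when the change exceeds a contrast threshold. The threshold and sample values here are illustrative, not Falcon Neuro parameters.

```python
# Illustrative model of a single event-camera pixel: asynchronous, log-domain,
# setpoint-based event generation with microsecond timestamps.
import math

def pixel_events(samples, threshold=0.15):
    """samples: list of (time_us, illuminance); returns (time_us, polarity) events."""
    events = []
    setpoint = math.log(samples[0][1])
    for t_us, lux in samples[1:]:
        delta = math.log(lux) - setpoint
        if abs(delta) >= threshold:
            events.append((t_us, 1 if delta > 0 else -1))
            setpoint = math.log(lux)      # reset around the new illuminance level
    return events

print(pixel_events([(0, 100.0), (50, 100.5), (120, 140.0), (300, 90.0)]))
# -> [(120, 1), (300, -1)]: only significant changes produce events
```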
https://www.australiandefence.com.a...tech-to-deploy-on-international-space-station
WSU and RAAF tech to deploy on International Space Station
Conventional camera technology was not designed to capture very bright, rare and fast moving events. But WSU’s neuromorphic cameras operate more like a photoreceptor in the eye than a conventional camera. All the pixels in these cameras work like single individual cameras that don’t interact or depend on one another.
To capture an event that is moving incredibly fast (for example a sprite or a hypersonic missile), a conventional camera would need to capture many large pictures extremely frequently, resulting in terabytes of data that then need analysing.
“Developing, deploying and operating world-first sensors on the International Space Station in a collaborative project with the US Air Force Academy is an outstanding accomplishment,” Prof Durrant-Whyte said. “This is another example of innovative technology being supported and developed in NSW and rapidly deployed on the world stage.”
Looking ahead, A/Prof Cohen sees no technical reason why Falcon Neuro 2.0 could not also observe the Earth through clouds to track ships, aircraft, missile launches and explosions, and possibly submarines.
https://player.whooshkaa.com/episode?id=759413
Neuromorphic sensors, Plan Jericho and the linkage between Western Sydney University and the RAAF
We start the episode by discussing neuromorphic imaging and its applications in the field of astronomy, the concept demonstrator that's been developed within a shipping container and the project's involvement with Plan Jericho, including how Greg's team made the connection with the RAAF. Greg walks us through how the initial meeting with Plan Jericho led to the concept demonstration program and how that, in turn, developed the concept into the field of deployable Space Situational Awareness.
Following a discussion of other opportunities for neuromorphic vision systems, Lyle explains the purpose behind the RAAF's Plan Jericho and provides an example of one of the many projects they're currently progressing. He also notes Plan Jericho's use of rapid prototyping, its links into academia and the EDGY Air Force initiative that helps to identify opportunities for new solutions to existing issues.
https://news.defence.gov.au/tags/plan-jericho
https://news.defence.gov.au/technology/capturing-data-faster-speeding-bullet
Capturing data faster than a speeding bullet
The prototype MANTIS (Mutual-Axis Neuromorphic Twin Imaging System) sensor is the result of the work of the University of Sydney Nano Institute and Air Force's Jericho Disruptive Innovation. Despite being developed through an Air Force partnership, MANTIS will be tested by all three services to explore how additional sensor diversity can provide Defence with an edge. Future iterations of MANTIS could also see it combined with a robotic eye to allow for passive surveillance of large portions of airspace, looking for air vehicles moving through it.
https://www.lockheedmartin.com/en-us/news/features/2017/plan-jericho-giving-australia-the-edge.html
Working seamlessly with our US labs, including Skunk Works, the Advanced Technology Centre, and The Lighthouse, STELaRLab is a conduit for advancing our home-grown Australian technologies, and will draw upon the backbone of our world-leading R&D capabilities. STELaRLab researchers will explore several fields, including: Hypersonics research; Quantum Information Science; Space Systems research; Radar and signal processing; and C4ISR systems research.
https://www.quantumventura.com/
Quantum Ventura Inc is essentially a technology innovation company with a single mission of delivering customer-centric advanced solutions to US Federal & State Governments and Private Sector customers.
Our Core offerings:
Artificial Intelligence / Machine Learning: We specialize in developing tools and applications using CNN, RNN, Reinforcement Learning, Denoising Auto-encoders, Generative Adversarial Networks (GANs) and Bayesian Anomaly Detection Algorithms in the areas of cyber-security, Automated Vehicle Tracking, Real-time Video Analytics, Sensor Fusion, Cognitive Computing, Synthetic Data Generation and other Computer-Vision driven applications.
Our Current Federally-Funded Research Projects as the Prime Contractor:
Missile Defense Agency: "Hypersonic Threat Detection" using bio-inspired processing, neuromorphic computing and Advanced AI. (Phase 1 STTR)
Partners: University of Florida and Lockheed Martin.
Department of Energy: "Cyber threat-detection using neuromorphic computing" - SBIR Phase 1
https://govtribe.com/award/federal-contract-award/definitive-contract-hq086021c7058
Definitive Contract HQ086021C7058 is a Firm Fixed Price Federal Contract Award. It was awarded to Quantum Ventura, Inc. on Apr 27, 2021. The definitive contract is funded by the Missile Defense Agency (DOD). The potential value of the award is $124,962.
Our Summary
STTR PHASE I RESEARCH & DEVELOPMENT: BIO-INSPIRED SENSORS
https://govtribe.com/award/federal-grant-award/project-grant-desc0021562
Awarded Vendor
Quantum Ventura, Inc. - 7K3W2
Project Grant DESC0021562. Funded by the Office of Science (DOE). Awarded to Quantum Ventura, Inc. on Feb 22, 2021. CFDA 81.049 - Office of Science Financial Assistance Program
Our Summary
REALTIME NEUROMORPHIC CYBER-AGENTS (CYBER-NEURORT)
https://science.osti.gov/-/media/sb...e-I-Release-1Award-Listing01282021.xlsx?la=en
Quantum Ventura, Inc.
$ 250,000
Realtime Neuromorphic Cyber-Agents (Cyber-NeuroRT)
To process large volumes of data in real time and detect cyberattacks quickly, at speeds 30x faster than traditional machine learning networks, we propose to develop a real-time HPC-scale neuromorphic cyber agent called Cyber-NeuroRT using the latest neuromorphic processors. Our cyber monitoring tool, a combined software and hardware appliance, will predict and alert on cybersecurity threats and warnings in real time.
https://comptroller.defense.gov/Por...DTE_Vol3_OSD_RDTE_PB20_Justification_Book.pdf
Title: Event-Based Sensing for Space & Directed Energy Applications (Air Force)
Description:
FY 2019 New Start - This project comparatively tests neuromorphic imaging technology and algorithms. This technology enhances daytime ground/space-based space situational awareness and directed energy test and evaluation. If successful, the resulting prototype will enhance ground-based space situational awareness and the technology will be inserted into further space-based situational awareness technology development.
FY 2019 Plans:
Test article cameras are to be received in 3Q FY 2019. Design for integration of camera into sensor applications is to be completed in 4Q FY 2019. This project continues in FY 2020 with FY 2020 funds.
FY 2020 Plans:
Application-specific testing will occur throughout FY 2020. Tests are to be completed by 4Q FY 2020 with a closeout report.
https://www.cnbc.com/2020/07/12/why-america-needs-to-bring-ai-into-the-upcoming-hyperwar-to-win.html
Op-ed: Hyperwar is coming. America needs to bring AI into the fight to win — with caution
Hyperwar, or combat waged under the influence of AI, where human decision making is almost entirely absent from the observe-orient-decide-act (OODA) loop, already is beginning to intrude on military operations.
Now is the time for the U.S. and China to have the hard conversations about norms of behavior in an AI enabled, hyperwar environment. With both sides moving rapidly to field arsenals of hypersonic weapons, action and reaction times will become shorter and shorter and the growing imbalance of the character and nature of war will create strong incentives, in moments of intense crisis, for conflict not peace. This is foreseeable now, and demands the engagement of both powers to understand, seek, and preserve the equilibrium that can prevent the sort of miscalculation and high-speed escalation to the catastrophe that none of us wants.
https://www.navysbir.com/n20_2/N202-108.htm
Modeling Neuromorphic and Advanced Computing Architectures
OBJECTIVE: Develop a software tool to optimize the signal processing chain across various sensors and systems, e.g., radar, electronic warfare (EW), electro-optical/infrared (EO/IR), and communications, that consists of functional models that can be assembled to produce an integrated network model used to predict overall detection/classification, power, and throughput performance to make design trade-off decisions.
DESCRIPTION: Conventional computing architectures are running up against a quantum limit in terms of transistor size and efficiency, sometimes referred to as the end of Moore's Law. To regain our competitive edge, we need to find a way around this limit. This is especially relevant for small size, weight, and power (SWaP)-constrained platforms. For these systems, scaling Von Neumann computing becomes prohibitively expensive in terms of power and/or SWaP. Biologically inspired neural networks provide the basis for modern signal processing and classification algorithms. Implementation of these algorithms on conventional computing hardware requires significant compromises in efficiency and latency due to fundamental design differences. A new class of hardware is emerging that more closely resembles the biological neuron model, also known as a spiking neuron model, which mathematically describes the systems found in nature and may solve some of these limitations and bottlenecks. Recent work has demonstrated performance gains using these new hardware architectures and has shown equivalence in converging on a solution with the same accuracy.
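A hypothetical sketch of the kind of composable tool this topic calls for: each functional model carries rough power, latency and detection figures, and chaining them yields end-to-end estimates for design trade-off studies. All names and numbers below are invented.

```python
# Illustrative composable signal-chain model for trade-off studies; stage names,
# power/latency figures, and the detection-gain heuristic are assumptions.
from dataclasses import dataclass

@dataclass
class Stage:
    name: str
    power_w: float       # average power draw of this processing stage
    latency_ms: float    # processing latency contributed per dwell
    detect_gain: float   # multiplicative effect on probability of detection

def evaluate(chain, base_pd=0.5):
    """Chain functional models into end-to-end power, latency, and Pd estimates."""
    power = sum(s.power_w for s in chain)
    latency = sum(s.latency_ms for s in chain)
    pd = base_pd
    for s in chain:
        pd = min(1.0, pd * s.detect_gain)
    return {"power_w": power, "latency_ms": latency, "pd": pd}

eo_ir_chain = [
    Stage("event_camera_frontend", power_w=0.5, latency_ms=0.1, detect_gain=1.2),
    Stage("denoise_filter",        power_w=2.0, latency_ms=1.5, detect_gain=1.1),
    Stage("snn_classifier",        power_w=1.0, latency_ms=0.8, detect_gain=1.3),
]
print(evaluate(eo_ir_chain))
```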
https://www.navysbir.com/n20_2/N202-099.htm
Implementing Neural Network Algorithms on Neuromorphic Processors
OBJECTIVE: Deploy Deep Neural Network algorithms on near-commercially available Neuromorphic or equivalent Spiking Neural Network processing hardware.
DESCRIPTION: Biologically inspired Neural Networks provide the basis for modern signal processing and classification algorithms. Implementation of these algorithms on conventional computing hardware requires significant compromises in efficiency and latency due to fundamental design differences. A new class of hardware is emerging that more closely resembles the biological Neuron/Synapse model found in Nature and may solve some of these limitations and bottlenecks. Recent work has demonstrated significant performance gains using these new hardware architectures and has shown equivalence in converging on a solution with the same accuracy [Ref 1]. The most promising of the new class are based on Spiking Neural Networks (SNN) and analog Processing in Memory (PiM), where information is spatially and temporally encoded onto the network. A simple spiking network can reproduce the complex behavior found in the neural cortex with a significant reduction in complexity and power requirements [Ref 2]. Fundamentally, there should be no difference between algorithms based on Neural Networks and current processing hardware. In fact, the algorithms can easily be transferred between hardware architectures [Ref 4]. The performance gains, the applicability of neural networks, and the relative ease of transitioning current algorithms over to the new hardware motivate the consideration of this topic.
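To make the spiking neuron model mentioned above concrete, here is a minimal leaky integrate-and-fire neuron; the parameters are illustrative and not tied to any particular neuromorphic processor.

```python
# Minimal leaky integrate-and-fire (LIF) neuron: integrate input with leak,
# fire a spike when the membrane potential crosses threshold, then reset.
def lif_neuron(input_current, threshold=1.0, leak=0.9, reset=0.0):
    """input_current: per-timestep drive values; returns a list of 0/1 spikes."""
    v, spikes = 0.0, []
    for i in input_current:
        v = leak * v + i              # leaky integration of the input
        if v >= threshold:
            spikes.append(1)
            v = reset                 # fire and reset the membrane potential
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.3, 0.3, 0.3, 0.0, 0.6, 0.6]))   # -> [0, 0, 0, 0, 1, 0]
```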
https://www.aerotechnews.com/blog/2...nding-against-missiles-far-faster-than-sound/
Hypersonics: developing and defending against missiles far faster than sound
The speed and range of hypersonic weapons has taken the task of defending against them into space, where orbiting sensors can detect a launch the moment it happens — or even before. “Every minute counts when it comes to hypersonics,” said Rob Aalseth, mission area director for Missile Warning and Defense at Raytheon Intelligence & Space, a Raytheon Technologies business. His team is developing a broad set of space technologies to detect, track and intercept hypersonic weapons at all phases of flight.
“Latency is the key parameter to address when defeating hypersonic threats. It affects every other parameter,” Aalseth said. On that front, Raytheon Intelligence & Space uses advanced missile detection and tracking algorithms that can perform highly precise missile track processing onboard the satellite in orbit. “All of the other systems to date have had to send the data to the ground for processing,” Aalseth said. “That wastes time.”
https://patentscope.wipo.int/search/en/detail.jsf?docId=US330048233&tab=NATIONALBIBLIO
US11063667 - Systems, devices, and methods for optical communication
Applicants: Raytheon BBN Technologies, Corp.
A technology is described for optical communication. An example of the technology can include receiving an event stream containing indications of independent events detected by pixels in an event camera. An event may be a change in brightness detected by a pixel in the pixel array, and the pixel independently generates an indication of the event in response to detecting the event. The event stream can be demultiplexed into a plurality of communication streams containing related events associated with a plurality of communication sources. The events contained in a communication stream can be aggregated based in part on an event proximity and an event time that associates an event with other events contained in the event stream. The plurality of communication streams can be demodulated to extract optically transmitted information from the plurality of communication streams, which can be sent to a data consumer.
The optically-modulated signals cause asynchronous pixel activations to occur and an associated change of brightness to be detected, each of which is encapsulated as an event by associated readout electronics and tagged with a corresponding pixel location and timestamp (e.g., E(x,y,t)). One or more event streams are sent to the event processors (e.g., event processor(s) 104 such as shown in FIG. 1).
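An illustrative sketch of the demultiplexing step the abstract describes: events E(x, y, t) are grouped into per-source streams by pixel proximity before demodulation. The grouping rule and radius below are assumptions, not the patent's claimed method.

```python
# Hypothetical demultiplexer: assign each event to the nearest existing source
# cluster (within a pixel radius), otherwise start a new per-source stream.
def demultiplex(events, radius=3):
    """events: list of (x, y, t_us); returns one event stream per inferred source."""
    streams = []          # each stream: {"centroid": (x, y), "events": [...]}
    for x, y, t in sorted(events, key=lambda e: e[2]):
        for s in streams:
            cx, cy = s["centroid"]
            if abs(x - cx) <= radius and abs(y - cy) <= radius:
                s["events"].append((x, y, t))
                break
        else:
            streams.append({"centroid": (x, y), "events": [(x, y, t)]})
    return streams

streams = demultiplex([(10, 10, 5), (11, 10, 7), (80, 40, 9), (10, 11, 12)])
print([len(s["events"]) for s in streams])   # -> [3, 1]: two inferred sources
```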
---
Neutron-Induced, Single-Event Effects on Neuromorphic Event-Based Vision Sensor: A First Step and Tools to Space Applications
Authors:
Seth Roffe; Himanshu Akolkar; Alan D. George; Bernabé Linares-Barranco; Ryad B. Benosman
Abstract
This paper studies the suitability of neuromorphic event-based vision cameras for spaceflight and the effects of neutron radiation on their performance. Neuromorphic event-based vision cameras are novel sensors that implement asynchronous, clockless data acquisition, providing information about the change in illuminance with sub-millisecond temporal precision. These sensors have huge potential for space applications as they provide an extremely sparse representation of visual dynamics while removing redundant information, thereby conforming to low-resource requirements. An event-based sensor was irradiated under wide-spectrum neutrons at Los Alamos Neutron Science Center and its effects were classified. Radiation-induced damage of the sensor under wide-spectrum neutrons was tested, as was the radiative effect on the signal-to-noise ratio of the output at different angles of incidence from the beam source. We found that the sensor had very fast recovery during radiation, showing high correlation of noise event bursts with respect to source macro-pulses. No statistically significant differences were observed between the number of events induced at different angles of incidence but significant differences were found in the spatial structure of noise events at different angles. The results show that event-based cameras are capable of functioning in a space-like, radiative environment with a signal-to-noise ratio of 3.355. They also show that radiation-induced noise does not affect event-level computation. Finally, we introduce the Event-based Radiation-Induced Noise Simulation Environment (Event-RINSE), a simulation environment based on the noise-modelling we conducted and capable of injecting the effects of radiation-induced noise from the collected data to any stream of events in order to ensure that developed code can operate in a radiative environment. To the best of our knowledge, this is the first time such analysis of neutron-induced noise has been performed on a neuromorphic vision sensor, and this study shows the advantage of using such sensors for space applications.
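A rough sketch inspired by (but not reproducing) the Event-RINSE idea described above: radiation-induced noise events are injected into an existing event stream as bursts around assumed macro-pulse times. The burst rate, sensor size, and pulse times are illustrative.

```python
# Illustrative noise injection: add bursts of random events, correlated with
# assumed beam macro-pulse times, into a clean event stream (x, y, t_us, polarity).
import random

def inject_noise(events, pulse_times_us, burst_rate=50, width=(640, 480), jitter_us=100):
    noisy = list(events)
    for t_pulse in pulse_times_us:
        for _ in range(burst_rate):
            noisy.append((
                random.randrange(width[0]),              # random pixel column
                random.randrange(width[1]),              # random pixel row
                t_pulse + random.randint(0, jitter_us),  # time near the macro-pulse
                random.choice((-1, 1)),                  # random polarity
            ))
    return sorted(noisy, key=lambda e: e[2])

signal = [(320, 240, 500, 1), (321, 240, 750, 1)]
print(len(inject_noise(signal, pulse_times_us=[400, 900])))   # -> 102 events
```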
Bernabé Linares-Barranco:
Method, digital electronic circuit and system for unsupervised detection of repeating patterns in a series of events
Inventor
Jacob Martin
Amir Reza YOUSEFZADEH
Simon Thorpe
Timothée MASQUELIER
Bernabe LINARES-BARRANCO
Abstract
A method of performing unsupervised detection of repeating patterns in a series of events includes: a) providing a plurality of neurons, each neuron being representative of W event types; b) acquiring an input packet comprising N successive events of the series; c) attributing to at least some neurons a potential value, representative of the number of common events between the input packet and the neuron; d) modifying the event types of neurons having a potential value exceeding a first threshold TL; and e) generating a first output signal for all neurons having a potential value exceeding a second threshold TF, and a second output signal, different from the first one, for all other neurons. A digital electronic circuit and system configured for carrying out such a method is also provided.
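A simplified sketch of the claimed steps: neurons hold W event types, an input packet of N events is scored against each neuron by counting common events, and thresholds TL and TF gate adaptation and output. The specific adaptation rule used here (swapping one non-matching event type for a packet event) is an assumption for illustration, not the patent's exact learning rule.

```python
# Illustrative unsupervised repeating-pattern detector over event packets.
import random

def process_packet(neurons, packet, t_learn, t_fire):
    """neurons: list of sets of event types (W each); packet: set of N event types."""
    outputs = []
    for neuron in neurons:
        potential = len(neuron & packet)            # step c: count common events
        if potential >= t_learn:                    # step d: adapt the neuron (TL)
            missing = packet - neuron
            extra = neuron - packet
            if missing and extra:
                neuron.remove(random.choice(sorted(extra)))
                neuron.add(random.choice(sorted(missing)))
        outputs.append(1 if potential >= t_fire else 0)   # step e: output (TF)
    return outputs

neurons = [{"A", "B", "C"}, {"X", "Y", "Z"}]
print(process_packet(neurons, {"A", "B", "D"}, t_learn=2, t_fire=3))  # -> [0, 0]
print(neurons[0])   # first neuron drifts toward the repeating packet contents
```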
https://www.asx.com.au/asxpdf/20170320/pdf/43gxg7g8c6xq25.pdf
Method, digital electronic circuit and system for unsupervised detection of repeating patterns in a series of events, European Patent Office EP17305186, Feb 2017, Amirreza Yousefzadeh, Bernabe Linares-Barranco, Timothee Masquelier, Jacob Martin, Simon Thorpe. Exclusively licensed to the Californian start-up BrainChip.