BRN - NASA


Deleted member 118

Guest


Black Swift Technologies LLC (BST) proposes a feasibility study that will consider our current avionics monitoring system as a baseline toward building a highly capable preventative-maintenance solution built around USAF assets. We will consider improvements on the current state of the art through the use of unsupervised ML algorithms to provide early warning and diagnostics of potential critical system failures on small UAS. We will gather critical data for the study from avionics data that the USAF already collects; if this data proves insufficient, we have developed a set of monitoring nodes, used in our proprietary avionics, that can be installed aboard candidate platforms to supplement the data sets and run ML algorithms for real-time analysis and feedback. BST can deploy these during the Phase I trial if stakeholders want to gather real-time data from their UAS vehicles. Primarily, we will work to obtain useful data from USAF sources while performing the algorithm work on our own systems.
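The abstract doesn't name a specific unsupervised algorithm, but the early-warning idea reduces to learning a nominal baseline per telemetry channel and flagging deviations from it. A minimal sketch (the function names and telemetry values below are made up for illustration, not BST's code):

```python
import statistics

def fit_baseline(window):
    """Learn a per-channel mean/stdev baseline from nominal telemetry (unsupervised)."""
    return statistics.fmean(window), statistics.stdev(window)

def is_anomalous(sample, mean, stdev, k=3.0):
    """Flag samples more than k standard deviations from the learned baseline."""
    return abs(sample - mean) > k * stdev

# Hypothetical bus-voltage readings from a monitoring node during nominal flight
nominal = [11.9, 12.0, 12.1, 12.0, 11.8, 12.2, 12.0, 11.9, 12.1, 12.0]
mean, stdev = fit_baseline(nominal)

print(is_anomalous(12.05, mean, stdev))  # False: in-family reading
print(is_anomalous(9.2, mean, stdev))    # True: sagging voltage -> early warning
```

Real early-warning systems would do this per channel across many correlated channels, but the flag-on-deviation core is the same.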
 

Deleted member 118

Guest

The Army depends on radiographic inspection for non-destructive testing (NDT) of munitions and armaments to identify defects before these products reach the warfighter. Highly skilled and experienced radiographers must interpret the inspection results. This process is not only laborious and time consuming, but also subjective and inconsistent. The development of artificial intelligence (AI) opens the opportunity to develop an automated system to identify potential defects to aid the interpretation of the results and expedite the inspection process. During Phase I, Sky Park Labs demonstrated DeepNDT, a suite of machine learning algorithms that assist a radiographer in the interpretation of radiographic inspection results by automatically identifying, quantifying, and visualizing potential defects and flaws. DeepNDT uses unsupervised and supervised deep learning algorithms for automatically detecting anomalies and highlights any deviations as potential defects. Then, the system classifies the defect type and characterizes its geometry and size for a radiographer to assess and make a final decision. These algorithms were evaluated on government-furnished data and various industrial datasets, achieving real-time performance and accuracy, precision, and recall rates of over 95% for detection and classification tasks. Phase II will focus on the development of all aspects of the technology into a fully functional prototype software tool integrated into the inspection workflow. The resulting technology has the potential to reduce the cognitive load on a level II NDT technician and thereby increase the efficiency and accuracy of the assessment of parts for defects.
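For reference, the precision and recall rates quoted above are computed from true positives over flagged versus ground-truth defects. A toy sketch with made-up region IDs (not Sky Park Labs' code):

```python
def precision_recall(predicted, actual):
    """Precision/recall for flagged defect regions vs. radiographer ground truth."""
    tp = len(predicted & actual)                      # correctly flagged defects
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(actual) if actual else 0.0
    return precision, recall

# Hypothetical defect regions flagged on one radiograph vs. ground truth
flagged = {"r1", "r2", "r3", "r7"}
truth   = {"r1", "r2", "r3", "r9"}
p, r = precision_recall(flagged, truth)
print(p, r)  # 0.75 0.75
```

A ">95%" claim means both ratios stay above 0.95 across the evaluation datasets.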
 

Deleted member 118

Guest



In order to meet the Navy’s need for a spiking neural network testing platform, ChromoLogic proposes to develop a Spiking Neural Network Modeler (SpiNNMo) capable of simulating a variety of neuromorphic hardware platforms. SpiNNMo is able to extract relevant performance parameters from a neuromorphic chip and then predict the chip’s performance on new networks and data. In this way SpiNNMo can predict accuracy, latency and energy usage for a wide variety of hardware platforms on a given neural network and dataset. This will allow the Navy to test the performance of new spiking neural network architectures and chipsets before the chips are widely available and therefore speed neuromorphic adoption.
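ChromoLogic hasn't published how SpiNNMo's model works, but in spirit, a predictor built on parameters extracted from a chip could look like this minimal sketch (parameter names and numbers are entirely hypothetical):

```python
def predict_metrics(chip, synaptic_ops, timesteps):
    """Roughly estimate energy and latency for one inference on a modeled chip.

    chip: dict of performance parameters measured from the neuromorphic hardware
    (hypothetical names; the real SpiNNMo model is not public).
    """
    energy_j = synaptic_ops * chip["joules_per_synaptic_op"]
    latency_s = timesteps * chip["seconds_per_timestep"]
    return energy_j, latency_s

# Made-up figures in the ballpark of published neuromorphic-chip specs
chip = {"joules_per_synaptic_op": 20e-12, "seconds_per_timestep": 1e-3}
e, t = predict_metrics(chip, synaptic_ops=5_000_000, timesteps=10)
print(e, t)
```

The value of such a model is that you can swap in a new network's op count or a new chip's parameters and compare platforms before silicon is widely available.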


I wonder if they are talking about Akida 2000 or even 3000
 

equanimous

Norse clairvoyant shapeshifter goddess
Keeping in an easy-to-access thread



 

stuart888

Regular
Keeping in an easy-to-access thread

View attachment 14242

Kind of interesting, all the focus on using the SNN framework to detect hypersonic missiles, space debris, and other anomalies. Very cool to see event-based SNN solutions for both the sky and outer space!

I think this event-based trajectory prediction using SNNs might be part of the Falcon Neuro project. One of the authors of the paper below is Timothée Masquelier, the "T" in JAST, the patent BrainChip bought.

https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8180888/

 
For anyone interested:


"Orion's return will be faster and hotter than any spacecraft has ever experienced on its way back to Earth.
The Orion spacecraft will travel farther than any spacecraft built for humans has ever flown, reaching 40,000 miles (64,000 kilometers) beyond the far side of the moon, according to NASA."

"Artemis I will also carry a number of science experiments, some of which will be installed once the rocket and spacecraft arrive at the launchpad."

Wonder if Akida could be a part of this space mission and what may lie ahead in the future here.
 
In a past interview this year, Rob dropped that BRN was helping with a lunar launch project. Can't quite recall which one; other folk might know, but it clearly stuck in my mind.
In recent ABC coverage they mentioned the testing on board, all in line with Akida, particularly the mention of radiation system testing.
 
I'm sure someone mentioned Microchip Technology recently. Found this article interesting.

https://www.jpl.nasa.gov/news/nasa-awards-next-generation-spaceflight-computing-processor-contract

This part below is a dot-join to the SBIR on making the AKD1000 rad-hard:

In 2021, NASA solicited proposals for a trade study for an advanced radiation-hardened computing chip with the intention of selecting one vendor for development. This contract is part of NASA’s High-Performance Space Computing project. HPSC is led by the agency’s Space Technology Mission Directorate’s Game Changing Development program with support from the Science Mission Directorate. The project is led by JPL, a division of Caltech in Pasadena.

It ties in with this:

https://www.nasa.gov/directorates/spacetech/game_changing_development/projects/HPSC

 

Deleted member 118

Guest


Adapting SRT’s M1 Hardware Portal for Navy Facility Health Monitoring and Prioritization
Award Information
Agency:
Department of Defense
Branch:
Navy
Contract:
N68335-21-C-0013
Agency Tracking Number:
N202-099-1097
Amount:
$239,831.00
Phase:
Phase I
Program:
SBIR
Solicitation Topic Code:
N202-099
Solicitation Number:
20.2
Timeline
Solicitation Year:
2020
Award Year:
2021
Award Start Date (Proposal Award Date):
2020-10-07
Award End Date (Contract End Date):
2021-12-10
Small Business Information
Blue Ridge Envisioneering, Inc.
5180 Parkstone Dr. Suite 200
Chantilly, VA 20151-1111
United States
DUNS:
616396953
HUBZone Owned:
No
Woman Owned:
No
Socially and Economically Disadvantaged:
No
Principal Investigator
Name: Jason Pualoa
Phone: (571) 349-0900
Email: andrew@br-envision.com
Business Contact
Name: Edward Zimmer
Phone: (703) 927-0450
Email: ned@br-envision.com
Research Institution
N/A
Abstract
Deep Neural Networks (DNNs) have become a critical component of tactical applications, assisting the warfighter in interpreting and making decisions from vast and disparate sources of data. Whether image, signal, or text data, remotely sensed or scraped from the web, cooperatively collected or intercepted, DNNs are the go-to tool for rapid processing of this information to extract relevant features and enable the automated execution of downstream applications.

Deployment of DNNs in data centers, ground stations, and other locations with extensive power infrastructure has become commonplace, but deployment at the edge, where the tactical user operates, is very difficult. Secure, reliable, high-bandwidth communications are a constrained resource for tactical applications, which limits the ability to route data collected at the edge back to a centralized processing location. Data must therefore be processed in real time at the point of ingest, which has its own challenges, as almost all DNNs are developed to run on power-hungry GPUs at wattages exceeding the practical capacity of solar power sources typically available at the edge. So what, then, is the future of advanced AI for the tactical end user where power and communications are in limited supply? Neuromorphic processors may provide the answer.

Blue Ridge Envisioneering, Inc. (BRE) proposes the development of a systematic and methodical approach to deploying DNN architectures on neuromorphic hardware and evaluating their performance relative to a traditional GPU-based deployment. BRE will develop and document a process for benchmarking a DNN's performance on a standard GPU, converting it to run on near-commercially-available neuromorphic hardware, training and evaluating model accuracy for a range of available bit quantizations, characterizing the trade between power consumption and the various bit quantizations, and characterizing the trade between throughput/latency and the various bit quantizations.
This process will be demonstrated on a Deep Convolutional Neural Network trained to classify objects in SAR imagery from the Air Force Research Laboratory’s MSTAR open source dataset. The BrainChip Akida Event Domain Neural Processor development environment will be utilized for demonstration as it provides a simulated execution environment for running converted models under the discrete, low quantization constraints of neuromorphic hardware.
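The bit-quantization trade BRE describes can be felt with a toy uniform quantizer: lower bit widths snap weights to a coarser grid, trading accuracy for the power and throughput wins of low-precision hardware. A rough sketch (not the Akida conversion flow; the weights are made up):

```python
def quantize(weights, bits):
    """Uniform symmetric quantization of a weight list to the given bit width."""
    levels = 2 ** (bits - 1) - 1                    # e.g. 7 positive levels at 4 bits
    scale = max(abs(w) for w in weights) / levels   # grid spacing
    return [round(w / scale) * scale for w in weights]

weights = [0.82, -0.41, 0.07, -0.93, 0.55]
for bits in (8, 4, 2):
    q = quantize(weights, bits)
    err = max(abs(w - v) for w, v in zip(weights, q))
    print(bits, "bits -> max weight error", round(err, 4))
```

The benchmarking process in the abstract is essentially this loop writ large: sweep the bit width, retrain/evaluate the model at each setting, and record the accuracy, power, and latency trade-offs.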
 

Deleted member 118

Guest
Think the above has already been posted
 

stuart888

Regular
The European Space Agency lab just did this very in-depth, space-focused video.

It is deep, and for the people following NASA closely. The ESA and NASA are spike lovers for sure. :love:



 

Deleted member 118

Guest

Machine Learning-Accelerated Grid Environment
Award Information
Agency:
National Aeronautics and Space Administration
Branch:
N/A
Contract:
80NSSC21C0182
Agency Tracking Number:
213026
Amount:
$124,788.00
Phase:
Phase I
Program:
SBIR
Solicitation Topic Code:
S5
Solicitation Number:
SBIR_21_P1
Timeline
Solicitation Year:
2021
Award Year:
2021
Award Start Date (Proposal Award Date):
2021-05-12
Award End Date (Contract End Date):
2021-11-19
Small Business Information
EMERGENT SPACE TECHNOLOGIES, INC.
7901 Sandy Spring Road, Suite 511
Laurel, MD 20707-3589
United States
DUNS:
101537046
HUBZone Owned:
No
Woman Owned:
No
Socially and Economically Disadvantaged:
No
Principal Investigator
Name: Brett Carver
Phone: (301) 345-1535
Email: brett.carver@emergentspace.com
Business Contact
Name: Everett Cary
Phone: (301) 345-1535
Email: everett.cary@emergentspace.com
Research Institution
N/A
Abstract
NASA satellites are generating over 4TB of data each day. Analyzing this data in-orbit is becoming increasingly important for the purposes of accelerating scientific discovery and enabling opportunistic science. State-of-the-art artificial intelligence (AI) and machine learning (ML) data science applications require significant resources to run computationally intensive algorithms and models. To facilitate intensive data analysis in a resource constrained environment such as space, we need to utilize resources efficiently and at scale. Current solutions to this problem require downlinking full datasets to perform ground-based processing or running low computational footprint algorithms that are less effective than state-of-the-art solutions. In this proposal, we explore the capabilities and benefits of developing MAGE (ML Accelerated Grid Environment). MAGE is a software framework and API that facilitates ML training and inference distributed across a networked constellation or swarm of satellites to enable resource intensive ML models to run at the extreme edge. This solution makes complex data processing at the edge possible by running on AI accelerated hardware and distributing ML processing and storage across a grid of compute and storage nodes. Collectively, these nodes comprise a grid computing environment that can be tasked by spacecraft to run resource intensive applications. MAGE reduces the need to downlink full data sets, allows prioritization of data downlinking, enables proliferation of complex autonomous space-based systems, and provides a mission agnostic environment for processing and storage. Utilizing a system such as MAGE would allow NASA to perform efficient, scalable, mission agnostic AI and ML processing at the edge for any scientific mission.

 

Deleted member 118

Guest
Nice to start seeing some of these projects completed

 

Deleted member 118

Guest


Project Introduction
Makel Engineering, Inc. proposes to develop a highly compact Environmental Microsensor Array (EMMA) as a payload for free-flying Intra-Vehicular Activity (IVA) robots, such as NASA’s Astrobee, supporting the Integrated System for Autonomous and Adaptive Caretaking (ISAAC) project. EMMA will include machine learning to translate and interpret onboard sensor data (e.g., chemicals, temperature, humidity, pressure, etc.) within the context of planetary facilities. Planned human exploration beyond Earth orbit will rely on an orbiting facility near the Moon, called Gateway, with intermittent human occupation, requiring robust autonomous inspection and diagnostics tools. EMMA’s sensors and machine learning algorithms will establish nominal background conditions throughout the vehicle to identify anomalies and trigger further action. The use of machine learning tools will enable EMMA to recognize changes in patterns and decide whether additional investigation is warranted; e.g., if a hot spot is detected indicating a potential fire, use chemical sensors to classify the material type and further isolate the source of the fire, enabling corrective action (e.g., selectively shutting down affected systems). Phase I identified Gateway use cases and defined EMMA’s relevant requirements. EMMA was deployed onboard COTS autonomous floor-cleaning robots navigating rooms for data collection and deployed toward simulated fault conditions. The Phase I machine learning algorithm was deployed and demonstrated with the data collected by EMMA. Phase II will mature the technology, taking it from lab demonstration to a series of flyable prototypes which will be delivered to NASA for testing in ongoing microgravity experiments and integration with Astrobee at NASA ground-based testbeds at ARC and JSC during the program. The machine learning algorithms will be migrated to the embedded processors, resulting in a standalone prototype system.
Anticipated Benefits
EMMA onboard Astrobee robotic flyer for deployment in NASA’s proposed Gateway during manned and unmanned periods for regular inspection and diagnostics, supporting ISAAC’s mission. Example uses:
- Early fire detection and outgassing from electrical insulators (e.g., CO, CO2, VOCs, HCl, HCN, HF; non-contact temperature to monitor overheating)
- Identification and tracking of slow leaks and fast leaks (pressure change; chemicals: H2, NH3, VOCs)
- Air quality monitoring to ensure health of revitalization systems (CO2, O2, humidity, trace contaminants)
EMMA on flying/crawling drones used in large, high rise office buildings, surveying air quality to ensure workplace safety. EMMA could be used for robotic leak detection and air quality in chemical plants and mining operations. Comprehensive, low cost, and near real time inspection and diagnosis of systems (e.g., ventilation, utilities) in health care facilities to mitigate disease outbreaks.
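EMMA's establish-a-background-then-flag-deviations loop is simple to sketch. In the spirit of the abstract, a multi-channel check that escalates when channels drift from nominal (channel names, baselines, and tolerances below are hypothetical):

```python
def assess(readings, baseline, tol):
    """Return the channels deviating from nominal background by more than tol."""
    return [ch for ch, v in readings.items()
            if abs(v - baseline[ch]) > tol[ch]]

# Hypothetical nominal background learned during a survey pass
baseline = {"temp_C": 22.0, "CO_ppm": 1.0, "pressure_kPa": 101.3}
tol      = {"temp_C": 5.0,  "CO_ppm": 4.0, "pressure_kPa": 1.5}

# Reading near an overheating panel: hot spot plus a CO rise
reading = {"temp_C": 41.0, "CO_ppm": 9.0, "pressure_kPa": 101.2}
flags = assess(reading, baseline, tol)
print(flags)  # ['temp_C', 'CO_ppm'] -> escalate: possible fire, isolate source
```

A hot-spot flag together with a chemical flag is exactly the pattern the abstract describes triggering further investigation, e.g. classifying the material and shutting down the affected system.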
 

Andy38

The hope of potential generational wealth is real
Just a screenshot from TheFuturist (mail-out), a general wrap of evolving new technology. You’d be thinking Akida’s powers would be well served for a mission like this on Mars!
 

Deleted member 118

Guest



Description:

RT&L FOCUS AREA(S): Artificial Intelligence (AI)/Machine Learning (ML); Autonomy

TECHNOLOGY AREA(S): Air Platforms; Battlespace Environments; Information Systems

OBJECTIVE: Develop a capability to autonomously generate mission plans for onboard Unmanned Aerial Systems (UAS) in support of Intelligence, Surveillance, and Reconnaissance (ISR) missions by applying artificial intelligence (AI) and machine learning (ML) techniques.

DESCRIPTION: With today's advances in software and hardware, autonomous operation is a capability, even if still somewhat disruptive, that is fully realizable, as highlighted in references 1–6. In fact, autonomous operation is becoming a critical capability in order to stay ahead of our adversaries. But there are other reasons for autonomous systems [Ref 2], such as "when the world can’t be sufficiently specified a priori" and "when adaptation must occur at machine speed". It also makes a good case for AI, which enables significant autonomy and includes learning, reasoning, introspection, decision making, and much more. Exploiting unmanned systems' autonomous mission planning is the next stage in enhancing the capabilities of these systems in operational environments. This project’s success relies on utilizing sophisticated software solutions, including machine intelligence/learning, and modern computer hardware or graphics processing units (CPUs/GPUs – a scaled version of a workload-optimized, massively parallelized computer). It should be evident that the size of unmanned aerial vehicles (UAVs) (Groups 1–5) and the types of missions will impact the overall mission planning requirements and complexity. The goal is to be entirely autonomous; however, in particular with Group 4–5 systems, embedding trust/risk capabilities and detailed contingency plans in autonomous operation (if unacceptable behavior is detected) is as critical as meeting mission success.
Even within autonomous operations, there will still be means to alert the Common Control System (CCS) operator via the envisioned tool that monitors trust embedded on the platform. With these risk-mitigation capabilities, this project will focus on ISR collection – a simpler mission when compared to a strike execution mission, which would in the future add considerable levels of mission complexity. All UAVs will have the necessary sensors and flight control systems to embed the software to generate autonomous missions from takeoff (flight plan and mission plan) to landing, while completing missions including collection and dissemination of ISR data, i.e., when connectivity is available. It is anticipated that activity-based intelligence and/or other relevant information will start the component-based planning process: determining a suitable platform; route planning, types of sensors in support of ISR collection, and sensor collection requirements to generate an entire flight plan with associated requirements; and when to disseminate data. Note that many route planning and resource management algorithms exist, thus any solution should include the ability to adaptively change a particular part of the overall planning process. It should also include consideration for automated contingency plans and dynamic replanning capabilities due to various unexpected factors, such as weather, changes in mission requirements, etc. These fully autonomous mission planning service capabilities must be able to be integrated into the Next-Gen Navy Mission Planning System (NGNMPS) and be shared with the CCS operator over any available communication system, with the ability to be modified if necessary and, more importantly, for the autonomous behavior to actually be embedded on board the platform. Because the autonomous plan will initially be shared with the NGNMPS and CCS operator, it will be necessary to define how the plan is presented to the operators.
Finally, in order to meet mission requirements, the solution needs to specify CPU/GPU requirements to achieve as close to real-time performance as possible; and, to paraphrase the Heilmeier Catechism for success [Ref 11], it will be essential to understand "how to eventually test, verify and evaluate the overall accuracy and performance of the autonomous mission planning process" as part of this development effort.

PHASE I: Generate a concept of autonomous mission planning from launch, to execution of mission-specific requirements (ISR as specified in a tasking order and other data such as activity-based intelligence data), to data dissemination, and finally to return to base. This mission plan may also be an airborne modification (dynamic replanning) to the current mission, applying artificial intelligence techniques. Mission plans will take into consideration threat and friendly disposition, weather, terrain, and any onboard sensor (collection) requirements and limitations. In addition, the concept needs to outline the hardware required to achieve real-time or near-real-time processing capabilities. The Phase I effort will include prototype plans to be developed under Phase II. The overall solution should outline the data sources and information that will be required to successfully generate mission plans. It is also required to take into account STANAG processes and procedures to minimize proprietary solutions.

PHASE II: Develop a prototype software solution that can be tested in a simulated mission environment. In Phase II, the program office will provide additional details about the platform and sensor characteristics and other vital data critical to supporting a realistic prototype development.

PHASE III DUAL USE APPLICATIONS: Finalize the prototype version. Perform final testing and verification in a simulated environment and potentially in a real environment using a surrogate vehicle. Transition to a naval platform.
Companies such as Amazon, and similar delivery companies that have already started drone-based package delivery, would benefit from this development. FEDEX and UPS would benefit in terms of using large UAVs for package deliveries from large collection centers to smaller distribution centers.
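The route-planning piece of this solicitation (routing around threats and weather, replanning when costs change) is classically handled with cost-based graph search. A minimal Dijkstra sketch over a hypothetical threat-cost grid, just to show the shape of the problem:

```python
from heapq import heappush, heappop

def plan_route(grid, start, goal):
    """Dijkstra over a cost grid; higher cell cost models threat/weather risk."""
    rows, cols = len(grid), len(grid[0])
    frontier, seen = [(0, start, [start])], set()
    while frontier:
        cost, node, path = heappop(frontier)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen:
                heappush(frontier, (cost + grid[nr][nc], (nr, nc), path + [(nr, nc)]))
    return None

# 1 = open airspace, 9 = threat area to route around (made-up costs)
grid = [[1, 1, 1],
        [9, 9, 1],
        [1, 1, 1]]
cost, path = plan_route(grid, (0, 0), (2, 0))
print(cost, path)  # takes the longer low-cost route around the threat cells
```

Dynamic replanning then amounts to updating the cost grid (new weather, new threat disposition) and re-running the search from the aircraft's current position.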
 