BRN - NASA

Deleted member 118

Guest
I know we are not power hungry, Rocket, but do we limit throughput?

If you don't have dreams, you can't have dreams come true!
 


stuart888

Regular
They cannot solve: Which Elephant is it?

Brainchip can give them a call. At about the 24 minute mark she talks about the solutions out there and their problems.

All the stuff Brainchip Akida SNN can do.





stuart888

Regular
Entertainment near NASA Daytona, not SNN. Kitty lovers will really get a thrill.

This guy is exceptional. And the kitties too!! 🐈‍⬛🐈‍⬛🐈‍⬛

 

Deleted member 118

Guest


Future intelligent autonomous vehicles like the Orb/eVTOL/UAM (Electric Vertical Takeoff and Landing / Urban Air Mobility) vehicles will be able to “feel”, “think”, and “react” in real time by incorporating high-resolution state-sensing, awareness, and self-diagnostic capabilities. They will be able to sense and observe phenomena at unprecedented length and time scales, allowing for superior performance in complex dynamic environments, safer operation, reduced maintenance costs, and complete life-cycle management.

Despite the importance of vehicle state sensing and awareness, however, the current state of the art is primitive as well as prohibitively heavy, expensive, and complex. New Fly-by-Feel (FBF) technologies are therefore required for the next generation of intelligent aerospace structures: structures that use AI to sense environmental conditions and structural state, and effectively interpret the sensing data to achieve real-time state awareness and appropriate self-diagnostics under varying operational environments.

Acellent is teaming with Stanford University, the USAF, and The Boeing Company in this STTR project to develop a Fly-by-Feel autonomous system that significantly enhances the agility of drones by integrating directly on the wings a nerve-like stretchable multimodal sensor network with AI-based state-sensing and health-diagnostic software, mimicking biological sensory systems such as those of birds. Once integrated with the wings, the distributed sensor data will be collected and processed in real time through AI-based diagnostics for flight-state estimation in terms of lift, drag, flutter, angle of attack, and damage/failure of the component, so that the system can interface with the controller to significantly enhance the maneuverability and survivability of the vehicle.

Phase 1 focused on manufacturing, integrating, and testing the network in the laboratory environment. The Phase 2 program will mature the technology to TRL 5-6 via integration with a UAV and flight testing of the complete UAV in a flight regime using a wind tunnel.
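A toy sketch of the sensor-fusion step the abstract describes: many distributed wing sensors, each individually noisy (or dead), combined into one flight-state estimate. Everything here is hypothetical, including the calibration gains, the strain values, and the `estimate_angle_of_attack` helper; the actual FBF system uses learned AI diagnostics, not this simple averaging stand-in.

```python
# Toy sketch: fuse readings from a distributed wing sensor network into a
# single flight-state estimate (here, angle of attack). All numbers and
# names are hypothetical; the real FBF system uses learned AI diagnostics.

def estimate_angle_of_attack(strains, gains):
    """Average of per-sensor estimates.

    Each sensor i reports strain s_i; a per-sensor calibration gain g_i
    converts strain to an angle-of-attack estimate. Averaging across the
    network suppresses single-sensor noise and tolerates a dead sensor
    (reported as None), which is part of the survivability story above.
    """
    estimates = [s * g for s, g in zip(strains, gains) if s is not None]
    return sum(estimates) / len(estimates)

# Hypothetical calibration: 1 microstrain ~ 0.01 degrees for each sensor.
gains = [0.01] * 6
strains = [500, 510, 495, None, 505, 490]  # one sensor has failed
aoa = estimate_angle_of_attack(strains, gains)
print(f"estimated angle of attack: {aoa:.2f} deg")  # prints 5.00 deg
```

The same fan-in pattern would apply to the other estimated quantities (lift, drag, flutter), with the averaging replaced by the AI-based diagnostic model.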
 

Deleted member 118

Guest
Can you break that down in layman's terms for me mate.

Not a hope in hell. I just typed in TENN and I don't even know what that is; just seen it posted since Akida 2000 has been discussed.
 
Not a hope in hell I just typed in TENN
All good mate, my brain would probably overheat with an explanation anyway.
Cheers for posting.
 

Deleted member 118

Guest
A new round of Phase 1 funding is available that closes in 2 days, with awards available from June. Plenty in there to include Akida technology if you click on the focus areas.

 

Deleted member 118

Guest

The proposal seeks to build and test a persistent wide field-of-view (FOV) infrared (IR) camera array for detecting and tracking dim objects. The optical design is based on the successful Tau PANDORA (visible) array systems, now in use for tracking low earth orbit (LEO) and geostationary earth orbit (GEO) satellites.

In Phase I, Tau proposed to collect daytime shortwave IR (SWIR) imagery of satellites with a single camera and analyzed this data to demonstrate quantitatively the successful use of Convolutional Neural Networks (CNNs) to identify satellites as distinct from the clutter. Tau now proposes to construct a prototype sensor array to demonstrate the feasibility of utilizing Neuromorphic Machine Learning (NML). The primary research in Phase II will be directed towards developing an effective training algorithm for Spiking Neural Networks (SNNs), an advanced neuromorphic network, while continuing on the present path of improving the Generation II neural networks previously investigated.

Approved for Public Release | 22-MDA-11215 (27 Jul 22)
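The abstract contrasts conventional CNNs with Spiking Neural Networks. A minimal sketch of the basic SNN building block, a leaky integrate-and-fire (LIF) neuron, is below; the leak and threshold constants are illustrative only and are not taken from the proposal.

```python
# Toy leaky integrate-and-fire (LIF) neuron, the basic unit of the Spiking
# Neural Networks mentioned above. Parameters are illustrative only.

def lif_run(inputs, leak=0.9, threshold=1.0):
    """Integrate an input current sequence; emit 1 whenever the membrane
    potential crosses threshold, then reset to 0 (spike-and-reset)."""
    v, spikes = 0.0, []
    for i in inputs:
        v = leak * v + i          # leaky integration of input current
        if v >= threshold:
            spikes.append(1)
            v = 0.0               # reset after spiking
        else:
            spikes.append(0)
    return spikes

# A bright, persistent source (steady strong input) spikes regularly,
# while dim clutter (weak input) never crosses threshold:
print(lif_run([0.6] * 6))  # -> [0, 1, 0, 1, 0, 1]
print(lif_run([0.1] * 6))  # -> [0, 0, 0, 0, 0, 0]
```

Training such neurons is the hard part the Phase II research targets: the spike-and-reset step is non-differentiable, so standard CNN-style backpropagation does not apply directly.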
 