Apologies Rise.
I was rushing to shut down and get out of the office for a meeting when I posted.
As below.
sbir.nasa.gov
This extract from the Intellisense Systems Inc website is very interesting given all the Brainchip discussion around sensor fusion:
“One of the core abilities of Intellisense Systems is the integration of multiple sensors and technologies into compact, self-contained units. The Micro Weather Station (MWS®) combines eight weather and environmental sensors into a rugged package that is completely self-powered and still weighs less than 4 pounds. Similarly, the AWARE Flood System enables the integration of pressure transducers, precipitation gauges, and water-level radar into a single, cost-effective communications module. Moreover, Intellisense has been developing and delivering sensor-fusion systems that integrate EO/IR, lidar, ultrasonic, terahertz, radio frequency, and mmWave sensors with embedded processors and algorithms for our customers.
Additionally, the engineers at Intellisense are incorporating deep-learning algorithms into their work to identify changes and threats in the environment to further improve these potentially life-saving applications. These capabilities ensure that multi-sensor fusion projects like the MWS and AWARE Flood System can assist users in making faster, more informed decisions in critical situations, like disaster response, rescue operations, and combat scenarios.
When the U.S. Department of Homeland Security needed a portable device that could identify humans or vehicles that may be hiding in the wilderness through adverse weather conditions in both daylight and nighttime conditions, Intellisense applied its skills in multi-sensor fusion to create the Infrared and Optical Wilderness Location and Surveillance system, or IROWL for short. IROWL consists of a multispectral optical design, which integrates a high-acuity visible and near-infrared (VIS/NIR) sensor, an infrared thermal sensor, and a full-color camera. All these elements combine to better identify, range, and track individuals in wilderness terrain in both day and night, as well as in bad weather conditions.
The VIS/NIR sensor can differentiate humans and animals from the surrounding wilderness, and the thermal (mid-wave infrared) sensor provides the ability to see through dust, fog, smoke, and other atmospheric obscurants. It can also unveil humans or vehicles that may be hidden in creek beds, shrubs, trees, or around man-made structures. The dual-band, high-resolution imagery not only allows the user to quickly distinguish humans from animals, but also to determine what items individuals might be carrying. This is made possible by a deep-learning convolutional neural network (CNN) that identifies body armor and differentiates weapons from other objects in a person’s hands.
The system also incorporates a laser rangefinder that can accurately determine the distance of an individual or object up to a mile away. The system works in the same manner as many other rangefinders: a laser pulse is emitted from the source, and the distance is calculated from the time taken for it to reach a target and be reflected back. But its low power consumption and light weight mean that it can be integrated into a multi-sensor solution that is completely wireless and can still be hand-carried into the field.
IROWL integrates components and sensors that provide situational parameters as well, including a compass and GPS coordinates. These coordinates can report the location of individuals being observed. By integrating this system’s data with a smartphone interface, the IROWL imagery and geolocation data can be shared with other servicemembers both in the field and at headquarters.
This development has a number of important military and private applications, particularly for border patrol and search and rescue. If an individual needs rescue in a fast-moving river at night, this portable system can identify the person with its infrared thermal sensor and determine their distance with the laser rangefinder. This application of multi-sensor fusion can not only bolster security for the military, but also save lives in the world’s most dangerous wilderness. This is just one example of many sensor-fusion developments currently going on at Intellisense Systems”
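For anyone curious about the maths behind the rangefinder and geolocation features described above, here is a rough sketch (my own illustration only, not Intellisense code, and the function names are made up): the rangefinder converts a laser pulse's round-trip time into distance, and that distance plus a compass bearing and the observer's own GPS fix is enough to estimate the target's coordinates.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Range from a laser pulse's round-trip time.

    The pulse travels out and back, so the one-way
    distance is half the total path length.
    """
    return C * round_trip_s / 2.0

def target_position(lat_deg: float, lon_deg: float,
                    bearing_deg: float, range_m: float):
    """Project the observer's GPS fix along a compass bearing.

    Uses a flat-Earth approximation, which is fine at
    rangefinder distances of a mile or so.
    """
    north = range_m * math.cos(math.radians(bearing_deg))
    east = range_m * math.sin(math.radians(bearing_deg))
    dlat = north / 111_320.0  # metres per degree of latitude
    dlon = east / (111_320.0 * math.cos(math.radians(lat_deg)))
    return lat_deg + dlat, lon_deg + dlon

# A target about a mile away returns its reflection in roughly
# 10.7 microseconds:
distance = tof_distance_m(10.73e-6)  # about 1608 m
```

The point of the sketch is just how little data is needed: one timestamp, one bearing, one GPS fix, and the system can plot a hidden person on a shared map.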
This is definitely the sort of thing no one is going to share with us via the ASX or any other platform. I would say it fits that category of things Rob Telson said they were doing with NASA but are not allowed to talk about.
My opinion only DYOR
FF
AKIDA BALLISTA