BRN - NASA

Event-Based Sensing and Navigation Technologies (EBSNT)​

Award Information
Agency:National Aeronautics and Space Administration
Branch:N/A
Contract:80NSSC23PB623
Agency Tracking Number:232282
Amount:$149,634.00
Phase:Phase I
Program:SBIR
Solicitation Topic Code:H6
Solicitation Number:SBIR_23_P1
Timeline
Solicitation Year:2023
Award Year:2023
Award Start Date (Proposal Award Date):2023-08-01
Award End Date (Contract End Date):2024-02-02
Small Business Information
VISSIDUS TECHNOLOGIES INC
590 Lipoa Pkwy, Ste 224
Kihei, HI 96753-6911
United States
DUNS:079345244
HUBZone Owned:Yes
Woman Owned:Yes
Socially and Economically Disadvantaged:No
Principal Investigator
Name: Bogdan Udrea
Phone: (206) 227-8075
Email: bogdan.udrea@vissidus.com
Business Contact
Name: Rachel Campbell
Phone: (386) 386-8682
Email: rachel.campbell@vissidus.com
Research Institution
N/A
Abstract
The Event-Based Sensing and Navigation Technologies (EBSNT) project aims to develop, integrate, and test a perception and planning avionics suite and associated on-board software in response to NASA's Subtopic "H6.22 Deep Neural Net and Neuromorphic Processors for In-Space Autonomy and Cognition" within the scope of "Neuromorphic Software for Cognition and Learning for Space Missions." EBSNT forms the foundation of a Relative Pose Determination and Control Subsystem (RPDCS) integrated on board a Servicer Space Vehicle (SSV) that performs Rendezvous, Proximity Operations, and Docking (RPOD) with a Client SV (CSV). The RPDCS uses a hybrid architecture
of event-based sensors, interconnected Spiking Neural Networks (SNNs) and traditional sensors, actuators, and von Neumann computers that enables the SSV to autonomously i) determine the relative pose and pose rate, i.e., relative translation and rotation velocities, between the SSV and a goal feature on the CSV; ii) plan a relative trajectory that places the SSV at a commanded relative pose with respect to the specified feature of the CSV; iii) perform translation and attitude maneuvers to acquire and maintain the commanded relative pose; and iv) monitor the health status of the SSV and take appropriate action to ensure mission safety in case of SSV component and subsystem failure.
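For readers unfamiliar with item i), here is a minimal sketch of what "relative pose and pose rate" between a servicer and a client feature frame looks like numerically. This is not the EBSNT/RPDCS implementation; it uses SciPy, and all quaternion, position, and timing values are hypothetical.

```python
# Minimal illustration of relative pose and pose rate between a servicer (SSV)
# and a client feature frame (CSV). NOT the EBSNT/RPDCS implementation.
# Poses are expressed in a common inertial frame; all values are hypothetical.
import numpy as np
from scipy.spatial.transform import Rotation as R

# Hypothetical attitudes (quaternions, x-y-z-w) and positions [m].
q_ssv, p_ssv = R.from_quat([0.0, 0.0, 0.0, 1.0]), np.array([0.0, 0.0, 0.0])
q_csv, p_csv = R.from_quat([0.0, 0.0, 0.3827, 0.9239]), np.array([10.0, 2.0, 0.5])

# Relative pose of the client feature as seen from the servicer body frame.
rel_rot = q_ssv.inv() * q_csv                # relative attitude
rel_pos = q_ssv.inv().apply(p_csv - p_ssv)   # relative position [m]

# Pose rate: finite difference of two successive relative attitudes over dt.
dt = 0.1                                     # [s] between measurements (hypothetical)
rel_rot_next = rel_rot * R.from_euler("z", 0.5 * dt)
omega = (rel_rot.inv() * rel_rot_next).as_rotvec() / dt  # rad/s, body frame

print(rel_pos, omega)
```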
 

Non-Volatile Optical Memory for Zero Power Neuromorphic Computation from GenXComm, Inc.​

Award Information
Agency:National Aeronautics and Space Administration
Branch:N/A
Contract:80NSSC23PB435
Agency Tracking Number:232655
Amount:$149,688.00
Phase:Phase I
Program:SBIR
Solicitation Topic Code:H6
Solicitation Number:SBIR_23_P1
Timeline
Solicitation Year:2023
Award Year:2023
Award Start Date (Proposal Award Date):2023-08-03
Award End Date (Contract End Date):2024-02-02
Small Business Information
GENXCOMM INC
10000 Metric Blvd, Suite #200
Austin, TX 78758-5208
United States
DUNS:080275934
HUBZone Owned:No
Woman Owned:No
Socially and Economically Disadvantaged:No
Principal Investigator
Name: Taran Huffman
Phone: (310) 625-4093
Email: taran.huffman@gxc.io
Business Contact
Name: Tina Trimble
Title: Trimblet
Phone: (512) 554-7601
Email: tina.trimble@gxc.io
Research Institution
N/A
Abstract
Long-term space missions can greatly benefit from neuromorphic processors enabling in-situ learning in the extreme environments of space. Photonic Tensor Cores provide extremely energy-efficient and robust hardware for the computations required for neuromorphic processing. However, current technologies require significant expenditure of energy, through resistive heaters or current-injection diodes, to maintain the state of the neural network for inference. Here we describe a photonic accelerator technology that draws on lessons from FLASH memory devices to enable non-volatile, zero-energy state retention for neuromorphic processors, enabling the possibility of reaching an inference efficiency of 1 femtojoule per operation, a three-order-of-magnitude improvement over today's GPUs.
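As a quick sanity check on the claimed figure (reader arithmetic, not from the abstract): a three-order-of-magnitude improvement over 1 fJ/op implies a baseline of roughly 1 pJ/op, which is the commonly assumed order of magnitude for current GPU-class accelerators.

```python
# Back-of-envelope check of the efficiency claim.
# The 1 pJ/op GPU figure is an assumed order of magnitude, not from the abstract.
target_energy_per_op = 1e-15   # 1 femtojoule per operation (claimed)
gpu_energy_per_op = 1e-12      # ~1 picojoule per operation (assumed baseline)

improvement = gpu_energy_per_op / target_energy_per_op
print(f"Improvement factor: {improvement:.0f}x")  # -> 1000x, i.e. 3 orders of magnitude
```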
 
I like this one


Edge AI Platform for Space and Unmanned Aerial Imagery Intelligence​

Award Information
Agency:Department of Defense
Branch:Air Force
Contract:FA8649-23-P-0196
Agency Tracking Number:FX224-OCSO1-1566
Amount:$74,941.00
Phase:Phase I
Program:SBIR
Solicitation Topic Code:X224-OCSO1
Solicitation Number:X22.4
Timeline
Solicitation Year:2022
Award Year:2023
Award Start Date (Proposal Award Date):2022-11-01
Award End Date (Contract End Date):2023-02-04
Small Business Information
Misram LLC
1 Paddington Lane
Holmdel, NJ 07733-1111
United States
DUNS:078446162
HUBZone Owned:No
Woman Owned:No
Socially and Economically Disadvantaged:Yes
Principal Investigator
Name: Vidya Sagar
Phone: (201) 912-9568
Email: vidya@spectronn.com
Business Contact
Name: Rajarathnam Chandramouli
Phone: (847) 345-4731
Email: mouli@spectronn.com
Research Institution
N/A
Abstract
Global persistent awareness is critical for the Air Force. Warfare is becoming increasingly complex due to congested and contested operational environments, maneuvering targets, and rapidly changing threats. The Air Force seeks on-demand awareness of adversary actions anywhere on the globe by securely gathering, processing, and fusing multiple types of trusted data, including satellite and aerial imagery. The Joint and Allied Force networks will share processed intelligence to rapidly identify threats and targeted intelligence. Executing AI algorithms for satellite and aerial imagery fusion solely on terrestrial cloud infrastructure will incur severe latency due to limited data rate between a satellite and ground station, wireless link packet losses, delays due to revisit times, etc. Therefore, we propose to extend our commercial distributed edge AI sensor fusion software stack to satisfy the needs of Air Force customers. A 1000X latency reduction and 5X improvement in intelligence inference accuracy will be demonstrated.
 

WEB WEASELS: Autonomous Swarming AI Munitions​

Award Information
Agency:Department of Defense
Branch:Air Force
Contract:FA8658-22-C-B006
Agency Tracking Number:F2D-4696
Amount:$1,833,687.00
Phase:Phase II
Program:SBIR
Solicitation Topic Code:AF22Z-PDCSO2
Solicitation Number:22.4
Timeline
Solicitation Year:2022
Award Year:2022
Award Start Date (Proposal Award Date):2022-07-27
Award End Date (Contract End Date):2023-07-31
Small Business Information
Unmanned Experts Inc.
720 Sth Colorado Blvd, Penthouse Nth
DENVER, CO 80246-1111
United States
DUNS:965164721
HUBZone Owned:No
Woman Owned:No
Socially and Economically Disadvantaged:No
Principal Investigator
Name: Keven Gambold
Phone: (334) 717-0031
Email: kagambold@unmannedexperts.com
Business Contact
Name: Keven Gambold
Phone: (334) 717-0031
Email: kagambold@unmannedexperts.com
Research Institution
N/A
Abstract
Web Weasels (WW) is a multi-pronged RDT&E program designed to meet one of the stated goals of the USAF's 2030 Science & Technology Strategy document: "Overwhelm adversaries with complexity, unpredictability, and numbers through a collaborative and autonomous network of systems and effects." The near-peer conflict environment is far deadlier to 4th Gen air systems, and even 5th Gen can be challenged through cyber and EW methods. The next DoD 'Offset' calls for smart, autonomous, and connected munitions to prepare, seize, and maintain air superiority over the Battlespace. WW builds on a unique swarm planning and deployment capability (Air Commons® - Swarm, or AC-S) by applying cutting-edge AI/ML training techniques and edge computing. WW redefines a 'smart bomb' to be a highly trained, thinking, reactive, and team-playing asset in the fight.

In addition, WW has been built to address the AFRL 2022 SBIR requirements: "The Armament Directorate is pursuing ideas that permit Blue Forces to command various collaborative weapons systems and coordinated tactics to ensure success. A dynamic battlespace requires automated, adaptive weapons systems and cooperative tactics… Artificial intelligence algorithms with "dialable" human influence…" Squadrons of autonomous collaborative munitions operating at the far edge of the C2 infrastructure need the training, tactics, techniques, and procedures (TTPs) to handle the speed-of-datalink environment of modern combat. Teamwork, communication, shared mental models, and a robust set of tried and tested strategies are required to survive and to dominate.

The parent program, AC-S, allowed commanders to plan, task, and manage multiple swarming assets through a 'Swarm ATO' and a Swarm Engine. WW is designed to overlay AI/ML training algorithms onto AC-S to provide pre-launch munitions with a series of TTPs in a 'Playbook' for a given mission set (e.g., SEAD). The munitions will launch with AI-on-the-Edge hardware, firmware, and comms architecture (OXYGEN and DROIDISH™), which allows them to collaboratively assess the combat environment, vote on the most suitable Play, call an 'Audible' change, and execute the improved plan. WW uses three teams: one building and training AI/ML Playbook algorithms; one focused on AI-on-the-Edge, voting, and Audibles; and a third testing Human-Swarm Interface (HSwI) configurations. WW will use AirSim to run trial iterations and Mission Rehearsals, and then fly suitably equipped small UAS (emulating autonomous stand-off munitions) against Liteye Systems' SHIELD Counter-UAS systems (emulating an Enemy Force integrated radar threat) in the time-honored mission sets of the original USAF Wild Weasels: "First In, Last Out".
 

Artificial Intelligence Driven Voice Control at the Edge (ADViCE)​

Award Information
Agency:Department of Defense
Branch:Special Operations Command
Contract:H92405239P003
Agency Tracking Number:S2D-0475
Amount:$1,224,548.00
Phase:Phase II
Program:SBIR
Solicitation Topic Code:SOCOM224-D005
Solicitation Number:22.4
Timeline
Solicitation Year:2022
Award Year:2023
Award Start Date (Proposal Award Date):2023-03-01
Award End Date (Contract End Date):2024-09-03
Small Business Information
Morsecorp, Inc
101 Main Street FL 14
Cambridge, MA 02142-1111
United States
DUNS:079575595
HUBZone Owned:No
Woman Owned:No
Socially and Economically Disadvantaged:No
Principal Investigator
Name: Nigel Mathes
Phone: (857) 999-0144
Email: sbir@socom.mil
Business Contact
Name: Sanjay Patel
Phone: (617) 320-0328
Email: sanjaypatel@morsecorp.com
Research Institution
N/A
Abstract
Hands-Off Tactical Mobile Intelligent Communications (HOT MIC) is an edge-deployed AI solution for voice command and control (VC2) of sUAS and other potential unmanned systems. Today's SOCOM operator needs to easily enlist new capabilities into the fight, such as small Unmanned Air Systems (sUAS), while reducing task saturation and keeping focus down range. An intuitive, robust, hands-free voice user interface (VUI) for VC2 of sUAS, effective in combat scenarios and deployable to small form and fit mobile edge devices, supports this operational challenge. HOT MIC leverages state-of-the-art AI/ML components architected for an Android smartphone, with an initial deployment target of the Samsung Galaxy S20. This common operating device is carried by today's operators and ensures interoperability with sUAS by including all required software on a single platform. HOT MIC does not require custom processors or hardware, and its modular architecture supports easy integration with multiple military radios, state-of-the-art NLP solutions, ATAK plug-ins, various types of PixHawk/MAVLink-enabled UAS, and deployment to any ATAK-capable device, if required. The result is a scalable, reliable, reusable, hardware-agnostic, and secure voice interface for sUAS C2 that can be extended for future unmanned robotic mission scenarios, devices, and other hands-free CONOPS.
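To make the "PixHawk/MAVLink-enabled" piece concrete, here is a minimal, hypothetical sketch of dispatching a transcribed voice command to a MAVLink autopilot using pymavlink. The command vocabulary, connection string, and altitude are illustrative assumptions; this is not MORSE's HOT MIC implementation.

```python
# Hypothetical voice-command -> MAVLink dispatch sketch (not the HOT MIC code).
# Assumes a MAVLink-enabled autopilot (e.g., PX4/ArduPilot) reachable over UDP.
from pymavlink import mavutil

master = mavutil.mavlink_connection("udpin:0.0.0.0:14550")  # connection string is illustrative
master.wait_heartbeat()

def dispatch(voice_command: str, altitude_m: float = 30.0) -> None:
    """Map a transcribed voice command to a MAVLink COMMAND_LONG message."""
    if voice_command == "takeoff":
        master.mav.command_long_send(
            master.target_system, master.target_component,
            mavutil.mavlink.MAV_CMD_NAV_TAKEOFF, 0,
            0, 0, 0, 0, 0, 0, altitude_m)
    elif voice_command == "return to launch":
        master.mav.command_long_send(
            master.target_system, master.target_component,
            mavutil.mavlink.MAV_CMD_NAV_RETURN_TO_LAUNCH, 0,
            0, 0, 0, 0, 0, 0, 0)

dispatch("takeoff")
```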
 

Autonomous Vehicle Intelligent Communication System (AVICS) for Air Traffic Control (ATC)​

Award Information
Agency:Department of Defense
Branch:Navy
Contract:N68335-23-C-0427
Agency Tracking Number:N231-025-0910
Amount:$139,990.00
Phase:Phase I
Program:SBIR
Solicitation Topic Code:N231-025
Solicitation Number:23.1
Timeline
Solicitation Year:2023
Award Year:2023
Award Start Date (Proposal Award Date):2023-06-05
Award End Date (Contract End Date):2023-12-11
Small Business Information
OPTO-KNOWLEDGE SYSTEMS INC
19805 Hamilton Ave
Torrance, CA 90502-1341
United States
DUNS:625511050
HUBZone Owned:No
Woman Owned:No
Socially and Economically Disadvantaged:No
Principal Investigator
Name: Tim Caber
Phone: (310) 756-0520
Email: tim.caber@oksi.ai
Business Contact
Name: Marco Romani
Phone: (415) 412-9203
Email: marco.romani@optoknowledge.com
Research Institution
N/A
Abstract
Airports are critical infrastructure that require constant maintenance to ensure safe and efficient operations. One of the major challenges that airport authorities face is the removal of foreign object debris (FOD) from runways and taxiways. FOD can cause significant damage to aircraft and pose a safety risk to passengers and crew. Therefore, it is crucial to have a reliable and efficient method for FOD removal. To address this challenge, the use of robots for FOD removal has become increasingly popular in recent years. However, to operate these robots effectively, they must be able to receive and send voice commands over air traffic control (ATC) radio frequencies. Historically, high-accuracy voice transcription under noisy transmission conditions has not been possible without large language models running in a cloud environment. However, in the past few months, the latest edge AI hardware along with state-of-the-art speech recognition models have made noisy radio speech transcription possible on edge AI processors. To address this technology gap, OKSI proposes the Autonomous Vehicle Intelligent Communication System (AVICS) for Air Traffic Control (ATC) to solve the automated FOD removal problem.
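As an illustration of the kind of local speech transcription the abstract refers to (not OKSI's actual AVICS pipeline), a small open-source speech model can be run entirely on-device against a recorded radio clip; the audio filename below is hypothetical and noisy-ATC accuracy will vary.

```python
# Illustrative local speech-to-text on a recorded ATC radio clip.
# Not OKSI's AVICS pipeline; "atc_clip.wav" is a hypothetical file.
import whisper  # pip install openai-whisper

model = whisper.load_model("base.en")      # small model suitable for edge-class hardware
result = model.transcribe("atc_clip.wav")  # transcription quality depends on radio noise
print(result["text"])
```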
 

This one ain’t bad either​

An Energy-efficient and Self-diagnostic Portable Edge-Computing Platform for Traffic Monitoring and Safety​

Award Information
Agency:Department of Transportation
Branch:N/A
Contract:6913G623P800056
Agency Tracking Number:DOT-23-FH2-015
Amount:$149,995.68
Phase:Phase I
Program:SBIR
Solicitation Topic Code:23-FH2
Solicitation Number:6913G623QSBIR1
Timeline
Solicitation Year:2023
Award Year:2023
Award Start Date (Proposal Award Date):2023-07-13
Award End Date (Contract End Date):2024-01-12
Small Business Information
CLR ANALYTICS INC
52 Gardenhouse Way
Irvine, CA 92620
United States
DUNS:N/A
HUBZone Owned:No
Woman Owned:No
Socially and Economically Disadvantaged:Yes
Principal Investigator
Name: Lianyu Chu
Phone: (949) 864-6696
Email: lchu@clr-analytics.com
Business Contact
Name: Lianyu Chu
Title: Lianyu Chu
Phone: (949) 864-6696
Email: lchu@clr-analytics.com
Research Institution
N/A
Abstract
Recent advances in technologies have shown great potential for widespread use of Artificial Intelligence (AI) techniques in real-time Intelligent Transportation Systems (ITS) applications. However, the massive amounts of data collected and generated from ITS sensors pose a major challenge in data processing and transmission. This requires a shift from centralized repositories and cloud computing to edge computing. This project proposes an integrated low-power edge-computing system to work with computation-intensive traffic sensors (e.g., video, high-resolution radar, and Lidar) and weather sensors. The system will be designed to be portable, have self-diagnostic capabilities through monitoring sensors and system operations, and send out alerts and data when necessary. The proposed system will include an edge server, which will be developed based on a System-on-Module (SoM) using the latest AI chip, and an innovative hybrid camera that integrates a regular video camera and a FLIR thermal image camera. The project will identify and implement in-situ information processing and extraction algorithms based on machine learning and deep learning techniques to classify vehicles and detect events such as vehicle crashes, the presence of stopped vehicles, pavement and environmental conditions, and wildlife. The prototype will be demonstrated at a California test site in collaboration with Caltrans.
 

Monolithic SDR SoC for SATCOM​

Agency:
Department of Defense
Branch:
Defense Microelectronics Activity
Program | Phase | Year:
STTR | Phase I | 2024
Solicitation:
24.A
Topic Number:
DMEA24A-001
NOTE: The Solicitations and topics listed on this site are copies from the various SBIR agency solicitations and are not necessarily the latest and most up-to-date. For this reason, you should use the agency link listed below which will take you directly to the appropriate agency server where you can read the official version of this solicitation and download the appropriate forms and rules.
The official link for this solicitation is: https://www.defensesbirsttr.mil/SBIR-STTR/Opportunities/
Release Date:
November 29, 2023
Open Date:
January 03, 2024
Application Due Date:
February 21, 2024
Close Date:
February 21, 2024
Description:
TECHNOLOGY AREAS:
Electronics | Space Platforms

MODERNIZATION PRIORITIES:
Microelectronics | Space Technology

OBJECTIVE:
Development of a monolithic Radio Frequency (RF) System-on-Chip (SoC) Software Defined Radio (SDR) Integrated Circuit (IC) transceiver for satellite communications (SATCOM) meeting specific defense needs and supportive of global navigation satellite system (GNSS), internet-of-things (IoT) edge-computing, and artificial intelligence (AI) processing technologies and applications.

ITAR:
The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 3.5 of the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

DESCRIPTION:
There are few commercially available monolithic SDR-SoCs on the market [1][2], and those that exist, though highly capable, can be cost prohibitive and are not optimized for low size, weight, and power (SWaP) DoD applications, such as small handheld devices where ultra-low power and reduced form factor are major design considerations. Furthermore, a review of prior awarded SBIR/STTR topics demonstrates that defense-specific needs for low-SWaP SDR devices have routinely been met through a modular approach of integrating commercial-off-the-shelf (COTS) devices [3]. However, further SWaP optimization can only be achieved through higher levels of SoC integration, with efficient architecture execution in supportive modern semiconductor technologies that include sufficient intellectual property (IP) offerings.

Though SDR construction varies, the typical SDR architecture consists of an RF frontend, a field-programmable gate array (FPGA) for baseband signal processing, and a microprocessor or microcontroller for SDR control and power management. While continued advances in analog-to-digital converter (ADC) and digital-to-analog converter (DAC) designs have fueled RF SoC frontend innovations, such as direct sampling of RF signals [4][5], and with processor core IP more commonly available in modern technology nodes, FPGA fabrics may not be the most effective option for on-chip baseband signal processing. While FPGAs are capable of meeting the high performance and reconfigurability requirements of SDRs, this comes at the expense of area and power dissipation due to their inherent structure. Furthermore, the power-up and reconfiguration latencies of FPGAs can be an issue in applications where wake-up time and agile tuning are required. To address these issues, and taking advantage of advances in semiconductor scaling, application-specific baseband processing, typically accomplished by the FPGA, may be more efficiently accomplished through on-chip digital signal processing (DSP) techniques.

PHASE I:
The purpose of the Phase I effort is to: determine the feasibility (cost, schedule, and performance) of developing the SDR-SoC; identify technological issues (availability of IP, etc.) to be addressed through innovations; and develop a Phase II proposal for development activities toward the realization of the SDR-SoC from design to qualification, with the goal of meeting the following performance specifications:

General:
a. Off-state leakage current: < 10 µA
b. Sleep conditions must be specified along with projected sleep current.
c. Active power management with ability to shutdown/sleep system components.
d. Peak Power Dissipation: < 1 W
e. Chip area: 5mm x 5mm

RF Frontend:
a. Transceiver tuning range: 100 MHz to 6 GHz
b. At least 1 Tx and 1 Rx channel
c. 40 MHz bandwidth
d. Anti-aliasing filters
e. 14-bit DAC and 14-bit ADC, or better

DSP Engine:
a. Programmable floating-point DSP module, 64 MACs running at 500 MHz or better.

Processor Core:
a. ARM Cortex-A9 or better, or a RISC-V core, with support for a bootable Linux OS, external storage, UART, USB, I2C, SPI, and network interfaces.
Furthermore, due to national security considerations, use of the Trusted Foundry Program for foundry access (GLOBALFOUNDRIES) is preferred, but not required, when considering feasibility options. The Phase I feasibility study should also include programming software for the DSP module and front end, and the supported Linux distribution(s).
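For a rough sense of the DSP target above (reader arithmetic, not part of the solicitation): 64 MAC units at 500 MHz give a peak of 32 GMAC/s, which sets an upper bound on energy per MAC within the stated <1 W peak-power budget.

```python
# Back-of-envelope peak throughput for the Phase I DSP target (reader arithmetic).
mac_units = 64
clock_hz = 500e6                                       # 500 MHz
peak_macs_per_s = mac_units * clock_hz
print(f"Peak: {peak_macs_per_s / 1e9:.0f} GMAC/s")     # -> 32 GMAC/s

# Upper bound on energy per MAC if the entire <1 W budget went to the DSP.
peak_power_w = 1.0
print(f"Energy/MAC upper bound: {peak_power_w / peak_macs_per_s * 1e12:.0f} pJ")  # ~31 pJ
```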

PHASE II:
The purpose of the Phase II effort is to: execute on the Phase I proposed developmental activities and innovations needed to advance the SDR-SoC concept; demonstrate working silicon; and develop a Phase III product commercialization plan, including potential non-DoD customers. Phase II outcomes will result in the design for fabrication, fabrication, package, assembly, test, qualification, and delivery of functional prototypes, including supporting design development and user-required software, data, and documentation, of either the fully functional monolithic SDR-SoC or necessary IP block(s) required to advance the monolithic SDR-SoC concept.

PHASE III DUAL USE APPLICATIONS:
The objective of Phase III effort is to pursue commercialization objectives resulting from the Phase II developments. Other Phase III activities may include follow-on non-SBIR/STTR funded R&D (ATSP, OTA) or production contracts for developed products intended for use by the DoD.
 

SBIR Phase I: Artificial Intelligence (AI)-Aided Part Identification for Coordinate Measuring Machines

Award Information
Agency:National Science Foundation
Branch:N/A
Contract:2222967
Agency Tracking Number:2222967
Amount:$274,536.00
Phase:Phase I
Program:SBIR
Solicitation Topic Code:M
Solicitation Number:NSF 22-551
Timeline
Solicitation Year:2022
Award Year:2023
Award Start Date (Proposal Award Date):2023-01-15
Award End Date (Contract End Date):2023-10-31
Small Business Information
OPTIC FRINGE CORP.
8 Cobblestone Way
Billerica, MA 01862
United States
DUNS:N/A
HUBZone Owned:No
Woman Owned:No
Socially and Economically Disadvantaged:No
Principal Investigator
Name: BING ZHAO
Phone: (857) 636-0962
Email: bing.zhao2016@gmail.com
Business Contact
Name: BING ZHAO
Phone: (857) 636-0962
Email: bing.zhao2016@gmail.com
Research Institution
N/A
Abstract
The broader impact of this Small Business Innovation Research (SBIR) Phase I project is the development of a new generation of smart machines used in the measurement of parts and assemblies. The team has demonstrated that this technology can convert existing coordinate measuring machines into self-driving autonomous machines. The ability to automatically measure parts is an important feedback link in the process chain that will enable fully automated manufacturing of the future. Specifically, this automation will reduce the specialized skill required to use a Coordinate Measuring Machine (CMM). The innovation will enable workers to operate a CMM and get a precise part measurement. This is especially helpful as the skilled manufacturing/metrology workforce retires, since it gives new employees the ability to provide accurate information with little or no training. The innovation also gives manufacturing companies the option of buying a new machine or upgrading their existing coordinate measuring machines. While the focus of this proposal is part identification, the technology has ready applications in Computer Numerical Control (CNC) machining, robotics, and automated assembly lines. This capability will make the US manufacturing sector stronger and more technologically advanced.

The objective of this proposal is to develop a new technology to identify machined parts and assemblies. This technology will be implemented on coordinate measuring machines (CMMs), which are used widely in the manufacturing sector to measure the shape and size of parts. The proposed technology will enable autonomous measurements of parts, allowing a higher level of automation. In this identification technology, the team will use live images from a camera, multiple solid model/Computer Aided Design (CAD)-generated images, and advanced image processing. Applying Artificial Intelligence (AI)/Machine Learning (ML) to the processing of part images will ensure correct part identification. Correct identification of parts as seen by the camera is the remaining unsolved challenge to achieving self-driven automatic measurement of parts. Most machined parts are textureless, and most of the information is contained in the edges. Current image processing techniques work well with texture-rich parts but are unreliable with textureless machined parts. AI/ML-enhanced image processing using edge and shape information is a promising approach; solving this problem will lead to a new generation of CMMs that can measure parts automatically.
This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
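The "edge and shape information" idea in the abstract above can be illustrated with a minimal classical-vision sketch: extract edges from the live camera image and from a CAD-rendered image, then compare the dominant contour shapes. The filenames are hypothetical, and this is a plain OpenCV illustration, not the proposer's AI/ML method.

```python
# Minimal illustration of edge/shape-based part matching (not Optic Fringe's method).
# "camera_view.png" and "cad_render.png" are hypothetical image files.
import cv2

def main_contour(path: str):
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    edges = cv2.Canny(img, 50, 150)                    # edge map of the textureless part
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)          # largest contour as the part outline

live = main_contour("camera_view.png")
cad = main_contour("cad_render.png")

# Lower score = more similar shapes (Hu-moment based comparison).
score = cv2.matchShapes(live, cad, cv2.CONTOURS_MATCH_I1, 0.0)
print(f"Shape dissimilarity: {score:.4f}")
```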
 

Novel AI Hardware Technologies (e.g.: Neuromorphic Computing, High-performance Technologies for AI, Smart and Secure Edge Devices, etc.)​

Agency:
National Science Foundation
Branch:
N/A
Program | Phase | Year:
BOTH | Phase II | 2023
Solicitation:
NSF 23-516
Topic Number:
AI5
NOTE: The Solicitations and topics listed on this site are copies from the various SBIR agency solicitations and are not necessarily the latest and most up-to-date. For this reason, you should use the agency link listed below which will take you directly to the appropriate agency server where you can read the official version of this solicitation and download the appropriate forms and rules.
The official link for this solicitation is: https://www.nsf.gov/pubs/2023/nsf23516/nsf23516.htm
Release Date:
November 22, 2022
Open Date:
November 22, 2022
Application Due Date:
March 01, 2023
July 05, 2023
March 04, 2024
Close Date:
March 04, 2024
Description:
NA
 
Not had a look in a long time using this search feature for NASA, and here are the current active and completed neuromorphic projects

You need to type Neuromorphic in the search bar

 

hotty4040

Regular
Not had a look in a long time using this search feature for NASA, and here are the current active and completed neuromorphic projects

You need to type Neuromorphic in the search bar

What a great read (TechPort) I've had so far; will continue later. I'm sure Akida (BRAINCHIP) will be able to assist in many of these mentioned technology requirements moving forward.

Thanks for this post, Pom, an absolute winner.

Akida Ballista >>>>> NASA - Neuromorphic - BRAINCHIP, what a nice blend IMO <<<<<

hotty...
 