BRN Discussion Ongoing

Reactions: 20 users

cosors

👀
Reactions: 2 users

Boab

I wish I could paint like Vincent
Reactions: 44 users

charles2

Regular
For anyone interested, here is another article about the KAIST neuromorphic chip breakthrough and how they're comparing it to an Nvidia chip:

God forbid Brainchip make a news release like this...emphasizing patents. Ours!!

Might generate the feared speeding ticket. Might plead down to a citation for jaywalking after serious negotiation.

Collateral damage: The share price would quadruple.

Oh the horrors. Oh the horrors.
 
Reactions: 20 users

Evermont

Stealth Mode
Thanks for sharing @Pmel
Surely this is huge news. Here is the CEO of a company spruiking the benefits of Akida's ability to help with amazingly advanced radar and RF applications.
They also work with the DoD. Just saying.

Agree @Boab this is a huge endorsement for BrainChip. Nice pick-up @Pmel

Surprised there was not a "this release has been approved by the DoD" attached at the end.
 
Reactions: 25 users

Boab

I wish I could paint like Vincent
Not sure if you've read the white paper that was produced calling for benchmarks in edge AI inference.
I know it's not up in the bright lights, but it does compare us to Nvidia and Google Coral.
Cheers
 
Reactions: 15 users

IloveLamp

Top 20
[Attached image: 1000014014.jpg]




AND WHEN YOU OPEN THE LINK THIS IS THE FIRST THING YOU SEE........



[Attached image: 1000014018.jpg]
 
Reactions: 71 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hi GS
Not sure if it has been disclosed before, though some of it seems familiar, but it's well worth putting up the extract. Great find either way, generously shared:

“MENTAT

Award Information
Agency: Department of Defense
Branch: Navy
Contract: N68335-22-C-0158
Agency Tracking Number: N202-099-1097
Amount: $1,107,499.00
Phase: Phase II
Program: SBIR
Solicitation Topic Code: N202-099
Solicitation Number: 20.2
Timeline
Solicitation Year: 2020
Award Year: 2022
Award Start Date (Proposal Award Date): 2021-12-16
Award End Date (Contract End Date): 2024-02-24
Small Business Information
BLUE RIDGE ENVISIONEERING, INC.
5180 Parkstone Dr. Suite 200
Chantilly, VA 20151-1111
United States
DUNS: 616396953
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
Principal Investigator
Name: Andrew Krause
Phone: (571) 349-0900
Email: andrew@br-envision.com
Business Contact
Name: Edward Zimmer
Phone: (703) 927-0450
Email: ned@br-envision.com
Research Institution
N/A
Abstract
Deep Neural Networks (DNN) have become a critical component of tactical applications, assisting the warfighter in interpreting and making decisions from vast and disparate sources of data. Whether image, signal or text data, remotely sensed or scraped from the web, cooperatively collected or intercepted, DNNs are the go-to tool for rapid processing of this information to extract relevant features and enable the automated execution of downstream applications. Deployment of DNNs in data centers, ground stations and other locations with extensive power infrastructure has become commonplace but at the edge, where the tactical user operates, is very difficult. Secure, reliable, high bandwidth communications are a constrained resource for tactical applications which limits the ability to route data collected at the edge back to a centralized processing location. Data must therefore be processed in real-time at the point of ingest which has its own challenges as almost all DNNs are developed to run on power hungry GPUs at wattages exceeding the practical capacity of solar power sources typically available at the edge. So what then is the future of advanced AI for the tactical end user where power and communications are in limited supply? Neuromorphic processors may provide the answer. Blue Ridge Envisioneering, Inc. (BRE) proposes the development of a systematic and methodical approach to deploying Deep Neural Network (DNN) architectures on neuromorphic hardware and evaluating their performance relative to a traditional GPU-based deployment. BRE will develop and document a process for benchmarking a DNN’s performance on a standard GPU, converting it to run on commercially available neuromorphic hardware, training and evaluating model accuracy for a range of available bit quantizations, characterizing the trade between power consumption and the various bit quantizations, and characterizing the trade between throughput/latency and the various bit quantizations.
This process will be demonstrated on a Deep Convolutional Neural Network trained to classify Electronic Warfare (EW) emitters in data collected by AFRL in 2011. The BrainChip Akida Event Domain Neural Processor development environment will be utilized for demonstration as it provides a simulated execution environment for running converted models under the discrete, low quantization constraints of neuromorphic hardware. In the option effort we pursue direct Spiking Neural Network (SNN) implementation and compare performance on the Akida hardware, and potentially other vendors’ hardware as well. We demonstrate the capability operating on real hardware in a relevant environment by conducting a data collection and demonstration activity at a U.S. test range with relevant EW emitters”

My opinion only DYOR
Fact Finder
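The trade study the abstract describes, quantizing a trained network to fewer bits and measuring what that costs in accuracy, can be illustrated with a toy uniform quantizer. This is a generic sketch on invented stand-in weights, not BRE's methodology or BrainChip's actual tooling; the `quantize` helper is hypothetical:

```python
import math

def quantize(values, bits):
    """Uniformly quantize floats to 2**bits discrete levels over their range."""
    levels = 2 ** bits
    lo, hi = min(values), max(values)
    step = (hi - lo) / (levels - 1)
    return [lo + round((v - lo) / step) * step for v in values]

def mean_abs_error(a, b):
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

# Sweep the bit widths the way the abstract describes: lower precision saves
# power on neuromorphic hardware, but quantization error grows as bits shrink.
weights = [math.sin(i * 0.1) for i in range(200)]  # stand-in for trained weights
errors = {bits: mean_abs_error(weights, quantize(weights, bits))
          for bits in (8, 4, 2, 1)}
```

A full trade study would pair each bit width with power and throughput/latency measurements on the target hardware, which is the other half of the process BRE outlines.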

Thanks to @Fact Finder and @GStocks123 for sharing the above information.

I don't think the article below has been posted previously. As per the above, it discusses the use of AI to detect and identify emitters and to select the proper technique to counter the threat. Cognitive EW systems with real-time learning cannot be distracted like human operators.



A cryptologic technician monitors electronic warfare (EW) sensors on board the USS Makin Island (LHD-8). Advances in artificial intelligence and machine learning can enable cognitive EW systems—real-time learning and thinking systems that cannot be distracted like human operators.

U.S. NAVY (DOMINIC DELAHUNT)

Implement AI in Electromagnetic Spectrum Operations


By Lieutenant Commander Brian P. Gannon, U.S. Navy
August 2023

Proceedings

Vol. 149/8/1,446


Artificial intelligence/machine learning (AI/ML) is at the forefront of warfare modernization efforts within the Department of Defense (DoD). AI/ML is a critical force multiplier in electronic warfare (EW) and can be an extremely effective tool when applied to areas such as signal recognition, emissions and signal control, emitter classification, threat recognition, and jamming identification.
However, after long conflicts in permissive electromagnetic environments, the United States has lost the competitive edge in EW and electromagnetic spectrum management. The Electromagnetic Spectrum Superiority Strategy (ESSS), published in October 2020, addresses the many challenges faced in ensuring and maintaining access to, using, and maneuvering in the electromagnetic spectrum. Moving forward, the United States must regain and maintain electromagnetic superiority to win a conflict against strategic competitors, such as China or Russia.
The ESSS identifies objectives to align electromagnetic spectrum capabilities and resources with national strategic policy and objectives. However, it is up to information warfare leaders to discover new ways to use emerging technologies to develop warfighting capabilities in the electromagnetic spectrum—mainly, through continually evolving AI/ML systems that will enable electromagnetic spectrum superiority in all warfare domains. From tactical situational awareness and management, threat recognition and classification, and emission- and signature-control tactics, to over-the-horizon targeting and nonkinetic fires using nonorganic EW capabilities, AI/ML will be an enormous advantage in force lethality.

AI/ML Integration


Current EW systems and tactics cannot quickly adapt to new emerging or dynamically advanced threats because of an overreliance on database libraries with predefined countermeasures.1 Human cognition limits military warfighting capability. AI/ML can analyze tasks quicker and more efficiently than human operators, acting as a force multiplier. Electromagnetic spectrum superiority has a direct effect on the information domain in combat. As such, AI/ML algorithmic data processing has become imperative in all areas of military operations, including in EW.
For EW receivers designed to detect and track adversary radar systems, rapid advances in radar technology using multimode systems have made identifying and processing traditional radar much more difficult. With AI/ML, radar identification and classification can allow operators more decision time to determine the best course of action and to counter adversary weapon systems.2

A significant subset of EW is electronic protection—actions taken to counter an adversary’s attempt to deny, degrade, deceive, or destroy friendly forces’ use of the electromagnetic spectrum.3 The U.S. military has operated in permissive environments for so long it often does not know if its forces are being jammed by an adversary or from its own forces by mistake. Unfortunately, its strategic competitors have become more proficient in EW—particularly electronic attack.

The Promise of Cognitive EW

To retain electromagnetic superiority in communication-denied environments, EW systems must be able to identify, classify, and isolate adversary emitter signals in a densely congested electromagnetic environment and then quickly identify the proper technique to counter the threat.
Cognitive systems take the contextual circumstances of their surroundings, factor in the uncertainty, and make recommendations or decisions autonomously. Using AI/ML to determine whether to jam a target is a critical component of cognitive EW. There are multiple steps leading to this decision: identifying the target, determining the jamming effect, and selecting the jamming techniques that will be most effective.4
Cognitive EW can aid in identifying jamming incidents, whether from hostile or benign electromagnetic interference. It also can aid in quicker signal recognition and classification. Cognitive EW can more quickly define angle of arrival to determine the direction of the threat in the case of directed energy, and ML can recommend EW countermeasures.
The AI/ML technique proposed for EW integration is commonly referred to as Q-learning algorithms, a type of algorithmic reinforcement learning in AI/ML. Nearly every AI/ML system uses many models and massive amounts of data to train. Incredibly complex tasks or models often require vast amounts of data to complete the task or model. Reinforcement learning overcomes this drawback by almost entirely removing the data requirement.5 Thankfully, most EW capabilities do not require complex models to integrate with AI/ML. Therefore, a model-free reinforcement learning algorithm will work with most cognitive capabilities necessary for EW.

Low-probability-of-intercept, low-probability-of-detection radar identification and designation is a minimally complex EW problem that Q-learning algorithms can solve. These algorithms can be integrated into EW systems to identify and classify signals to distinguish the proper techniques and courses of action to counter the threat. Even in a technologically advanced world, the foundation of EW is still the ability to properly identify a detected signal and associate that signal with an emitter on a specific platform and weapon system.
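The Q-learning approach the article references can be sketched as a tiny tabular example: states are detected emitter types, actions are jamming techniques, and the reward signals whether a technique countered the threat. The emitter names, technique names, and reward table here are invented for illustration, not real EW data:

```python
import random

random.seed(0)

# Hypothetical ground truth: which technique actually counters each emitter.
BEST = {"pulse_radar": "noise_jam", "cw_radar": "deception_jam", "hopper": "follower_jam"}
TECHNIQUES = ["noise_jam", "deception_jam", "follower_jam"]

# Q-table: learned expected reward for each (emitter, technique) pair.
Q = {s: {a: 0.0 for a in TECHNIQUES} for s in BEST}
alpha, epsilon = 0.2, 0.2  # learning rate, exploration rate

for _ in range(3000):
    s = random.choice(list(BEST))
    # Epsilon-greedy: usually exploit the best-known technique, sometimes explore.
    if random.random() < epsilon:
        a = random.choice(TECHNIQUES)
    else:
        a = max(Q[s], key=Q[s].get)
    r = 1.0 if a == BEST[s] else 0.0
    # Single-step (bandit-style) Q update: each engagement is treated as
    # one-shot, so the discounted successor term drops out.
    Q[s][a] += alpha * (r - Q[s][a])

# The learned policy: the highest-value technique per emitter.
policy = {s: max(Q[s], key=Q[s].get) for s in Q}
```

This is "model-free" in exactly the sense the article notes: no model of the emitter is required, only trial-and-error reward feedback.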
AI/ML also can vastly improve the technique of “fingerprinting” emitters, generally described as specific emitter identification. This technique is accomplished through a system’s analysis of the emitter’s parameters and the unintentional modulation on the pulse from the emitter. This provides an accurate picture of the threat environment and enables commanders to decide what course of action to choose. However, these emitter parameters often change over time or are switched among platforms. This is a growing concern with specific emitter identification.
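Fingerprinting via measured pulse parameters can be sketched as a nearest-neighbor lookup against an emitter library; all parameter values and emitter names below are invented for illustration:

```python
import math

# Hypothetical library: (pulse width in µs, pulse repetition interval in µs, RF in GHz)
LIBRARY = {
    "emitter_A": (1.0, 1000.0, 9.4),
    "emitter_B": (0.2, 250.0, 5.6),
    "emitter_C": (5.0, 4000.0, 3.1),
}

def identify(measured, library):
    """Return the library emitter whose parameters are closest to the measurement."""
    def dist(ref):
        # Normalize each parameter by its reference value so no unit dominates.
        return math.sqrt(sum(((m - r) / r) ** 2 for m, r in zip(measured, ref)))
    return min(library, key=lambda name: dist(library[name]))

match = identify((0.21, 260.0, 5.5), LIBRARY)  # noisy measurement of emitter_B
```

A static library like this is exactly what the parameter-drift concern above undermines: when parameters change over time or are switched among platforms, the nearest neighbor stops being reliable, which is where ML-based specific emitter identification comes in.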

Analysis and Findings

Digitally reprogramming frequency modulation and waveform in radar technology is an evolving trend. This means radars can change waveform modulation, creating new signatures depending on what mode is most advantageous based on the tactical situation. Waveform modulation is used to improve range resolution, increase contact signal return, reduce probability of intercept or detection, and support high-resolution imaging. With an increasingly congested radio-frequency environment, this will make adversary emitters harder to classify and identify.6
Consequently, cognitive EW has been an enormous focus in today’s application of ML. DARPA has done initial studies for radio-frequency systems, laying the foundation for goal-driven ML that can learn from data as it operates. The agency has developed novel algorithms and techniques of ML application to the radio-frequency spectrum operating environment. It also is developing spectrum-awareness tools that will not only help expand the capacity of spectrum resource management, but also improve spectrum sharing.7
In the past few years, DARPA has also started some simplified studies on sample EW problems. Researchers built an AI/ML convolutional neural network to recognize the modulation of an emitted signal. The algorithm was used to discern AM, FM, or phase-shift key modulations. The results were positive, revealing ML systems outperformed traditional signal characterization methods in every signal-to-noise ratio. The study speculated that ML systems could look at the radio-frequency spectrum to aid in a better understanding of the congested EM environment. This will be an immense force multiplier for military applications compared with traditional radio-frequency spectrum management.8
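The DARPA study used a convolutional network; a far simpler illustration of the same discrimination task separates AM from FM by envelope variability, since AM varies the carrier's amplitude while FM varies its frequency. The signal parameters and the 0.1 threshold are invented for this sketch:

```python
import math

def make_signal(kind, n=1000, fc=50.0, fm=2.0):
    """Synthesize one second of an AM or FM tone (n samples)."""
    out = []
    for i in range(n):
        t = i / n
        if kind == "AM":
            out.append((1 + 0.5 * math.sin(2 * math.pi * fm * t))
                       * math.sin(2 * math.pi * fc * t))
        else:  # FM: constant amplitude, modulated phase
            out.append(math.sin(2 * math.pi * fc * t + 5 * math.sin(2 * math.pi * fm * t)))
    return out

def envelope_spread(sig, window=20):
    """Std-dev of the windowed peak amplitude: large for AM, near zero for FM."""
    peaks = [max(abs(x) for x in sig[i:i + window])
             for i in range(0, len(sig) - window, window)]
    mean = sum(peaks) / len(peaks)
    return math.sqrt(sum((p - mean) ** 2 for p in peaks) / len(peaks))

def classify(sig):
    return "AM" if envelope_spread(sig) > 0.1 else "FM"
```

A CNN earns its keep where such hand-picked features fail, e.g. at low signal-to-noise ratios, which is precisely the regime where the study found ML outperforming traditional characterization.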
Beyond DARPA, L3 Harris is currently designing cognitive EW subsystems to respond in real time to unrecorded and unknown signal waveforms. The modules L3 Harris has designed are compact, self-contained signal-detection units that include a digital receiver, digital radio-frequency memory, and digital signal processor capable of identifying signals from a database of known waveforms, classifying unknown waveform signal collection, and producing jamming recommendations based on the signal parameters.9
Cognitive EW systems are real-time learning and thinking systems. They cannot be distracted like human operators. A cognitive EW system is constantly asking where it should be looking and what recommendations it should give. It will adjust tactics and techniques based on what it is learning. Implementing cognitive EW systems requires low-powered microprocessors and software tools able to direct AI/ML systems in signal processing and recognition and guide the systems’ thinking processes according to the algorithms built for signal analysis and processing.10 With the expansion of EW and radio-frequency spectrum tools and systems, cognitive EW systems will be the tactical edge in retaining electromagnetic spectrum superiority.
The most urgent emerging challenge facing the U.S. military is the requirement to conduct faster and more efficient decision-making than its competitors.11 Many AI/ML initiatives can do this. However, while there are many current AI/ML initiatives in the information warfare domain, they focus solely on the algorithms and not on the combat systems and operator integration. Private-sector AI/ML development is mostly dedicated to advancing autonomous decision-making instead of focusing on improving the capabilities of existing combat systems. The Navy must identify areas in which combat systems and information warfare systems can integrate. It can start with AI/ML integration for electronic warfare.

1. Cliff Drubin, “Cognitive EW Development Contract,” Microwave Journal 59 (July 2016): 44.
2. Tanner McWhorter et al., “Machine Learning Aided Electronic Warfare System,” IEEE Access 9 (June 2021): 94691–699.
3. Karen Zita Haigh and Julia Andrusenko, Cognitive Electronic Warfare: An Artificial Intelligence Approach (Norwood, MA: Artech House, 2021).
4. Huiqin Li et al., “Cognitive Electronic Jamming Decision-Making Method Based on Improved Q-Learning Algorithm,” International Journal of Aerospace Engineering (December 2021): 1–12.
5. Li et al., “Cognitive Electronic Jamming Decision-Making.”
6. Charlotte Adams, “Cognitive EW: RF Spectrum Meets Machine Learning,” Avionics Magazine, 2 August 2018.
7. Adams, “Cognitive EW.”
8. Adams.
9. Jack Browne, “Cognitive EW Provides Computer-Powered Protection,” Microwaves & RF, 10 May 2017.
10. Browne, “Cognitive EW Provides Computer-Powered Protection.”
11. CAPT Sam J. Tangredi and George Galdorisi, USN, AI at War: How Big Data, Artificial Intelligence, and Machine Learning Are Changing Naval Warfare (Annapolis, MD: Naval Institute Press, 2021).
 
Reactions: 21 users

IloveLamp

Top 20
[Attached image: 1000014022.jpg]
 
Reactions: 21 users
Reactions: 23 users

cosors

👀
Mr Franke is quite interested.

Nils Franke
Head of Resource- & Budgetmanagement at Deutsche Telekom AG
View attachment 58740
...
View attachment 58741

Nils, if you're among us here, does Akida fit well into Telekom's budget? I would be happy with a hint in the private chat. 🤫😇😅
And don't just think of a German telephone provider. I don't know the figures, but I can imagine that Deutsche Telekom doesn't make that much money in Germany itself, but rather abroad. I think they are very strong in the USA, charles?
[Attached image: 1709938322451.png]
[Attached image: Screenshot_2024-03-08-23-52-46-24_40deb401b9ffe8e1df2f1cc5ba480b12.jpg]

 
Reactions: 22 users

JoMo68

Regular
Reactions: 23 users

Euks

Regular
Hey Bravo,

These are all great endorsements, and I believe they are from March 2023, when the second-generation Akida 2.0 was announced.

Our CEO said in his interview with Stocks Down Under a few days ago that when our clients have our IP in their hands, it can take several months, up to a year, for them to make a decision...

Now granted, I was led to believe that Akida 2.0 was in the hands of early adopters in March 2023, as per the announcement stating that Akida 2.0 was with early adopters, with general availability at the end of the third quarter of 2023.



I think most of us here were hoping, when 2.0 was finally announced in October 2023, that we might have had a couple of IP deals at the same time, but unfortunately that wasn't the case.

@supersonic001 has a point. At some stage you have to back up what you're saying with results. In a number of 4Cs released well over 12 months ago, the last paragraph said the same thing on numerous occasions. I can't remember exactly, but it was along the lines of "the company is experiencing its greatest number of customer engagements".

So once again, if it only takes up to a year for a customer to make a decision, then what happened to all those customers?

There should be no excuses in this current environment where semiconductor companies are thriving and the US government is pumping money into the industry.

Just my opinion
Euks
 
Reactions: 16 users

RobjHunt

Regular
Reactions: 4 users

Boab

I wish I could paint like Vincent
In today's Weekend Australian.
Go you good thing.
[Attached image: Weekend.jpg]
 
Reactions: 77 users

Rach2512

Regular
Reactions: 15 users

TECH

Regular

Nice post. I thought about Sean's comment you referenced: up to a year to commit or not, and even then only as a guide. It all comes back to the fact that we can't control how or when another company's board decides to pull the trigger; that really is the position we find ourselves in. The longer they wait to make the call, the more it leaves the door open for another innovative company to gain that first-mover advantage, in my opinion.

Tech.
 
Reactions: 21 users

IloveLamp

Top 20
🤔


[Attached image: 1000014025.jpg]
 
Reactions: 11 users