Still a guessing game when it comes to Nanose, but at least we know it will be another two years until a device is released, with or without our technology.
Mar 18, 2026 9:00 AM Eastern Daylight Time
ForwardEdge ASIC Selects BrainChip’s Neuromorphic Computing for Future ASICs
First Step in a Strategic Collaboration for Cognitive Sensing Solutions in RF and Signal Processing
LAGUNA HILLS, Calif.--(BUSINESS WIRE)--BrainChip Holdings Ltd. (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), a leading developer of ultra-low-power, fully digital, event-based neuromorphic AI, today announced a strategic collaboration with ForwardEdge ASIC, a wholly owned subsidiary of Lockheed Martin (NYSE: LMT) specializing in advanced ASIC architecture and microelectronics development.
Together, the companies are combining best-in-class AI technology with deep ASIC and system integration expertise to deliver differentiated edge-processing solutions for demanding aerospace, defense, and advanced technology markets.
A Strategic Collaboration for Intelligent Edge Systems
This collaboration brings together BrainChip’s leadership in neuromorphic AI architecture and ForwardEdge ASIC’s strengths in custom silicon development, heterogeneous integration, and advanced RF systems. The collaboration is focused on tightly coupling AI acceleration with signal processing and RF compute to enable high-performance, low-latency intelligence at the edge.
By embedding BrainChip’s neuromorphic AI engines directly into ForwardEdge ASIC’s architectures, the companies are enabling cognitive processing closer to the sensor—reducing data movement, lowering power consumption, and enabling autonomous operation in complex operational environments.
“ForwardEdge ASIC is focused on architecting and delivering highly integrated silicon solutions that leverage the most advanced technologies available,” said Bill Jenkins, CRO at ForwardEdge ASIC. “BrainChip’s Akida architecture is a strong complement to our ASIC and RF platforms, allowing us to integrate dedicated AI acceleration directly into the silicon. This collaboration enables us to deliver scalable, high-performance low-latency edge solutions that push intelligence closer to the point of sensing.”
Enabling a New Class of Cognitive Sensing Solutions
The joint solution leverages a heterogeneous architecture that combines custom ASIC processing, RF signal chains, and neuromorphic AI acceleration to deliver efficient, real-time detection and classification. Key capabilities include:
- Cognitive RF and Signal Processing: Real-time classification of complex signals with adaptive, AI-driven processing.
- Scalable ASIC Platforms: Architectures designed for reuse across multiple programs and deployment environments.
- Efficient Edge Intelligence: Ultra-low-power AI processing optimized for latency-sensitive and resource-constrained systems.
This collaboration allows both companies to accelerate innovation while reducing integration risk for customers seeking production-ready edge AI solutions.
“ForwardEdge ASIC brings deep system-level design and integration expertise that is essential for deploying neuromorphic AI in real-world applications,” said Steven Brightfield, CMO at BrainChip. “This collaboration demonstrates how BrainChip’s AI technology can be tightly integrated into advanced ASIC and RF platforms to deliver powerful, efficient intelligence at the edge.”
About BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY) BrainChip is the worldwide leader in Edge AI on-chip processing and learning. The company’s first-to-market, fully digital, event-based AI processor, Akida™, uses neuromorphic principles to mimic the human brain, analyzing only essential sensor inputs at the point of acquisition and processing data with unmatched efficiency, precision, and energy economy. BrainChip’s Temporal Event-based Neural Networks (TENNs) build on State-Space Models (SSMs) with time-sensitive, event-driven frameworks that are ideal for real-time streaming applications. These innovations make low-power Edge AI deployable across industries such as aerospace, autonomous vehicles, robotics, industrial IoT, consumer devices, and wearables. Explore more at www.brainchip.com. Follow BrainChip on Twitter or LinkedIn.
About Forward Edge Forward Edge is focused on developing advanced microelectronics and edge-processing solutions for leading-edge commercial applications and next-generation defense capabilities. Explore more at www.forwardedgeasic.com.
Contacts
BrainChip Media Contact:
Madeline Coe
prforbrainchip@bospar.com
224-433-9056
BrainChip Investor Contact:
ir@brainchip.com
What does it really mean, Kevin testing the Akida 1000?

Hi WH,
There is great potential for some sort of sublicence arrangement here.
$11B for what is, compared to Akida, a one-trick-pony software application. It seems that it is useful in sorting RAG models:
https://www.confluent.io/compare/apache-kafka-vs-confluent/
https://www.geeksforgeeks.org/apache-kafka/apache-kafka/
Apache Kafka is a publish-subscribe messaging system. A messaging system lets you send messages between processes, applications, and servers. Broadly speaking, Apache Kafka is software in which topics (a topic might be a category) can be defined and further processed. Applications may connect to this system and transfer messages onto a topic. A message can carry any kind of information, from an event on your blog to a simple text message that triggers another event.
https://ai-academy.training/2025/07/13/live-retrieval-augmented-generation-rag-with-kafka/
Confluent and Kafka are birds of a feather:
https://www.confluent.io/resources/online-talk/cloud-demo/
Live Retrieval-Augmented Generation (RAG) with Kafka
The buzz around Generative AI (GenAI) is undeniable. From crafting compelling marketing copy to generating intricate code, Large Language Models (LLMs) are rapidly transforming workflows. However, beneath the surface of impressive text generation lies a critical challenge: reliability and factual accuracy. Standalone LLMs, trained on vast but static datasets, can hallucinate or provide outdated information, limiting their utility in enterprise environments where truth and timeliness are paramount.
Enter Retrieval-Augmented Generation (RAG).
This powerful paradigm enhances LLMs by equipping them with the ability to access and incorporate external, up-to-date information into their responses. Think of it as giving the LLM a real-time open-book test, ensuring its answers are grounded in verifiable data. But how do we make this “open book” truly dynamic, reflecting the ever-changing landscape of enterprise knowledge? The answer lies in the real-time streaming capabilities of Apache Kafka.
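That "open-book test" loop can be sketched in a few lines: retrieve the most relevant documents, then ground the model's prompt in them. This is a minimal stand-in of my own; the word-overlap scoring and the prompt shape are placeholders for a real vector search and a real LLM call, not any vendor's API:

```python
def _tokens(text):
    """Lowercased word set, with trailing punctuation stripped."""
    return {word.strip("?.,!") for word in text.lower().split()}

def retrieve(query, documents, k=2):
    """Rank documents by naive word overlap with the query
    (a stand-in for embedding-based vector search)."""
    score = lambda doc: len(_tokens(query) & _tokens(doc))
    return sorted(documents, key=score, reverse=True)[:k]

def build_prompt(query, documents):
    """Ground the LLM prompt in the retrieved, up-to-date context."""
    context = "\n".join(retrieve(query, documents))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "BrainChip's Akida is an event-based neuromorphic processor.",
    "Kafka streams records between producers and consumers in real time.",
    "The cafeteria menu changes on Fridays.",
]
prompt = build_prompt("What is Akida", docs)
```

In a streaming setup, the `docs` store would be continuously updated from a Kafka topic, which is exactly what makes the "open book" dynamic rather than a static training snapshot.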
Kafka/Confluent could be used to advantage with Akida. BRN's Provenance is also suited to de-hallucinating medium-sized models.
So, if Confluent is worth $11B, ... ?
Given Kevin's exploits, as well as the synergies between Confluent/Kafka and Akida for RAG, this could be a live question.
Does Sean have Arvind Krishna's phone number?