Hi All,
I’ve been looking into two related, but officially separate, Lockheed Martin initiatives to assess whether Akida could be involved in both, and I believe the case is plausible.
1) April 14, 2025 –
Lockheed Martin Skunk Works & Arquimea demonstrated EO/IR anomaly detection for ISR platforms using episodic memory neural networks. This system augments EO/IR sensors with AI to detect unusual patterns in real time, reducing the number of scans required.
2) July 16, 2025 –
Lockheed Martin announced an AI-powered SAR ATR system for autonomous maritime target recognition, developed by the Lockheed AI Center, Skunk Works, and RMS. While Arquimea is not mentioned in this second program, the release notes that the SAR ATR capability “was deployed on low Size, Weight, and Power (SWaP) hardware in the field, rapid edge processing, without the need for large cloud compute or ground stations.” That description strongly aligns with Akida’s design philosophy.
From a technical perspective, both projects could benefit from neuromorphic computing in similar ways, despite using different sensing modalities (EO/IR vs. SAR). If Lockheed wanted to minimise SWaP, reduce latency, and enable adaptive, real-time processing, presumably Akida could be inserted at the sensor front-end in both systems. The main difference would be the feature encoding (optical/IR imagery vs. radar return patterns), but the underlying event-driven, on-edge processing approach would be almost identical.
While there’s no public confirmation of Akida’s role, the combination of technical fit and partnership history makes the possibility hard to dismiss IMO, especially considering the following known connections:
- BrainChip + Arquimea – BrainChip and Arquimea jointly developed a real-world search-and-rescue drone system using Akida with Prophesee’s event-based camera to detect distressed swimmers.
- Lockheed + BrainChip – Through the CyberNeuro-RT (CNRT) project with Quantum Ventura, Lockheed has already evaluated Akida’s edge-AI capabilities in cybersecurity applications.
- Lockheed (Skunk Works) + Arquimea – Previous collaborations include anomalous target detection (e.g., drone monitoring), where the technical approach closely mirrors the known BrainChip + Arquimea solution.
When you line up the SWaP requirements, the real-time edge inference needs, and these overlapping partnerships, the likelihood of Akida being part of at least one of these efforts seems worth considering, in my opinion.
14 April 2025 - Lockheed Martin Skunk Works + Arquimea (AI-powered ISR platform)
Lockheed Martin Skunk Works® (NYSE: LMT), the global technology company's advanced aerospace development division, in collaboration with Arquimea demonstrated an anomaly detection capability for...
news.lockheedmartin.com
16 July 2025 - Lockheed Martin + LAIC + Skunk Works (AI-powered SAR)
Lockheed Martin’s Skunk Works division and Spanish tech company Arquimea have demonstrated a new artificial intelligence-based anomaly detection capability for intelligence, surveillance, and reconnaissance (ISR) platforms.
ukdefencejournal.org.uk
Of course, I had to run this hypothesis past ChatGPT and here's the response I received.
My Question
Do you think that both of these projects could benefit from BrainChip's neuromorphic technology in similar ways, and if so, how?
ChatGPT's Response
Yes — both projects could realistically benefit from neuromorphic computing in very similar ways, even though they deal with different sensing modalities (EO/IR vs. SAR).
Here’s how they align:
1. Common Processing Challenge
Both systems require real-time, on-device (edge) inference on potentially huge data streams — EO/IR cameras in the first case, and high-resolution Synthetic Aperture Radar in the second.
- EO/IR anomaly detection → high frame rates, continuous monitoring, looking for small deviations in patterns.
- SAR ATR → extremely data-heavy radar sweeps, with target classification needing to happen instantly to re-task the sensor.
Traditional CPUs/GPUs are power-hungry, latency-prone, and depend on cloud or ground station links in many designs. Neuromorphic hardware like BrainChip’s Akida is designed to process event-based or sparse data locally, reducing bandwidth and power draw.
2. How Neuromorphic Computing Could Be Used
In both scenarios, neuromorphic processors could:
- **Event-based pre-filtering**
  - Only process relevant changes in the scene (e.g., sudden thermal hotspot in EO/IR; anomalous radar return in SAR).
  - Cuts down the amount of data that needs full-frame or full-sweep analysis.
- **Low-power, always-on monitoring**
  - Neuromorphic systems can idle at micro-watt levels until triggered by an event, ideal for long-endurance ISR or maritime patrol.
- **On-sensor anomaly detection**
  - Embedding the classifier directly into the sensor payload means faster reaction times (e.g., automatically re-tasking SAR beam, zooming EO/IR camera, or alerting operators without waiting for remote processing).
- **Adaptive learning at the edge**
  - Akida-like architectures can perform on-device incremental learning, allowing the system to adapt to new patterns (e.g., changing sea clutter profiles or new vessel signatures) without a complete retrain in the cloud.
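To make the event-based pre-filtering idea concrete, here is a minimal NumPy sketch of simple frame differencing: only pixels whose change exceeds a threshold become "events" passed downstream. This is my own toy illustration of the principle, not BrainChip's MetaTF API or any actual Lockheed pipeline.

```python
import numpy as np

def event_prefilter(prev_frame, frame, threshold=0.2):
    # Emit sparse "events" only where the pixel change exceeds the
    # threshold, so downstream stages never see full frames that
    # contain nothing new.
    diff = np.abs(frame.astype(float) - prev_frame.astype(float))
    ys, xs = np.nonzero(diff > threshold)
    return list(zip(ys.tolist(), xs.tolist(), diff[ys, xs].tolist()))

# Toy EO/IR frames: a single "thermal hotspot" appears in one pixel
prev = np.zeros((4, 4))
curr = prev.copy()
curr[2, 3] = 1.0  # sudden hotspot

events = event_prefilter(prev, curr)
# A single (row, col, magnitude) event survives instead of 16 raw pixels
```

The bandwidth saving is the point: a static scene produces zero events, so an always-on monitor does essentially no work until something changes.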
3. Why Both Could Use It “in similar ways”
Even though one is optical/thermal (EO/IR) and the other is radar-based:
- Core problem = detecting anomalies or targets in a noisy environment fast and without sending raw data to a remote processor.
- Neuromorphic hardware would play the same functional role:
- Act as a low-power edge AI accelerator doing first-pass filtering/classification.
- Free up heavier processing hardware for more detailed secondary analysis only when needed.
Bottom line:
Yes — if Lockheed wanted to minimise SWaP, reduce latency, and add adaptive capabilities in both projects, neuromorphic processors could be inserted at the sensor front-end in a similar architectural position. The main difference would be the feature encoding (EO/IR images vs. SAR radar returns), but the processing philosophy would be nearly identical.
If you want, I can map out exactly where Akida could sit in both system architectures side-by-side so the similarities become obvious. That would make your “dot connecting” argument much stronger.
You said:
Yes please.
ChatGPT said:
Alright — here’s a side-by-side architecture map showing how something like BrainChip’s Akida neuromorphic processor could slot into both systems and serve almost the same functional role.
1) Skunk Works & Arquimea – EO/IR Anomaly Detection
(April 14, 2025)
Mission: Detect unusual activity or patterns in electro-optical and infrared imagery for ISR (Intelligence, Surveillance, Reconnaissance) platforms using episodic memory neural networks.
Possible system architecture with Akida:
```
EO/IR Sensor →
Pre-processing (sensor-level calibration, noise reduction) →
**Akida Processor**
 - Event-based feature extraction (only pass on pixel changes or anomalies)
 - Pattern recognition via embedded episodic memory
 - Trigger alerts or direct gimbal/zoom commands
        ↓
Main Mission Computer (GPU/CPU)
 - Secondary deep analysis
 - Multi-sensor fusion
 - Data link transmission (if needed)
```
Akida's role here:
- Filters frames in real time, detecting only significant deviations from baseline patterns.
- Operates at ultra-low power to allow always-on sensing without draining ISR platform resources.
- Handles incremental learning at the edge for adapting to changing environments (e.g., seasonal lighting, thermal variations).
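The "filter, then hand off" role described above can be sketched in a few lines of Python. This is a hypothetical stand-in for the front-end gate, not the actual Skunk Works/Arquimea design: a cheap per-frame anomaly score against a slowly adapting baseline decides which frames reach the main mission computer.

```python
import numpy as np

def gate_to_mission_computer(frames, baseline, score_threshold=3.0):
    # Cheap per-frame anomaly score: mean deviation from a slowly
    # adapting baseline. Only anomalous frame indices are forwarded
    # for "secondary deep analysis" on the main mission computer.
    forwarded = []
    for i, frame in enumerate(frames):
        score = np.abs(frame - baseline).mean() / (baseline.std() + 1e-6)
        if score > score_threshold:
            forwarded.append(i)
        baseline = 0.95 * baseline + 0.05 * frame  # crude edge adaptation
    return forwarded

# Two quiet frames, then one frame-wide thermal excursion
frames = [np.zeros((8, 8)), np.zeros((8, 8)), np.full((8, 8), 2.0)]
hits = gate_to_mission_computer(frames, baseline=np.zeros((8, 8)))
```

The slow baseline update is a crude analogue of the "incremental learning at the edge" point: gradual changes (seasonal lighting, thermal drift) are absorbed into the baseline instead of raising alerts.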
2) Lockheed – AI-Powered SAR ATR for Maritime Target Recognition
(July 16, 2025)
Mission: Use Synthetic Aperture Radar (SAR) to automatically recognize and classify maritime targets in near real time, with autonomous sensor control.
Possible system architecture with Akida:
```
SAR Sensor Sweep →
Signal Conditioning (FFT, Doppler processing, image formation) →
**Akida Processor**
 - Event-like radar return change detection
 - Low-power ATR pre-classification
 - Trigger re-tasking commands to radar in milliseconds
        ↓
Main Mission Computer (GPU/CPU)
 - Advanced ATR algorithms
 - Fusion with other sensors (EO/IR, AIS data)
 - Operator UI or autonomous action
```
Akida's role here:
- Rapidly flags unusual radar signatures or deviations from known vessel profiles.
- Reduces data throughput by only sending “interesting” frames to main processors.
- Enables real-time sensor control without human latency (e.g., beam steering or waveform change immediately after anomaly detection).
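On the SAR side, "flags unusual radar signatures or deviations from known vessel profiles" amounts to a first-pass novelty test. A minimal sketch, under the assumption that a return can be summarised as a 1-D magnitude profile (again my own illustration, not Lockheed's ATR algorithm):

```python
import numpy as np

def flag_unusual_returns(sweeps, known_profiles, tol=0.5):
    # A radar return (here a 1-D magnitude profile) is "interesting"
    # if it sits far, in Euclidean distance, from every known vessel
    # profile; only those indices escalate to the full ATR stage.
    flagged = []
    for i, sweep in enumerate(sweeps):
        if min(np.linalg.norm(sweep - p) for p in known_profiles) > tol:
            flagged.append(i)
    return flagged

known = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
sweeps = [np.array([1.0, 0.1, 0.0]),  # close to a known profile
          np.array([4.0, 4.0, 4.0])]  # novel signature
odd = flag_unusual_returns(sweeps, known)
```

Only the novel signature is passed on, which is exactly the "send only the interesting frames" behaviour the release attributes to the low-SWaP edge hardware.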
Shared neuromorphic advantages in both projects
| Capability | EO/IR Anomaly Detection | SAR ATR Maritime Recognition |
|---|---|---|
| Event-based processing | Yes – pixel change | Yes – radar return change |
| Low-power, always-on | Yes | Yes |
| On-device classification | Yes – anomalies | Yes – targets |
| Real-time sensor control | Yes – gimbal/zoom | Yes – radar retasking |
| Adaptive edge learning | Yes – scene adaptation | Yes – clutter adaptation |
Why your hypothesis is technically sound:
Both systems need:
- Real-time inference directly on the ISR platform.
- Minimal SWaP (size, weight, power) for airborne deployment.
- Ability to adapt locally without full retraining.
- Pre-filtering to cut down on the amount of raw sensor data sent to main processors or ground stations.
These requirements align very closely with Akida’s edge neuromorphic processing model, which is why — if Lockheed and Arquimea were considering a common hardware accelerator strategy — they could realistically deploy it in both projects.