BRN Discussion Ongoing

Frangipani

Top 20
Quantum Ventura now has a wholly-owned subsidiary called MetaGuard AI (http://metaguard.ai/) to commercialise CyberNeuroRT. Srini Vasan, President and CEO of Quantum Ventura is also President & CEO of MetaGuard AI.

So it will technically be MetaGuard AI (and not Quantum Ventura as announced on the BrainChip CES 2026 website https://brainchip.com/ces-2026/) showcasing live demonstrations of CyberNeuroRT at BrainChip’s Venetian Tower Exhibit Suite in Las Vegas from January 6-9.


Jan 2, 2026 8:00 AM Eastern Standard Time

MetaGuard AI Debuts Defense-Grade Cybersecurity Platform at CES 2026 with Industry-First Full Source Code Access and BrainChip Neuromorphic Support


Department of Energy (DOE) and Missile Defense Agency (MDA)-Funded AI-Driven Cyberthreat Management Platform Offers Zeek and Corelight Compatibility with Neuromorphic Edge Computing; Live Demonstrations at BrainChip Venetian Exhibit Suite 29-116, January 6-9; Available via Software Licensing or Managed Services.

SAN JOSE, Calif.--(BUSINESS WIRE)--MetaGuard AI, Inc., a wholly-owned subsidiary of Quantum Ventura Inc., will showcase CyberNeuroRT at the Consumer Electronics Show (CES) 2026, January 6-9 in Las Vegas. Attendees can experience live demonstrations of the Zeek and Corelight-compatible AI-driven real-time security platform.



CyberNeuroRT is the world’s first cybersecurity platform with a full source code licensing option for both enterprise deployments and neuromorphic edge versions, a unique combination in the cybersecurity market.

Developed through U.S. DOE (SBIR Phase 2) and MDA (SBIR Phase 1) federal funding, CyberNeuroRT brings defense-grade AI technology to commercial enterprises through flexible deployment options and unprecedented transparency. During the SBIR program, researchers from Penn State University’s Neuromorphic Computing Lab provided advanced neuromorphic ML models while cybersecurity specialists from the world’s largest Defense contractor provided a technical assessment of a custom CyberNeuroRT version in their air-gapped HPC environment.

“CES 2026 is the perfect venue to demonstrate how our Enterprise-scale ML models are adapted for ultra-efficient threat detection using BrainChip neuromorphic processors and how it transforms cybersecurity,” said Srini Vasan, President & CEO at MetaGuard AI. “We invite attendees to visit us and see how CyberNeuroRT delivers in real-time.”

“Bringing MetaGuard AI’s threat detection solution, CyberNeuroRT, to the efficient Akida platform shows how neuromorphic computing is reshaping cybersecurity from the edge to the enterprise,” said Sean Hehir, CEO of BrainChip.

CyberNeuroRT serves enterprises requiring AI-powered threat detection with transparency, including existing Zeek and Corelight users, regulated industries needing algorithmic accountability, government contractors, industrial IoT operators, and mid-market companies through managed services or flexible cloud/on-prem licensing models.

CyberNeuroRT Platform Capabilities:
Zeek and Corelight Integration:
Adds multi-model ML inference to existing Zeek and Corelight deployments without infrastructure replacement.

Neuromorphic Optimization: Optimized for BrainChip Akida processors for ultra-low power edge deployment in industrial IoT and distributed environments.

Federal Validation: Competitive SBIR awards from the Department of Energy and Missile Defense Agency validate defense-grade capabilities.

Flexible Licensing: Appointed resellers in the Japanese, Indian, and U.S. regions offer Enterprise Standard, Enterprise Ultra, ML-X Standalone, and around-the-clock Managed-SOC-Services in the U.S., Japan, and Dubai time zones, with other regional market opportunities in development.

Full Source Code Access: The Enterprise Ultra license includes complete source code access, all ML models, training data, and the ML-X platform for building custom threat detection models, providing unprecedented algorithmic accountability for regulated industries.

Transparency Pioneer Program: The first 100 Enterprise Ultra customers receive up to 50% off licensing fees plus priority support. For details, contact inquiries@metaguard.ai. Conditions apply.

MetaGuard AI at CES 2026:
  • Where: Venetian Campus, BrainChip Exhibit Suite 29-116
  • When: January 6, 7, & 9, 2026
  • Demonstrations: 8 a.m. - 2 p.m. PT
  • Schedule meetings: oncehub.com/CNRTatCES or +1 (855) 246-5422
For product information, partnership inquiries, or managed service details, visit www.metaguard.ai or contact them at +1 (855) 246-5422 or inquiries@metaguard.ai.

About MetaGuard AI, Inc.: MetaGuard AI, Inc. is a cybersecurity company commercializing AI-driven threat detection developed with U.S. Department of Energy and Missile Defense Agency SBIR funding, Penn State research, and advisory support from the world’s largest Defense contractor. Its CyberNeuroRT platform offers Zeek and Corelight-compatible network threat detection via software licensing or managed services. For more information, visit www.metaguard.ai.

Contacts​

Media Contact: Aaron Goldberg, SVP, MetaGuard AI, Inc.; Aaron@metaguard.ai, Direct +1 (408) 568-2157



View attachment 93987

View attachment 93988 View attachment 93989 View attachment 93990
View attachment 93992 View attachment 93993 View attachment 93994
View attachment 93991



Quantum Ventura also has an office in Dubai now, where they are known as Quantum Guru (“Pioneering the future of AI, cybersecurity, and advanced technologies from Dubai to the world.”)


View attachment 93995


View attachment 93996




By the way, Lockheed Martin is not named as a partner for the government-funded CyberNeuroRT project, only Penn State is.

Instead, cybersecurity specialists from Lockheed Martin merely had an advisory role, cf. today’s Business Wire article above:

“Developed through U.S. DOE (SBIR Phase 2) and MDA (SBIR Phase 1) federal funding, CyberNeuroRT brings defense-grade AI technology to commercial enterprises through flexible deployment options and unprecedented transparency. During the SBIR program, researchers from Penn State University’s Neuromorphic Computing Lab provided advanced neuromorphic ML models while cybersecurity specialists from the world’s largest Defense contractor provided a technical assessment of a custom CyberNeuroRT version in their air-gapped HPC environment.”

and

“MetaGuard AI, Inc. is a cybersecurity company commercializing AI-driven threat detection developed with U.S. Department of Energy and Missile Defense Agency SBIR funding, Penn State research, and advisory support from the world’s largest Defense contractor.”


View attachment 93997

RTX engineers checking out the live demo of MetaGuard AI’s* CyberNeuroRT in BrainChip’s private suite at the Venetian Tower:

*the subsidiary Quantum Ventura set up to commercialise CyberNeuroRT, see https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-480130

View attachment 94105

On the left is Senior Principal Software Engineer Sylvia Traxler, who already visited the BrainChip team during CES 2025.

View attachment 94109

As I mentioned in my November post below, she is also one of the Raytheon mentors for the intercollegiate Raytheon Autonomous Vehicle Contest (AVC), whose 2025/26 round BrainChip is sponsoring for the first time.





The gentleman in the middle wearing a white shirt is Samuel Young - he is a Systems Sustainment Engineer at Raytheon/RTX.


View attachment 94108


3A6E237E-DC67-4CD7-A10F-CC10D267DDC0.jpeg



Abhronil Sengupta, Associate Professor of EECS at Penn State, and his PhD student Malyaban Bal collaborated with Quantum Ventura on CyberNeuroRT and were co-authors of the 2022 paper “Cyber-Neuro RT: Real-time Neuromorphic Cybersecurity”:


ABC67F24-CE1D-4B13-B3B8-3BED3207A3ED.jpeg






Clicking on “Discover our Solution” on the MetaGuard AI website (https://metaguard.ai/) will take you to another website titled “CyberNeuro RT Documentation”:



CyberNeuroRT​

Real-Time AI-Driven Network Detection & Response​

CyberNeuroRT is an advanced Network Detection and Response (NDR) platform designed to provide real-time visibility, AI-powered threat detection, and rapid response across enterprise and multi-tenant environments.

By combining machine learning with neuromorphic spiking neural networks (SNNs), CyberNeuroRT enables security teams to detect known, unknown, and evolving cyber threats with low latency, high accuracy, and scalable performance.

CyberNeuroRT is built for organizations that require continuous network protection without increasing operational complexity.


Why CyberNeuroRT​

Modern cyberattacks are fast, adaptive, and increasingly difficult to detect using signature-based tools alone. Attackers blend into normal traffic, exploit timing gaps, and evade traditional defenses.

CyberNeuroRT addresses these challenges by focusing on behavior, timing, and intent, allowing threats to be detected as they emerge, not after damage occurs.

Key Benefits​

  • Real-time detection of active threats
  • Behavioral analysis beyond static signatures
  • Reduced alert fatigue through high-confidence detections
  • Scalable, multi-tenant architecture
  • Support for automated and analyst-driven response

What CyberNeuroRT Does​

CyberNeuroRT continuously analyzes network traffic and applies multiple AI techniques in parallel to identify malicious behavior and operational risk.

At a high level, the platform:
  1. Observes network activity in real time
  2. Applies AI-driven behavioral analysis
  3. Correlates events across time and protocols
  4. Delivers actionable intelligence to analysts
  5. Supports rapid containment and response
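The five-step flow above can be sketched as a toy streaming loop. This is purely illustrative — every class, function, and threshold below is invented for the sketch and is not MetaGuard AI's actual API or detection logic:

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    timestamp: float
    src: str
    dst: str
    proto: str
    score: float = 0.0

def behavioral_score(event: Event) -> float:
    # Stand-in for the platform's ML/SNN inference stage: here we just
    # treat an unusual protocol as suspicious.
    return 0.9 if event.proto not in {"tcp", "udp"} else 0.1

def correlate(window: deque, event: Event) -> bool:
    # Step 3: correlate across time — repeated high scores from one source.
    recent = [e for e in window if e.src == event.src and e.score > 0.5]
    return len(recent) >= 2

def pipeline(events):
    window = deque(maxlen=100)            # sliding window of recent events
    alerts = []
    for ev in events:                     # 1. observe in real time
        ev.score = behavioral_score(ev)   # 2. behavioral analysis
        if ev.score > 0.5 and correlate(window, ev):  # 3. correlate
            alerts.append(ev)             # 4. actionable intelligence
            # 5. containment/response would be triggered here
        window.append(ev)
    return alerts
```

In this toy, a single odd packet produces no alert; only a repeated pattern from the same source does — a crude stand-in for "behavior over time" rather than one-shot signature matching.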


AI-Powered Threat Detection​

CyberNeuroRT uses a combination of supervised and unsupervised machine learning models together with neuromorphic SNN inference to identify threats that bypass traditional security tools.

Instead of relying solely on signatures, CyberNeuroRT analyzes:
  • Traffic patterns
  • Behavioral deviations
  • Temporal correlations
  • Communication intent

This approach enables reliable detection even when attackers modify tools, infrastructure, or techniques.
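The supervised/unsupervised combination can be illustrated with a stdlib-only toy: an unsupervised baseline (z-score over unlabeled traffic) paired with a supervised threshold learned from labeled examples, flagging when either model fires. All function names and numbers are invented for illustration and say nothing about CyberNeuroRT's real models:

```python
import statistics

def fit_unsupervised(baseline):
    """Learn 'normal' from unlabeled baseline traffic (mean and stdev)."""
    return statistics.mean(baseline), statistics.stdev(baseline)

def fit_supervised(samples, labels):
    """Learn a decision threshold from labeled examples: midpoint
    between the mean benign and mean malicious values."""
    benign = [x for x, y in zip(samples, labels) if y == 0]
    malicious = [x for x, y in zip(samples, labels) if y == 1]
    return (statistics.mean(benign) + statistics.mean(malicious)) / 2

def detect(value, mu, sigma, threshold, z_cut=3.0):
    """Flag if EITHER model fires: statistical deviation (unsupervised)
    or the learned threshold (supervised)."""
    z = abs(value - mu) / sigma
    return z > z_cut or value > threshold
```

The point of the combination: the supervised side catches what it was trained on, while the unsupervised side can still fire on something never seen before — which is why the doc claims detection survives attackers changing tools or infrastructure.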


Threats CyberNeuroRT Is Designed to Detect​

CyberNeuroRT’s AI models are trained to recognize a wide range of real-world attack behaviors commonly observed in enterprise networks, including:

  • Ransomware activity
    Indicators such as anomalous encryption behavior, lateral movement, and command-and-control communication.
  • Credential and password-based attacks
    Brute-force attempts, credential misuse, and abnormal authentication patterns.
  • Network scanning and reconnaissance
    Probing behavior that often precedes exploitation and lateral movement.
  • Web application attacks
    Including Cross-Site Scripting (XSS) and injection-based techniques targeting application-layer vulnerabilities.
  • Backdoor and persistent access mechanisms
    Covert communication channels and unauthorized persistence behavior.
  • Denial-of-Service (DoS) and Distributed DoS (DDoS)
    Volumetric and behavioral traffic anomalies intended to disrupt services.
CyberNeuroRT detects these threats using behavioral analysis and AI inference, rather than static rule matching.

Detection capabilities continue to evolve through controlled model retraining and intelligence updates.


Neuromorphic AI for Low-Latency Detection​

CyberNeuroRT integrates neuromorphic spiking neural networks, inspired by biological neural systems, to enable:

  • Ultra-low latency inference
  • Energy-efficient processing
  • Resilient performance under high traffic volumes
This allows CyberNeuroRT to deliver real-time decisions while reducing compute and infrastructure overhead, making it suitable for on-prem, cloud, hybrid, and edge deployments.
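For readers new to SNNs, a minimal leaky integrate-and-fire (LIF) neuron shows the basic principle: the membrane potential leaks each step, integrates input, and only emits a cheap binary spike when a threshold is crossed. This textbook toy is not how Akida works internally — it just illustrates why spiking computation can be so power-efficient (no spikes when nothing interesting is happening):

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Toy leaky integrate-and-fire neuron.

    Each step: decay the membrane potential by `leak`, add the input,
    and emit a spike (1) with a reset once `threshold` is crossed.
    """
    v = 0.0
    spikes = []
    for x in inputs:
        v = v * leak + x        # leak, then integrate
        if v >= threshold:
            spikes.append(1)
            v = 0.0             # reset after spiking
        else:
            spikes.append(0)
    return spikes
```

A silent input stream produces no spikes at all — the event-driven analogue of drawing near-zero power when traffic is quiet.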


Real-Time Visibility & Analyst Experience​

CyberNeuroRT provides a real-time analyst dashboard that enables teams to:

  • Monitor evolving threats continuously
  • Investigate suspicious activity with context
  • Access historical network data for forensic analysis
  • Re-run analysis on stored traffic when required
The interface is designed to reduce investigation time and improve situational awareness.


Automated & Analyst-Controlled Response​

CyberNeuroRT supports both automated remediation and human-in-the-loop response, allowing organizations to balance speed and operational control.

Response actions may include:
  • Traffic isolation
  • Blocking malicious communication paths
  • Analyst-approved containment workflows

This approach enables rapid threat containment while minimizing operational risk.


Built for Scale & Multi-Tenant Environments​

CyberNeuroRT is architected for enterprise and service-provider deployments, with strict isolation across tenants.

Key design principles include:
  • Tenant-level data separation
  • Scalable streaming and analytics pipelines
  • Predictable performance under high load

This makes CyberNeuroRT suitable for large enterprises, MSSPs, and distributed organizations.


Security & Zero Trust Alignment​

CyberNeuroRT is designed to integrate with Zero Trust architectures and modern security frameworks.

Core security principles include:
  • Strong authentication and authorization
  • Encrypted communication
  • Role-based access control
  • Least-privilege enforcement

This reduces attack surface and supports secure remote and hybrid access models.


Who CyberNeuroRT Is For​

CyberNeuroRT is ideal for organizations that require high-confidence, real-time network defense, including:

  • Enterprise Security Operations Centers (SOC)
  • Managed Security Service Providers (MSSPs)
  • Critical infrastructure operators
  • Regulated industries
  • Organizations adopting Zero Trust architectures

What’s Next​

If you are new to CyberNeuroRT, the recommended next steps are:

  1. Review the Installation guide to deploy sensors and services
  2. Follow the Usage section to explore detection and dashboards
  3. Refer to API Documentation for integrations and automation
  4. Use Troubleshooting for common operational questions

CyberNeuroRT is designed to grow with your environment — delivering continuous visibility, intelligent detection, and confident response as threats evolve.
 

  • Like
  • Love
  • Fire
Reactions: 39 users

Quiltman

Regular
Why do you show this guy from Infineon? I cannot get the connection to the other part of your post.

Should have been clearer.
I was simply scrolling through who had "liked" or commented on the LinkedIn post from HaiLa on CES2026.

1768157510606.png
 
  • Like
  • Fire
  • Thinking
Reactions: 10 users

Euks

Regular
  • Like
  • Fire
  • Love
Reactions: 28 users
Getting real as you say....countdown on too...just saying :)


Definitive Contract​

PIID​

FA875025CB013

In Progress​

(28 days remain)

Unlinked Award

Awarding Agency​

Department of Defense (DOD)

Recipient​

BRAINCHIP, INC

23041 AVENIDA DE LA CARLOTA STE 250

LAGUNA HILLS, CA 92653-1545
UNITED STATES
Congressional District: CA-47
I'm not completely sure I'm reading this right but I think it has been modified to end date of 4/8/26.

https://www.usaspending.gov/award/CONT_AWD_FA875025CB013_9700_-NONE-_-NONE-

SC
 
  • Thinking
  • Like
Reactions: 4 users

Doz

Regular
Once upon a time reading dates was easy , not so much now ….

Note : 2/8/26 = 8th February 2026 ….


1768172979291.png
 
  • Like
  • Haha
  • Fire
Reactions: 19 users

White Horse

Regular
Been doing some exploring.
This pic from CES. "AFEELA"


1768174436139.png



Sony and Honda Form A Strategic Alliance In Mobility

 
  • Like
  • Fire
  • Wow
Reactions: 21 users

Mccabe84

Regular
A couple of line wipes going on atm. 8 million shares gone just like that
 
  • Like
  • Fire
  • Wow
Reactions: 14 users

Sotherbys01

Regular
I have a feeling we could go "pre NR" very soon......
 
  • Like
  • Wow
Reactions: 4 users
Nice GREEN start to the week
 
  • Like
Reactions: 8 users

7für7

Top 20
Been doing some exploring.
This pic from CES. "AFEELA"


View attachment 94237


Sony and Honda Form A Strategic Alliance In Mobility


I was watching this topic long time ago….
But in that case, I think it’s just a sponsor band.



IMG_9579.jpeg
 
  • Like
Reactions: 4 users

HopalongPetrovski

I'm Spartacus!
Just traders taking their pips atm I fear.
Need some significant news to get us moving properly.
 
  • Like
Reactions: 9 users

7für7

Top 20
Just traders taking their pips atm I fear.
Need some significant news to get us moving properly.
Later we’ll end up getting another speeding ticket, BrainChip will say they don’t know nothing, and then it’ll drop back to 18,xx cents again… the usual gamblers’ game. They exploit every situation… unfortunately.
 
Last edited:
  • Like
Reactions: 1 users

manny100

Top 20
I don't understand Akida 3.
The only way I can is because 1000, 1500 and 2 are flying.
But we're seeing nothing in regards to revenue yet. At CES, how was 3 received?
In saying all this, AI is just beginning, so let's see
Hi Jacob, there is a fair time gap between deals with volume commitments and revenue.
Serious investors will not spend 1 second worrying about the past or revenue.
Serious investors will look to accumulate when they see real signs that contracts with volume are on the horizon.
2026 should see some real accumulation in anticipation of deals with volume commitments.
Markets are future orientated and contracts = future revenue.
Parsons have volume supply written into their contract. Bascom Hunter and RTX may emerge with volume orders at some stage.
Onsor say they expect to launch in 2026 and Cybersecurity is now commercial.
We know that there are engagements underway. We also know that AKIDA Cloud speeds up time from 'having a look' to Prototype and that the Hubs have pretrained models to 'spoon feed' Developers.
By the time recurring revenue is coming in the horse called 'Share Price' will have bolted and probably done half a dozen laps of the course.
That is why posters are saying 2027 should see action - deals with volume commitments. With luck we may even see something emerge later this year.
 
  • Like
Reactions: 11 users
Hi Jacob, there is a fair time gap between deals with volume commitments and revenue.
Serious investors will not spend 1 second worrying about the past or revenue.
Serious investors will look to accumulate when they see real signs that contracts with volume are on the horizon.
2026 should see some real accumulation in anticipation of deals with volume commitments.
Markets are future orientated and contracts = future revenue.
Parsons have volume supply written into their contract. Bascom Hunter and RTX may emerge with volume orders at some stage.
Onsor say they expect to launch in 2026 and Cybersecurity is now commercial.
We know that there are engagements underway. We also know that AKIDA Cloud speeds up time from 'having a look' to Prototype and that the Hubs have pretrained models to 'spoon feed' Developers.
By the time recurring revenue is coming in the horse called 'Share Price' will have bolted and probably done half a dozen laps of the course.
That is why posters are saying 2027 should see action - deals with volume commitments. With luck we may even see something emerge later this year.
Thanks for the detailed response, just a question how many shares would you consider being a decent shareholder, Do you know much re the share NVU
 

Diogenese

Top 20
Once upon a time reading dates was easy , not so much now ….

Note : 2/8/26 = 8th February 2026 ….


View attachment 94236
The most efficient way from a computer's point of view is YYYYMMDD. That saves a few FLOPs per date.
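Diogenese's point in code: YYYYMMDD (ISO-style) date strings sort chronologically with a plain string sort, no date parsing needed — which DD/MM/YYYY strings do not:

```python
# ISO-style dates: lexicographic order == chronological order.
dates = ["20260208", "20251231", "20260106"]
dates_sorted = sorted(dates)  # no date parsing required

# DD/MM/YYYY strings do NOT sort chronologically as strings:
dmy = ["08/02/2026", "31/12/2025", "06/01/2026"]
dmy_sorted = sorted(dmy)  # 2025 ends up last, which is wrong
```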
 
  • Like
Reactions: 2 users
Thanks for the detailed response, just a question how many shares would you consider being a decent shareholder, Do you know much re the share NVU
Hi JK, this is just my thoughts and not sure if it's the correct way to go about figuring what the average shareholder has, but I'm thinking there's roughly 2 billion shares issued and there's about 45,000 shareholders. So I'm thinking 2 billion/45,000 which is just under 45,000 shares for the average shareholder. So anything above that is above average.

If anyone can correct me on this, please do so.
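The arithmetic checks out — using the poster's own rough figures (2 billion shares, 45,000 holders), both of which are approximations, not numbers from the register:

```python
shares_on_issue = 2_000_000_000  # rough figure from the post
holders = 45_000                 # rough figure from the post

mean_holding = shares_on_issue / holders  # simple mean (arithmetic average)
```

That works out to about 44,444 shares per holder — "just under 45,000", as stated.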
 
  • Like
Reactions: 3 users

jrp173

Regular
Hi Jacob, there is a fair time gap between deals with volume commitments and revenue.
Serious investors will not spend 1 second worrying about the past or revenue.
Serious investors will look to accumulate when they see real signs that contracts with volume are on the horizon.
2026 should see some real accumulation in anticipation of deals with volume commitments.
Markets are future orientated and contracts = future revenue.
Parsons have volume supply written into their contract. Bascom Hunter and RTX may emerge with volume orders at some stage.
Onsor say they expect to launch in 2026 and Cybersecurity is now commercial.
We know that there are engagements underway. We also know that AKIDA Cloud speeds up time from 'having a look' to Prototype and that the Hubs have pretrained models to 'spoon feed' Developers.
By the time recurring revenue is coming in the horse called 'Share Price' will have bolted and probably done half a dozen laps of the course.
That is why posters are saying 2027 should see action - deals with volume commitments. With luck we may even see something emerge later this year.

manny100 has been spending too much time with FactFinder on the other site. All this nonsense about "serious" investors or "genuine" investors. What a load of nonsense..... Does manny100 think he/she is some sort of bloody oracle?
 
Last edited:
  • Fire
  • Like
Reactions: 2 users

Tothemoon24

Top 20
IMG_2027.jpeg


The Secret to Next-Gen XR: It’s Not About Megapixels, It’s About Architecture
We have reached a plateau in XR (Extended Reality) hardware. Adding more pixels or faster GPUs isn't solving the three biggest hurdles: Heat, Weight, and Motion Sickness. The solution isn't "more power"—it’s a fundamental shift in how we process data. We are moving from Video → Meaning.
🔍 The "Meaning-at-Source" Revolution
In traditional headsets, a camera takes a high-res video of your eye and sends it to the main processor. This is "dumb data." It wastes battery and creates heat.
The 2026 breakthrough is Layer 1 (L1) Computing. Instead of a video, the sensor itself (or a tiny dedicated ASIC) extracts the Gaze Vector.
Traditional: Sending 60 frames of video per second.
2026 Shift: Sending a simple coordinate (X, Y).
This is the "EyeChip" philosophy: Extraction happens at the silicon level.
🗺️ The 5-Layer Tech Stack (2026 Supplier Map)
To understand where the industry is heading, we must look at the layers of the hardware stack:
Layer 1: Meaning-at-Source (Eye ASICs)
Ganzin (Taiwan): Their new AURORA IIE is a game-changer. It’s an ASIC that consumes <20mW. It processes eye-tracking locally, so the main chip can stay cool.
BrainChip (USA): Using neuromorphic AI to track eyes without ever "seeing" an image—perfect for medical privacy.


IMG_2028.jpeg


IMG_2029.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 19 users
Hi JK, this is just my thoughts and not sure if it's the correct way to go about figuring what the average shareholder has, but I'm thinking there's roughly 2 billion shares issued and there's about 45,000 shareholders. So I'm thinking 2 billion/45,000 which is just under 45,000 shares for the average shareholder. So anything above that is above average.

If anyone can correct me on this, please do so.
Hi SFB, I think that in statistics, the word 'average' is a collective name for mean, median and mode. The idea of an average is to use one number to 'typify' all of the numbers in the group/population/sample.

You have correctly calculated the mean. The sum of all the shares held divided by the number of shareholders. However, in this case, I don't think it is the best number to describe the 'average' holding.

As of 28 April 2025, the top 50 shareholders held about 48% of the stock. If we then look at the other 44,950 shareholders who own the other 52% (or about 1.04 billion shares), the average (mean) shareholding is about 23,000 shares.

I think that in this case the median shareholding is probably the best indicator as it is not skewed as much by the atypical holdings. This is why you always hear about median house prices and not average (meaning mean) house prices. A few big ones skew the results away from 'typical'.
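The skew effect is easy to demonstrate with a purely hypothetical register — the numbers below are made up solely to show why one large holder drags the mean away from the "typical" holding while the median stays put:

```python
import statistics

# Hypothetical register: 9 small retail holders plus 1 large institution.
holdings = [5_000] * 9 + [10_000_000]

mean_holding = statistics.mean(holdings)      # dragged up by the big holder
median_holding = statistics.median(holdings)  # still the "typical" holder
```

Here the mean is over a million shares while the median remains 5,000 — exactly the house-price effect described above.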

I think that the best way to answer your own question is to look at the following table from the last annual report, put your hand on your tummy and rub in a clockwise fashion and see what number jumps into your head.

1768191512612.png


Cheers,
H.
 
  • Like
  • Fire
Reactions: 13 users
Hi SFB, I think that in statistics, the word 'average' is a collective name for mean, median and mode. The idea of an average is to use one number to 'typify' all of the numbers in the group/population/sample.

You have correctly calculated the mean. The sum of all the shares held divided by the number of shareholders. However, in this case, I don't think it is the best number to describe the 'average' holding.

As of 28 April 2025, the top 50 shareholders held about 48% of the stock. If we then look at the other 44950 shareholder who own the other 52% (or about 1.04 billion) the average (mean) shareholding is about 23000 shares.

I think that in this case the median shareholding is probably the best indicator as it is not skewed as much by the atypical holdings. This is why you always hear about median house prices and not average (meaning mean) house prices. A few big ones skew the results away from 'typical'.

I think that the best way to answer your own question is to look at the following table from the last annual report, put your hand on your tummy and rub in a clockwise fashion and see what number jumps into your head.

View attachment 94247

Cheers,
H.
Cheers H2, definitely makes sense what you're saying, especially when looking at the table and comparing the big discrepancies between the % values and number of shareholders for each shareholder range.
 
  • Like
Reactions: 2 users