BRN Discussion Ongoing

Diogenese

Top 20
Oh really

@zeeb0t, you know you hate free advertising
Hang on a sec, Felix - I haven't got the contact details yet.
 
  • Haha
  • Like
Reactions: 10 users

Frangipani

Top 20

Cloud based AI vs Edge AI Topology

Connected, Low-power, Intelligent Devices at the Edge​


HaiLa


July 10, 2025
Article by Patricia Bower, HaiLa VP Product

What’s the ‘Edge’ in Edge AI?​

The amount of data generated by billions of wireless connected devices—such as sensors, wearables, and smart appliances—is growing at an incredible rate. Analysts forecast global growth from 20 billion devices today to over 40 billion by 2030 (IoT Analytics, State of IoT 2024).

Using today’s communications networks, user data is sent from (for example) the smartwatch on your wrist to datacenters located miles from you or even across a continent. The data is processed in these facilities by clusters of artificial intelligence (AI) or machine learning (ML) systems and is generally referred to as cloud compute. There are three issues with this model:

  1. The time it takes for the data to traverse communications networks to the datacenter (latency)
  2. The heavy load on datacenter AI clusters, and resulting power consumption, for processing raw data from billions of devices
  3. The higher potential for personal data privacy breaches

The growing need for faster responses (real-time decision making), reduced loading on datacenters (cloud offload), and improved privacy (anonymized data) has therefore driven the rise of edge AI—where intelligent processing happens directly on the device itself, such as in a sensor or wearable. At the core of this shift are neural processing units (NPUs)—specialized AI processors that enable devices to "think locally” and to rely on optimized AI and ML training models.

Chip technology for datacenter AI processing is extremely energy intensive. In these AI clusters, where complex computational tasks - such as climate modelling - are performed, compute operations are measured in exaFLOPS, or a million trillion floating-point operations per second. Edge AI NPUs can support up to a few teraOPS (trillions of operations per second) and are purpose-built to be extremely power efficient. This is important as many connected devices run on batteries.
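
To put those two scales side by side, here is a quick back-of-the-envelope comparison in Python (the one-exaFLOPS cluster and the three-teraOPS NPU figures are assumed round numbers, purely for illustration):

```python
# Rough scale comparison of the compute classes mentioned above.
EXAFLOPS = 1e18  # 1 exaFLOPS = a million trillion floating-point operations per second
TERAOPS = 1e12   # 1 teraOPS  = a trillion operations per second

datacenter_cluster = 1 * EXAFLOPS  # assumed order of magnitude for a large AI cluster
edge_npu = 3 * TERAOPS             # "a few teraOPS" for an edge AI NPU (assumed value)

print(f"Datacenter cluster: {datacenter_cluster:.0e} ops/s")
print(f"Edge NPU:           {edge_npu:.0e} ops/s")
print(f"Scale difference:   {datacenter_cluster / edge_npu:,.0f}x")  # ~333,333x
```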

What Applications are Driving Edge AI?​

Edge AI is already transforming many industries. Here are some notable applications:

Patient Monitoring
Hospitals and home-care settings are increasingly using AI-powered sensors to continuously monitor vital signs such as heart rate, respiration, and oxygen levels. By analyzing these signals locally, devices can detect early warning signs of health deterioration and alert caregivers—without the need to stream personal health data to the cloud. This approach protects patient privacy and saves bandwidth.

Personal Health Monitoring
Wearable devices like fitness trackers and smartwatches are incorporating edge AI to offer more intelligent insights into sleep patterns, stress levels, and activity trends. These devices process sensor data in real time, providing instant feedback and recommendations on the go while preserving battery life.

Retail
In retail environments, smart cameras and sensors are used for foot traffic analysis, shelf inventory monitoring, and customer behavior tracking. Edge AI allows these insights to be processed locally without storing or transmitting images, which supports both privacy and efficiency. For example, a store can detect when a shelf is empty and trigger restocking alerts automatically. Using a battery powered, modular solution with integrated, low-power Wi-Fi provides a simple installation into legacy environments.

Industrial Asset Monitoring and Anomaly Detection
Factories and industrial sites rely on sensors to monitor machinery. Edge AI enables these sensors to learn the normal operating patterns of equipment and detect anomalies like unusual vibrations or temperature spikes—early signs of failure. This allows for predictive maintenance, reducing downtime and maintenance costs.
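
As a deliberately simplified illustration of the kind of logic such a sensor could run locally, here is a minimal sketch of a rolling-baseline anomaly check on vibration readings; it is a generic example, not BrainChip's or any vendor's actual method:

```python
from collections import deque
from statistics import mean, stdev

class VibrationAnomalyDetector:
    """Minimal illustrative anomaly detector: learns a rolling baseline of
    vibration readings and flags samples far outside the normal range."""

    def __init__(self, window: int = 256, threshold_sigma: float = 4.0):
        self.history = deque(maxlen=window)   # recent "normal" readings
        self.threshold_sigma = threshold_sigma

    def update(self, reading: float) -> bool:
        """Return True if the new reading looks anomalous."""
        if len(self.history) >= 32:           # need some baseline first
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) > self.threshold_sigma * sigma:
                return True                   # e.g. an unusual vibration spike
        self.history.append(reading)          # extend the learned baseline
        return False

# Usage: only anomaly flags (not raw sensor data) would need to leave the device.
detector = VibrationAnomalyDetector()
for sample in [0.9, 1.1, 1.0, 0.95, 1.05] * 20 + [9.0]:
    if detector.update(sample):
        print("Anomaly detected - schedule a maintenance check")
```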

The ability to collect intelligent data directly from edge devices reduces the need for cloud infrastructure and data transmission, which means that edge AI also lowers the overall cost of system deployment and operation.

Leveraging Low-power RF and Neuromorphic AI for Efficient Edge AI​

A major technical challenge in deploying edge AI is power consumption. Devices like health monitors or industrial sensors are often required to run for months or years on small batteries—or in some cases, without batteries at all. As mentioned, NPUs optimized for low power are an essential element to perform AI tasks like image recognition or anomaly detection using minimal energy. To address this need, BrainChip’s Akida™ neuromorphic technology relies on the principle of sparsity for power efficient, event-driven AI compute.

The inferenced data from NPU AI processing is only part of the equation. Devices must also have the ability to connect and collate this intelligent data from multiple devices in a local or personal area network. This is where low-power radio frequency (RF) technologies like Wi-Fi and Bluetooth Low Energy come in and where HaiLa is setting a new paradigm in just how efficient data transmission can be over these protocols. HaiLa’s extreme low power radio communications technology, paired with power-optimized edge compute, allows devices to send and receive inferenced data without draining energy reserves, making them perfect partners for edge AI systems.

HaiLa and BrainChip: Sensors Converge 2025 Connected Edge AI Demo​

At Sensors Converge in Santa Clara this year, HaiLa and BrainChip joined forces to demonstrate object classification via extreme low-power Wi-Fi.

HaiLa and BrainChip collaborated to showcase very low power Connected Edge AI Object Classification

This hardware prototype includes a camera module to capture object images, BrainChip’s NPU test platform, which is pre-trained to recognize objects from a camera image capture, and HaiLa’s BSC2000 extreme low-power Wi-Fi radio chip, which transmits the image class and type data via Wi-Fi to display as a simple icon on a dashboard.
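
A rough sketch of that data flow is below; every function name is a hypothetical placeholder standing in for the camera, the NPU platform and the BSC2000 radio, not a real HaiLa or BrainChip API:

```python
import json
import time

# Hypothetical sketch of the demo's data flow. None of these helpers are real
# HaiLa or BrainChip APIs; they are placeholders for the three hardware stages.

def capture_frame() -> bytes:
    """Stand-in for the camera module grabbing one image."""
    return b"<raw image bytes>"

def classify_on_npu(frame: bytes) -> tuple[str, float]:
    """Stand-in for on-device inference on the pre-trained NPU platform.
    Returns only the compact result, never the raw image."""
    return "coffee_mug", 0.93

def send_over_wifi(payload: bytes) -> None:
    """Stand-in for the low-power Wi-Fi radio transmitting a few bytes."""
    print(f"TX {len(payload)} bytes: {payload.decode()}")

for _ in range(3):  # a few demo iterations
    frame = capture_frame()
    label, confidence = classify_on_npu(frame)
    # Only the class/type metadata crosses the network; the dashboard renders it as an icon.
    send_over_wifi(json.dumps({"class": label, "conf": confidence}).encode())
    time.sleep(1.0)
```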

The demo illustrates the potential use of connected, edge AI for applications using image recognition and classification in industrial, retail, and medical sectors where cost effective, low latency, and private data connections are key requirements.

The Future of Edge AI: Connected, Low-power, Intelligent Devices Everywhere​

HaiLa’s core specialization in extreme low-power radio technology over standard protocols like Wi-Fi, Bluetooth, and even cellular, delivers one of the critical enablers of pervasive edge AI: extreme low-power data transmission. Together with BrainChip’s innovative edge compute, this opens up a broad range of options for end-users to support energy-efficient, on-device AI.

Edge AI represents a powerful shift in how intelligent systems operate—bringing the power of AI directly to the devices at the heart of the connected world. By combining efficient NPUs with low-power wireless communication, edge AI systems can run independently, securely, and with minimal energy use. As more industries adopt this technology, we can expect smarter, more responsive, and more sustainable solutions for multiple applications.
Contact us to learn more:

info@haila.io
sales@brainchip.com
 
  • Like
  • Fire
  • Love
Reactions: 49 users

Frangipani

Top 20
Joao Martins, Editor-in-Chief of audioExpress, reports about his visit to Sensors Converge 2025 in The Audio Voice, “the Weekly Newsletter for the Audio Industry and Audio Product Developers”. Here are some excerpts and photos:


(screenshot excerpts attached)

From his audio-centric perspective, Joao Martins says all in all he was rather disappointed in this year’s Sensors Converge:

“I might be missing something, but what's the meaning of attending a sensors show to promote "IoT”? It's an obsolete terminology for something that was touted 10 years ago. No one wants devices "connected to the Internet" in the age of "edge AI" if you really want to push a buzzword. This is the age of self-sustained intelligence of things. Where sensors make sense but are mere extensions.

How those common threads could possibly converge around real-world applications was for me the missing link at this show. I could find many demonstrations that had to do with audio, and I've seen multiple references to hearing, earbuds, and even voice, but nothing that wasn't already demonstrated at other shows, sometimes going back as much as four years.

Together with the absence of many key companies in this space, I couldn't help feeling disappointed that there were no groundbreaking presentations on audio applications. And yet, some of the companies who hold the technology to make those happen were there, but they simply just didn't mention the applications.”



One of two personal highlights for him was a presentation by Mouna Elkhatib, CEO/CTO of AONDevices (who worked for BrainChip from December 2015 to May 2016; she and Peter van der Made are co-inventors of several patents, cf. https://patents.justia.com/inventor/mouna-elkhatib).

AONDevices certainly sound like a serious contender to BrainChip in the hearables market:

(slide screenshots attached)
 
Last edited:
  • Like
  • Love
  • Thinking
Reactions: 20 users

Frangipani

Top 20
Our partner fortiss and their partner Interactive Wear (a 20-year-old company that emerged out of a management buy-out of Infineon Technologies’ Wearable Technology Solutions) will be “presenting neuromorphic computing and its applications in the defense sector” at the upcoming SVI Future Congress (SVI = Systemverbund Verteidigung im Innovationsraum) in Munich, aimed at a target audience [LOL 🎯] of security and defense industry experts.

I couldn’t find any additional information online about the collaboration between fortiss and Interactive Wear.




The event website of the fully-booked conference links to a list of countries published by the German Federal Ministry of the Interior regarding security clearance concerns.
Citizens of countries listed are barred from attending the 8/9 July conference.

Somewhat surprisingly, Ukraine is also on that list, which was published on June 8, 2022, just over three months after Russia had invaded Ukraine (again), and hasn’t been updated since. Possibly due to concerns about pro-Putin ethnic Russians with Ukrainian citizenship?


Today, Axel von Arnim shared more information and photos about what fortiss and Interactive Wear were exhibiting at TechHub SVI 2025 earlier this week 👆🏻:

“Showcasing our neuromorphicly gesture-driven virtual mule robot for defence applications at the #TechHub-SVI defence conference in Munich. Together with our partners [sic] #InteractiveWear, we deliver wearable smart sensing solutions for defence and the medtech industry.”

While it doesn’t look like Akida was already part of that collaboration (not surprising, given that the partnership between fortiss and BrainChip still appears to be quite young) and I can only spot Loihi, the showcase nevertheless demonstrates what fortiss mean when they say about themselves:

“As the Research Institute of the Free State of Bavaria for software-intensive Systems, fortiss stands for application-oriented cutting-edge research and sets standards in the research and transfer of highly innovative technologies. In close cooperation with academic and practice-oriented partners, we act as a bridge between science and industry and develop excellent solutions for the most pressing challenges of the digital world.”


On second thought: Have a look at the last two photos. Could that possibly be an Akida M.2 form factor on the very left of the screen, which shows the Sensor Fusion Development Plattform MHX?
Doesn’t look identical, but similar? 🤔




(photos attached)
 
Last edited:
  • Like
  • Love
  • Thinking
Reactions: 17 users

Frangipani

Top 20
No doubt Frontgrade Gaisler will be spruiking GR801 - their first neuromorphic AI solution for space, powered by Akida - during their upcoming Asia Tour through India, Singapore and Japan:


(screenshot attached)
 
  • Like
  • Fire
Reactions: 20 users

Frangipani

Top 20
  • Like
  • Love
Reactions: 16 users

Frangipani

Top 20
Last Friday, TH Nürnberg (Nuremberg Institute of Technology) had an Open Day in connection with a barbie. Wow, they surely must have grilled lots of Nürnberger Rostbratwürstchen, the city’s signature finger-long sausages flavoured with marjoram that Johann Wolfgang von Goethe is said to have loved so much that he had some sent to Weimar by (horse-powered) express mail…

(webpage in German only: https://www.th-nuernberg.de/studium-karriere/studienorientierung-und-studienwahl/studienwahl-bbq/)

Christian Axenie, Head of TH Nürnberg’s SPICES Lab, whose team came runner-up in the 2023 tinyML Pedestrian Hackathon utilising Akida (cf. https://iopscience.iop.org/article/10.1088/2634-4386/adcbcb/pdf), gave a presentation on Neuromorphic Computing for prospective computer science students. As we know, the SPICES (Sensorimotor Processing Intelligence and Control in Edge compute Systems) Lab boasts an impressive collection of neuromorphic hardware (https://cristianaxenie.info/spiceslab/) that is waiting to be explored:



(photos attached)
 
  • Like
  • Fire
Reactions: 12 users

Frangipani

Top 20

HaiLa CEO Derek Kuhn is doing an excellent job promoting our company: “the amazing technology from the team at BrainChip”.

Maybe he should consider taking up a side hustle as a BrainChip social media intern? 😉



 
  • Like
  • Love
  • Fire
Reactions: 34 users

jla

Regular
HaiLa CEO Derek Kuhn is doing an excellent job promoting our company: “the amazing technology from the team at BrainChip”.

Maybe he should consider taking up a side hustle as a BrainChip social media intern? 😉



God, you put some effort into exploring BRN for us. God bless you, Frangipani.
 
  • Like
  • Love
  • Fire
Reactions: 27 users
Interesting read here about Intel's downfall, the CEO providing a reality check, and the direction they're now heading, in which BrainChip could play a part:


"At one time, Intel was so powerful that it considered acquiring Nvidia for $20 billion. The GPU maker is now worth $4 trillion."

"Intel instead plans to shift its focus toward edge AI, aiming to bring AI processing directly to devices like PCs rather than relying on cloud-based compute. Tan also highlighted agentic AI—an emerging field where AI systems can act autonomously without constant human input—as a key growth area. He expressed optimism that recent high-level hires could help steer Intel back into relevance in AI, hinting that more talent acquisitions are on the way. “Stay tuned. A few more people are coming on board,” said Tan. At this point, Nvidia is simply too far ahead to catch up to, so it's almost exciting to see Intel change gears and look to close the gap in a different way."
 
  • Like
  • Fire
  • Love
Reactions: 37 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
This article, published a couple of days ago, outlines how grid edge computing is transforming modern power systems. It paints a picture of massively decentralized, real-time, intelligent power networks—a perfect environment for technologies like BrainChip’s neuromorphic AI to thrive.

The piece explicitly references neuromorphic computing as one of the emerging technologies shaping the future of the smart grid, alongside explainable AI, generative models, and collaborative AI systems.

The system-level challenges and requirements described align almost exactly with what BrainChip’s Akida and TENNs platforms were built to solve: ultra-low latency, energy efficiency, anomaly and fault detection, always-on AI, edge inference, on-device learning and security.

The smart grid + grid edge AI market is exploding, particularly due to the rise of:
  • Distributed Energy Resources
  • EV charging infrastructure
  • Smart meters and substations
  • Energy trading systems
  • Real-time fault detection / predictive maintenance
Itron (the author’s company) is already building AI into edge smart grid gateways.

The market size estimates for edge AI in energy & utilities are upwards of $3B today and expected to exceed $10B+ by 2030.

If BrainChip captured even 1% of the edge AI deployments in grid systems - including relays, sensors, inverters, load balancers, and smart meters - it could mean tens to hundreds of millions in annual IP licensing or chip sales.
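
The back-of-the-envelope arithmetic behind that range, using only the market figures quoted above and a purely hypothetical 1% share:

```python
# Back-of-the-envelope: a (hypothetical) 1% share of the market sizes quoted above.
market_today = 3e9   # ~$3B edge AI in energy & utilities today (figure quoted above)
market_2030 = 10e9   # $10B+ expected by 2030 (figure quoted above)
share = 0.01         # purely hypothetical 1% capture

print(f"1% of today's market: ${market_today * share / 1e6:.0f}M per year")
print(f"1% of the 2030 market: ${market_2030 * share / 1e6:.0f}M+ per year")
# -> roughly $30M to $100M+, i.e. "tens to hundreds of millions" annually
```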

The other thing worth noting is that Itron has partnered with NVIDIA. As you can see from the last screenshot, they aim to utilize NVIDIA's Jetson Orin Nano. Given the article suggests that neuromorphic computing is an emerging technology that allows for more efficient computing, I wonder if they're considering combining Jetson Orin + neuromorphic.

For example, central nodes might run NVIDIA (Jetson/Orin) for heavy inference & cloud analytics. And fault sensors, relays, and smart meters could utilize Akida for monitoring waveform anomalies 24/7 without draining power.
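
A toy sketch of that kind of two-tier split, with made-up thresholds and device roles (nothing here reflects an actual Itron, NVIDIA or BrainChip design):

```python
# Toy two-tier split, purely illustrative of the idea above: an always-on,
# low-power device screens waveforms locally and only escalates suspicious
# windows to a heavier central node. Thresholds and roles are invented.

def edge_screen(window: list[float], peak_limit: float = 1.5) -> bool:
    """Tier 1 (e.g. a meter- or relay-class device): cheap always-on check."""
    return max(abs(x) for x in window) > peak_limit

def central_analyze(window: list[float]) -> str:
    """Tier 2 (e.g. a substation gateway with a GPU): heavier diagnostics."""
    energy = sum(x * x for x in window)
    return "fault suspected" if energy > 50 else "benign transient"

samples = [0.1, -0.2, 2.4, -3.1, 0.3]   # one sampled waveform window
if edge_screen(samples):                 # most windows never leave the device
    print(central_analyze(samples))      # only flagged windows travel upstream
```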



How grid edge computing is revolutionising real-time power management​

Smart Energy International Jul 08, 2025

Stefan Zschiegner
From smart meters to predictive analytics, the grid of tomorrow will be built on real-time decision-making at the edge, writes Stefan Zschiegner of Itron.
As the saying goes, “the only constant in life is change,” and that is certainly true when we consider the technological advances being made in utility power management.
The traditional model, where data flows to central control centres and back, can no longer meet the demands of today’s complex, renewable-heavy power networks. A new framework, where computing power is being utilised at the grid edge, is driving transformation of electricity management.
A decentralised framework relies on intelligent edge devices capable of detecting anomalies in real time for prevention, or of taking action in near real time. For example, if lightning strikes a distribution pole, intelligent field devices can autonomously detect the fault, isolate the damaged section, reroute power and adjust voltage levels—all within seconds, often before the central system even registers the event. In this new framework, edge intelligence is essential for maintaining grid stability and integrating distributed energy resources (DERs).
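
A toy sketch of that detect-isolate-reroute-report sequence; the threshold, section label and helper functions are invented for illustration:

```python
# Toy version of the autonomous sequence described above: a field device detects
# an overcurrent fault, isolates the section and reroutes power locally, then
# reports upstream. Threshold, section name and helpers are invented.

FAULT_CURRENT_LIMIT_A = 2000.0  # example protection pickup threshold, in amps

def open_recloser(section: str) -> None:
    print(f"[{section}] recloser opened (section isolated)")

def close_tie_switch(section: str) -> None:
    print(f"[{section}] tie switch closed (load rerouted)")

def report_to_scada(section: str, amps: float) -> None:
    print(f"[{section}] fault event at {amps:.0f} A reported to the control centre")

def handle_sample(section: str, current_amps: float) -> None:
    if current_amps > FAULT_CURRENT_LIMIT_A:    # 1. detect the fault locally
        open_recloser(section)                  # 2. isolate the damaged section
        close_tie_switch(section)               # 3. reroute via an alternate feed
        report_to_scada(section, current_amps)  # 4. inform SCADA after the fact

handle_sample("feeder-7", 3500.0)  # a lightning-strike style overcurrent sample
```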

New architecture for new challenges​

To enable this advanced intelligence, modern grid edge devices are evolving to include a rich array of features, such as advanced microprocessor relays, smart reclosers with embedded computing for autonomous fault isolation, intelligent power quality monitors with real-time waveform analysis and edge compute gateways with artificial intelligence (AI) capabilities and local storage. These devices connect through Field Area Networks (wireless mesh) for local communication and Wide Area Networks for backhaul to control centres.
As grid edge intelligence expands, central SCADA systems remain crucial. Modern architectures employ edge-first processing for time-critical decisions, hierarchical processing with multi-tier decision-making and protocol translation gateways for seamless communication.
Data flows in multiple patterns: horizontal flows facilitate peer-to-peer device communication, vertical flows maintain traditional telemetry and control, publish-subscribe models enable status updates and event-driven architectures coordinate responses across systems.
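
As a minimal illustration of the publish-subscribe pattern mentioned above, the sketch below wires two hypothetical subscribers to one topic; the topic name and payload are made up:

```python
from collections import defaultdict
from typing import Callable, Dict, List

# Tiny in-process illustration of publish-subscribe: devices publish events on a
# topic and any interested component reacts, with no point-to-point wiring.

subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

def subscribe(topic: str, handler: Callable[[dict], None]) -> None:
    subscribers[topic].append(handler)

def publish(topic: str, event: dict) -> None:
    for handler in subscribers[topic]:
        handler(event)

# A recloser controller and an operator dashboard both care about the same feeder.
subscribe("feeder-7/voltage", lambda e: print("recloser logic sees", e))
subscribe("feeder-7/voltage", lambda e: print("dashboard logs     ", e))
publish("feeder-7/voltage", {"v_pu": 0.91, "ts": "2025-07-08T10:00:00Z"})
```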

Advanced technical requirements​

Grid edge computing systems must meet strict requirements, including response times of single- or double-digit milliseconds for protection functions, sub-cycle responses for power quality correction, environmental hardening to operate in extreme conditions (-40°C to +85°C) and deterministic computing for guaranteed response times.
Modern grid intelligence typically employs a layered approach with an edge layer for immediate time-critical functions, a fog layer at substations for coordination across devices and a cloud layer for analytics, machine learning (ML) and enterprise integration.
As we’ve established, reaction times are key to maintaining grid integrity. The ultimate goal for modern edge systems is to operate within the microsecond range–responding faster than conventional systems and making critical decisions relating to:
  • Fault detection and isolation through high-speed algorithms and adaptive protection.
  • Power quality management with real-time harmonic mitigation and voltage compensation.
  • Load balancing via automated reconfiguration and microgrid management.
  • Voltage/VAR optimization through real-time control and reactive power management.
AI and ML can enhance these capabilities through pre-trained algorithms deployed on edge devices, federated learning and continuous refinement of decision-making. ML-enhanced systems using pre-trained AI platform chips can produce a performance 80 times greater than the same algorithm running on an Intel i5 processor without acceleration.
The impact of AI also transforms edge intelligence grid management from reactive to predictive. Deep learning and advanced analytics enable equipment health scoring based on operating conditions, time-to-failure predictions, optimized maintenance scheduling, AI-based anomaly detection and integration of environmental factors into predictive models.
In addition, the future of modern edge systems lies in enhancing grid stability through real-time load balancing made possible by multi-timeframe load forecasting, continuous power flow optimization, real-time phase monitoring and balancing, and customer load participation through automated control mechanisms.

The future of grid edge computing​

Emerging technologies are advancing grid edge intelligence through explainable AI (xAI) for transparent decision-making, neuromorphic computing for efficient AI processing, generative models for unexpected grid conditions and collaborative AI systems for decentralised coordination.
Edge-native applications are evolving with digital twins for predictive simulation, distributed ledger technology for secure transactions, autonomous grid agents for negotiation-based operation and immersive visualisation for field personnel. Integration with renewable energy systems will be crucial through direct device-to-device communication, peer-to-peer energy communities and regulatory frameworks that rely on edge intelligence.
As intelligence moves to the grid edge, security concerns have evolved due to expanded attack surfaces. With thousands of accessible devices, constrained computing resources, heterogeneous systems from multiple vendors and long-lived equipment creating legacy security concerns, mitigation strategies include defense-in-depth security, autonomous fallback modes, physical tamper protection, graceful degradation during attacks and AI-driven threat detection.

Implementation considerations​

Implementing the new framework is a significant undertaking and investment, and requires total cost of ownership analysis, value stacking for multiple benefit streams and risk-adjusted return calculation. Implementation approaches may include targeted deployment in high-value locations, phased rollout of capabilities and test bed validation.
The human element remains a critical success factor. Bridging the skills gap requires structured role-based training programs, simulation training and formal certification to ensure operational readiness and long-term workforce capability.
Regulatory compliance is equally essential. Navigating frameworks such as NERC CIP (North American Electric Reliability Corporation Critical Infrastructure Protection) requires robust cyber-security measures when entities are operating, controlling or interacting with the North American Bulk Electric System (BES) to protect against cyber threats and ensure grid reliability. In addition, organisations must meet reliability-reporting obligations, adhere to data privacy compliance and maintain detailed documentation to support regulatory audits and insight.
Finally, success is measured across three key metrics: technical performance (including response time and detection accuracy), operational benefits (such as improved reliability and reduced outages) and financial outcomes (like cost savings).

Conclusion​

As we move into an era of DERs, intelligence at the grid edge has become critical for maintaining a reliable power system. The transition from centralised to distributed intelligence represents a fundamental shift. The old principle of “centralise for optimisation, distribute for reliability” is giving way to “distribute intelligence to act where the problem occurs.”
The grid of tomorrow—sustainable, resilient and responsive—will be built on real-time decision-making at the edge. The future belongs to those who can set direction centrally but act locally, at the speed modern power systems demand.





(screenshots attached)
 

Last edited:
  • Like
  • Love
  • Fire
Reactions: 31 users

7für7

Top 20
Yeeeeehaaaaaaa

Roller Coaster GIF
 
  • Haha
Reactions: 2 users
Any thoughts, please?
Is there any chance of BRN being integrated, considering the timelines mentioned below by Intel?

18A, Intel's proposed savior, is still a year away
 
Last edited:
  • Like
Reactions: 2 users

7für7

Top 20
Bravo, what are your thoughts, please?
Is there any chance of BRN being integrated, considering the timelines mentioned below by Intel?

18A, Intel's proposed savior, is still a year away
I’m not Bravo but I like this question so…

Absolutely possible if you ask me…and actually, BRN is already inside the Intel Foundry ecosystem.

As we know…In September 2022, BrainChip was officially announced as part of the Intel Foundry Services Accelerator IP Alliance. That means Akida is available as a licensed IP block within Intel’s design ecosystem… including for upcoming nodes like 18A.

So yes … if an Intel Foundry customer (or Intel itself) wants to embed an ultra low power hot sh…t neuromorphic core, Akida is already in da house….

Will it actually be chosen for a major 18A product?

We don’t know yet… I think even Sean doesn’t know… But the foundation is already there… and that alone puts BRN way ahead of many other edge AI players.

Source
 
  • Like
  • Love
  • Fire
Reactions: 13 users

7für7

Top 20
They say you should invest in stocks… “Let your money work for you,” they said.

Well, if I take a look at my BrainChip shares, they’re acting more like moody teenagers who can’t be bothered to show up at their apprenticeship or move their lazy asses.

Go do your job … you useless pieces of paper and explode or I swear I’ll disown you from my portfolio!

All that’s missing now is an email from my Brainchip shares saying:
“Yo bro… this 9-to-5 investor path just isn’t my vibe. I wanna be a creator on social media.”

GET OUTAAA HEREEEE!!!!
 
  • Haha
  • Sad
Reactions: 4 users

TheDrooben

Pretty Pretty Pretty Pretty Good
  • Like
  • Fire
  • Love
Reactions: 45 users

TECH

Regular
  • Like
  • Love
  • Fire
Reactions: 43 users


Happy as Larry
Great find.

Was gonna give you a 🔥 like but downgraded it to a 👍 instead cause you didn't include the obligatory "Larry Gif" :ROFLMAO::LOL:
 
  • Haha
  • Like
  • Fire
Reactions: 16 users

7für7

Top 20


Happy as Larry
2022-2024 so they kicked him OUTA THEREEE?!?!

Just kidding 😂

But why didn’t he hashtag BrainChip? 🙄
 
  • Like
Reactions: 2 users