BRN Discussion Ongoing

TopCat

Regular

No mention of any specific tech, but it sounds something like the tech Aquimea has developed.
 

IloveLamp

Top 20

[Two image attachments]
 

Bravo

If ARM was an arm, BRN would be its biceps💪!

Neuromorphic Chips and 'Brain‑Like' Computing: How Next‑Gen Chips May Change Phones, Robots, and IoT Devices​

By Renz Soliman
Published: Dec 31 2025, 06:01 AM EST


[Image: computer chip]


The next major leap in artificial intelligence may not come from larger datasets or more powerful cloud systems, but from microchips modeled after the human brain itself. Neuromorphic computing represents a new frontier where hardware is designed to mimic neurons and synapses, potentially redefining how machines learn, sense, and respond to their environments.
As edge AI and IoT technologies expand, these brain-inspired chips could bring unprecedented intelligence to devices once considered too small or low-powered for advanced computation.

What Is Neuromorphic Computing?​

Neuromorphic computing refers to the design of computer architectures inspired by the human brain's biological networks. Unlike traditional processors that follow the von Neumann model, where memory and processing are separated, neuromorphic processors integrate both functions, allowing them to compute and store information in the same place.
This approach mirrors how neurons and synapses operate in the brain, with billions of tiny nodes processing signals in parallel rather than sequentially. The result is a system capable of adapting, learning, and responding in real time, making it a major breakthrough for edge AI use cases in robotics, mobile devices, and IoT systems.
These brain-inspired chips, designed for low-power computing, aim to bring efficient, on-device intelligence to everyday electronics.

How Do Neuromorphic Chips Work Like the Human Brain?​

The foundation of neuromorphic computing lies in spiking neural networks (SNNs). Instead of transmitting continuous streams of data like conventional machine learning algorithms, SNNs communicate using discrete electrical pulses called "spikes." Each spike represents an event, such as a change in sensor input, allowing the chip to process only relevant information.
This event-driven model drastically reduces energy consumption because computation occurs only when needed. In environments where devices must make fast, autonomous decisions, such as self-driving vehicles, robotics, or industrial IoT systems, spiking neural networks improve efficiency and responsiveness.

By mimicking this sparse, asynchronous communication pattern, neuromorphic chips avoid the heavy power demands of traditional AI accelerators like GPUs. They also enable continuous learning in dynamic environments without extensive retraining, offering a potential path toward truly adaptive, energy-efficient machines.
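As a concrete illustration of the event-driven idea, here is a minimal sketch of a single leaky integrate-and-fire (LIF) neuron in Python; the threshold, leak factor, input weight and spike rate are arbitrary illustrative values, not parameters of any particular chip.

```python
# Minimal leaky integrate-and-fire (LIF) neuron driven by a sparse event stream.
# All constants are illustrative assumptions, not values from any real hardware.
import numpy as np

class LIFNeuron:
    def __init__(self, threshold=1.0, leak=0.9):
        self.threshold = threshold  # membrane potential needed to emit a spike
        self.leak = leak            # per-timestep decay of the membrane potential
        self.potential = 0.0

    def step(self, weighted_input):
        """Integrate one timestep of input; return 1 if the neuron spikes."""
        self.potential = self.potential * self.leak + weighted_input
        if self.potential >= self.threshold:
            self.potential = 0.0    # reset after firing
            return 1
        return 0

# Sparse, event-like input: mostly zeros, occasional events (~20% of timesteps).
rng = np.random.default_rng(0)
events = (rng.random(200) < 0.2).astype(float)

neuron = LIFNeuron()
spikes = [neuron.step(0.6 * e) for e in events]
print(f"{int(events.sum())} input events -> {sum(spikes)} output spikes")
```

The point of the sketch is that the neuron only does meaningful work when an event arrives; between events its state simply decays.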

What Makes Neuromorphic Processors More Efficient Than GPUs or CPUs?​

Traditional AI hardware, including CPUs and GPUs, excels in performing repetitive matrix operations. However, they are constrained by constant data movement between memory and processing units, a major energy and speed bottleneck. Neuromorphic processors, in contrast, eliminate this inefficiency by bringing computation closer to memory.
Since SNNs activate only when new data arrives, neuromorphic cores can remain idle most of the time, drastically reducing power usage. This is why brain-inspired chips with low power consumption are seen as key enablers for edge AI and IoT applications, where sustained cloud connectivity or large battery capacity is impractical.
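The sparsity argument can be sketched in a few lines: a dense matrix-vector product touches every weight regardless of activity, while an event-driven update touches only the synapses of inputs that actually spiked. The layer sizes and spike rate below are arbitrary assumptions chosen to make the contrast visible; this is an operation count, not a hardware benchmark.

```python
# Rough operation-count comparison: dense matrix-vector product vs.
# event-driven accumulation over only the active (spiking) inputs.
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out = 1024, 256
weights = rng.standard_normal((n_out, n_in))

spikes = rng.random(n_in) < 0.02          # ~2% of inputs active this timestep
dense_input = spikes.astype(float)

# Dense path: every weight participates, regardless of activity.
dense_out = weights @ dense_input
dense_ops = n_out * n_in

# Event-driven path: accumulate only the columns of active inputs.
event_out = np.zeros(n_out)
active = np.flatnonzero(spikes)
for i in active:
    event_out += weights[:, i]
event_ops = n_out * len(active)

assert np.allclose(dense_out, event_out)   # same result, far fewer operations
print(f"dense MACs: {dense_ops}, event-driven MACs: {event_ops} "
      f"({event_ops / dense_ops:.1%} of dense)")
```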
Furthermore, neuromorphic systems can scale efficiently. Instead of increasing clock speeds or processing cores, designers can expand the network by adding more interconnected neurons, much like biological brains evolve with experience.
This architectural flexibility allows neuromorphic hardware to multitask, adapt to sensory data, and learn in ways that traditional architectures cannot match.

Neuromorphic Chips in Practice: Examples and Innovations​

Several technology companies and research centers are pushing the boundaries of neuromorphic hardware. Among them, Intel Loihi stands out as a leading experimental platform. The Intel Loihi neuromorphic applications project explores how event-based computation can accelerate tasks such as gesture recognition, robotic navigation, and sensory processing.
Loihi chips use a mesh of artificial neurons capable of asynchronous communication. Each neuron's state evolves over time, responding to spikes from connected neurons and updating its internal parameters in real time. This structure allows Loihi to perform learning without requiring extensive cloud training, a crucial advantage for autonomous edge systems.
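The local, on-chip learning described above is typically built from rules such as spike-timing-dependent plasticity (STDP), where each synapse updates itself using only the timing of its own pre- and post-synaptic spikes. The sketch below is a generic pair-based STDP rule with made-up constants; it is not Loihi's actual learning engine.

```python
# Generic pair-based STDP sketch: exponentially decaying spike traces drive
# local weight updates. Constants and trace dynamics are illustrative assumptions.
import numpy as np

def stdp_update(w, pre_spike, post_spike, pre_trace, post_trace,
                a_plus=0.02, a_minus=0.01, decay=0.95):
    # Traces remember recent spike activity at this synapse.
    pre_trace = pre_trace * decay + pre_spike
    post_trace = post_trace * decay + post_spike
    # Pre-before-post strengthens the synapse; post-before-pre weakens it.
    w += a_plus * pre_trace * post_spike
    w -= a_minus * post_trace * pre_spike
    return float(np.clip(w, 0.0, 1.0)), pre_trace, post_trace

w, pre_tr, post_tr = 0.5, 0.0, 0.0
rng = np.random.default_rng(2)
for t in range(100):
    pre = int(rng.random() < 0.1)
    # Post spikes that tend to follow pre activity lead to net potentiation.
    post = int(pre and rng.random() < 0.8)
    w, pre_tr, post_tr = stdp_update(w, pre, post, pre_tr, post_tr)
print(f"synaptic weight after 100 steps: {w:.3f}")
```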
Other notable projects include IBM's TrueNorth, designed for large-scale neural simulations, and BrainChip's Akida, which targets real-time edge processing in embedded IoT devices.
These chips demonstrate how neuromorphic processors in robotics and autonomous systems can operate independently from centralized computing networks while delivering high-speed perception and decision-making.

Applications in Robotics, IoT, and Edge AI​

The most immediate beneficiaries of neuromorphic computing are likely to be robots, autonomous systems, and IoT devices operating at the network edge. These platforms demand real-time perception with minimal power consumption, an ideal match for neuromorphic architectures.
In robotics, neuromorphic chips enable adaptive control systems that can adjust movement or coordination on the fly. For instance, a drone equipped with neuromorphic vision could identify obstacles or track moving objects using only milliseconds of processing time.
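A rough sense of why event-based vision is cheap for tasks like the drone example above: only pixels whose brightness changes beyond a threshold generate events, so a mostly static scene produces very little work. The frame size and threshold in this sketch are illustrative assumptions, not parameters of a real event camera.

```python
# Toy DVS-style event generation: compare two frames and emit events only
# for pixels that changed beyond a threshold.
import numpy as np

rng = np.random.default_rng(3)
prev_frame = rng.integers(0, 256, size=(120, 160)).astype(np.int16)

# Next frame: mostly identical, with a small bright object entering one corner.
next_frame = prev_frame.copy()
next_frame[10:20, 10:20] += 60

THRESHOLD = 30
diff = next_frame - prev_frame
events = np.argwhere(np.abs(diff) > THRESHOLD)   # (row, col) of changed pixels

total_pixels = prev_frame.size
print(f"{len(events)} events out of {total_pixels} pixels "
      f"({len(events) / total_pixels:.2%}) need processing")
```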
In the IoT landscape, millions of connected sensors must capture data continuously while running on limited battery power. Embedding brain-inspired chips with low-power operation allows these devices to process information locally, transmit only essential insights, and function efficiently without cloud reliance.
Additionally, autonomous vehicles and industrial automation systems benefit from the chips' ability to perform complex motion detection, speech recognition, and environmental mapping, all while consuming a fraction of the energy required by traditional AI processors. Such capabilities align perfectly with edge AI IoT strategies focused on distributed, sustainable intelligence.
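The "process locally, transmit only essential insights" pattern described in the IoT paragraph above can be sketched as a simple sensor loop that keeps a running baseline and wakes the radio only for significant deviations. The threshold, baseline update rate, and the send_alert call are hypothetical placeholders.

```python
# Battery-friendly sensor loop: adapt a local baseline, transmit only anomalies.
import random

THRESHOLD = 2.0        # only report readings that deviate this much from baseline
baseline = 20.0
transmitted = 0
readings = [20.0 + random.gauss(0, 0.5) for _ in range(97)] + [26.0, 27.5, 25.8]

for value in readings:
    if abs(value - baseline) > THRESHOLD:
        transmitted += 1                           # radio wakes up only for anomalies
        # send_alert(value)                        # hypothetical uplink call
    baseline = 0.99 * baseline + 0.01 * value      # slow local adaptation

print(f"{len(readings)} samples processed locally, {transmitted} transmitted")
```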

Real-World Examples of Neuromorphic Computing​

The progress in neuromorphic computing is already visible in early prototypes and pilot programs across industries:
  • Healthcare: Sensor-embedded wearable devices using neuromorphic processing can monitor vital signs, detect anomalies, and adapt alerts in real time.
  • Smart cities: Edge nodes equipped with neuromorphic processors help manage energy use, traffic congestion, and environmental monitoring with minimal latency.
  • Manufacturing: Neuromorphic vision systems allow robotic arms to identify defects, align components precisely, and optimize workflow through on-site learning.
  • Mobile technology: Research suggests that smartphones with integrated neuromorphic co-processors could deliver on-device AI capabilities like image understanding and voice recognition without constant cloud access.
These neuromorphic applications showcase how spiking-based computation can transform diverse fields that require intelligence at the edge with strict energy budgets.

Advantages and Challenges of Neuromorphic Chips​

Benefits
  • Energy efficiency: Event-driven computation consumes significantly less power.
  • Real-time adaptability: Systems can learn from experience and respond immediately.
  • Scalability: Networks can grow organically, similar to biological systems.
  • Privacy: On-device learning reduces data transfer, improving data security for IoT.
Challenges
  • Programming complexity: SNNs require specialized software frameworks distinct from standard AI libraries.
  • Hardware cost and maturity: Neuromorphic chips remain mostly in research or prototype phases.
  • Standardization: A lack of unified benchmarks and interoperability limits broader industry adoption.
These factors suggest that while neuromorphic computing promises breakthroughs in efficiency and autonomy, it will likely coexist with traditional AI accelerators for years to come.

Will Neuromorphic Chips Revolutionize Artificial Intelligence?​

Many researchers view neuromorphic computing as a natural evolution of AI hardware rather than a replacement. It complements deep learning models by offering adaptive, low-power computing suitable for real-world environments.
When combined with traditional cloud AI, neuromorphic edge devices could create hybrid systems capable of both high-level reasoning and real-time, local decision-making.
In the longer term, the integration of neuromorphic principles into consumer electronics could redefine what "smart" means in smartphones, wearables, and household devices. Rather than responding to pre-trained commands, these gadgets could continuously learn and adapt to user behavior, mirroring aspects of human cognition.

Toward Smarter, More Human Machines​

The field of neuromorphic computing is still in its early stages, but its potential is unmistakable. By building machines that process information like the human brain (efficiently, in parallel, and adaptively), engineers can unlock a new generation of intelligent, self-learning systems.
As innovation continues around Intel Loihi neuromorphic applications, spiking neural networks, and brain-inspired low-power chips, the next decade may witness a transition from cloud-dependent AI to distributed, energy-efficient cognitive computing. Phones, robots, and IoT devices might soon think and respond more like humans, quietly, efficiently, and always learning.

Frequently Asked Questions​

1. How are neuromorphic chips different from quantum computers?​

Neuromorphic chips mimic the human brain to perform adaptive, low-power tasks, while quantum computers use qubits to handle massive, complex calculations. Neuromorphic computing suits edge AI and IoT; quantum computing targets scientific and optimization problems.

2. Can neuromorphic chips work with AI frameworks like TensorFlow or PyTorch?​

Not directly. Neuromorphic systems use spiking neural networks, which differ from conventional models, but emerging tools like Intel's Lava are improving compatibility and integration.
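One common bridge between conventional frameworks and spiking hardware is rate coding, in which a continuous activation is approximated by the average firing rate of a spike train over a time window. The sketch below shows the idea in plain Python/NumPy; it is not the conversion pipeline of Lava or any specific toolkit, just an illustration of why the mapping is indirect.

```python
# Rate coding: approximate a continuous activation in [0, 1] by a spike train
# whose mean firing rate matches the activation.
import numpy as np

def rate_encode(activation, timesteps=1000, rng=None):
    """Return a Bernoulli spike train whose firing rate approximates `activation`."""
    rng = rng if rng is not None else np.random.default_rng()
    return (rng.random(timesteps) < activation).astype(int)

activations = np.array([0.05, 0.4, 0.9])        # e.g. outputs of a ReLU layer
trains = [rate_encode(a) for a in activations]
recovered = [train.mean() for train in trains]   # firing rate ~ original value
print([f"{a:.2f} -> {r:.2f}" for a, r in zip(activations, recovered)])
```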

3. What materials make neuromorphic chips possible?​

They often use memristors and non-volatile memory materials, which act like synapses by retaining past electrical states, key for energy-efficient learning and processing.
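A toy model of that "retained state" property: a memristor-like synapse whose conductance is nudged by programming pulses and persists between reads. The linear drift model and constants below are illustrative assumptions, not a physical device model.

```python
# Toy memristive synapse: conductance encodes the weight, is updated by
# programming pulses, and is read non-destructively via Ohm's law.
class MemristiveSynapse:
    def __init__(self, g_min=0.01, g_max=1.0):
        self.g_min, self.g_max = g_min, g_max
        self.g = g_min                      # conductance = stored synaptic weight

    def apply_pulse(self, voltage, dt=1e-3, k=5.0):
        """A programming pulse nudges the conductance up or down."""
        self.g = min(max(self.g + k * voltage * dt, self.g_min), self.g_max)

    def read_current(self, voltage=0.1):
        """Reading is just I = G * V; it does not erase the stored state."""
        return self.g * voltage

syn = MemristiveSynapse()
for _ in range(50):
    syn.apply_pulse(+1.0)                   # repeated potentiation pulses
print(f"conductance after pulses: {syn.g:.3f} S")
print(f"read current at 0.1 V:   {syn.read_current():.4f} A (state persists)")
```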

4. When will neuromorphic technology reach consumer devices?​

Experts predict early consumer use by the late 2020s or early 2030s, starting with smart sensors, wearables, and autonomous devices that need on-device intelligence.


 

manny100

Top 20
November 7, 2025 - Dan Ganousis, Brainchip — Ag Robotics Working Group
Dan presented a video for BrainChip on 7 November 2025 at the Ag Robotics conference (link above).
It's very good.
He is trying to organise the 'independents' who cannot compete on AI alone to form a group and fund AI together - an agricultural co-op makes sense for this.
An interesting watch, with presentation slides.
He actually tells them how to do it and, from memory, offers to help.
See a recent LinkedIn post by Dan:
"Nobody is going to go broke due to power limitations and lack of access to GPUs. Nvidia will recede due to neuromorphic chips that think like our brains."
I remember FF got shat on from a great height by many for suggesting AKIDA may well muscle in on NVIDIA - had to laugh as Dan is now on that bandwagon, and he is highly qualified and experienced.
POST below:

Dan Ganousis • 3rd+

AI, EDA and Silicon IP Sales and Marketing
2w


Everyone assumes AI is all about GPUs and huge power consumption. That is NOT the case for inferencing in PCs and edge datacenters where neuromorphic computing is quickly being adopted.

PCs need GOPs not TOPs and Nvidia knows it ... The era of model training is ending.

Anyone still using a Sun SPARCstation?? 20 years ago ALL engineering ran on SPARC. Along came Linux and boom! Disruption and the disappearance of Scott McNealy.

History repeats itself as GPUs, like SPARC, are replaced with non-proprietary inference accelerators. One TOPS for one watt.

Nobody is going to go broke due to power limitations and lack of access to GPUs. Nvidia will recede due to neuromorphic chips that think like our brains



Paul Nation • 3rd+

1w


Dan Ganousis, I have already built a proven, documented prototype in a laptop.
 

TECH

Regular
Happy New Year All (y)

Casting our net far and wide, worldwide, has been a great strategy. Whether a company is large or small is irrelevant. Why? Because none of us (including our company) can predict the future; any innovator can develop a world-changing invention at any time, whether in stealth mode or not, and the next minute a big whale starts circling. We must and do support ALL developers.

My view is that during the next 6 months a lot of news is going to drop, BUT that doesn't automatically translate into a big share price rise. We have seen numerous times positive news released, then shorts or whomever attempt to block positive sentiment. Why? Because any educated holder knows what we have, who we are engaged with and where we are heading (without a clock). So all I'm trying to say is: yes, the news flow is coming, but the growth based on revenue isn't going to appear until the period July 2026 through June 2027 and beyond.

Very happy to be proven wrong, but I'm just trying to keep things realistic: no revenue increase, no share price increase. And yes, I did chat with our Founder on Christmas Eve, and he's still contributing to our company, as is Anil. Why? Because they believe in what they both created... brilliant disruptive technology that will benefit the masses.

Integrity still exists, trust your investment, despite the pressure of negative thoughts.

Tech 2026.
 

Iseki

Regular
TECH said:
[The "Happy New Year" post above, quoted in full.]
This is very strange indeed.

Many of us have been supporting BRN for years and have seen some very bad mistakes made.

Make no bones about it - The question to ask the founder is and has always been:
Has Akida been commercialized into an actual end-user product yet? If not, how long before that will happen?

It's not right to say you've spoken with the founder and they still believe in Akida, so you will too, and hey, it will benefit the masses to boot.

They spent 4 years trying to sell licenses; now we're chipping in extra cash because that was wrong. So now we need to make some Akida chips (tape-out, fab), sell them off the shelf, and hope someone
1. puts in a large order,
2. integrates the chip into an end-user product, and
3. the end-user product sells so well that the OEM can put in a recurring order for millions.

We all know this process will take years. There is no June timeframe that can work for this.

We hired a lot of ex-Arm employees because they sell licenses too. We even had strange posters telling us day and night that Arm was licensing Akida (whole threads were set up) - for how else could Arm be making AI chips? And not a wise word from you to point out how silly that notion was.

You cannot help the founders by avoiding the issues facing the company. Your warm relationship with them is misplaced unless it can be used to tell them that hiring more marketing people, and more computer scientists to create SNN models that are left with BRN because no one else wants to pay for them, won't work.

Mistakes like these, along with the strange idea that we redomicile in the US (thank you FF for stepping in and telling the founder that this would be a mistake), are damaging the company's integrity. Swift and radical action needs to be taken now to ensure the damage is not irreversible.
 

manny100

Top 20
TECH said:
[The "Happy New Year" post above, quoted in full.]
Yes, the nature of the business is that there is a time delay between deals and revenue. The market is future-oriented, so we should see SP rises when deals are signed.
I do not think we are that far from deals in 2026.
I still see the portfolio's safety-net value as backed by:
1. Forecast exponential growth in Edge AI.
2. BRN is the current leader with a visible roadmap showing continual tech improvements.
3. Raytheon/USAFRL, Parsons, Frontgrade Gaisler, Bascom Hunter, Onsor and our connection with Lockheed Martin/QV Cybersecurity validate the tech.
4. The growing portfolio, with soon-to-be-released GenAI/LLMs, Gen3 and Provenance, further extends our lead, and
5. Further deals this year will provide further portfolio value and investor interest.
 

TECH

Regular
Iseki said:
[The "This is very strange indeed" reply above, quoted in full.]
I respect your opinion/s.

Let's just see how our year unfolds. Have we made poor business decisions? I have personally said yes, at different points over the years. Have funds been wasted at times? Yes, I believe they have. But I keep looking forward; positive changes have been made, in my opinion, and we have progressed as a company. All our current engagements aren't fluff, they're fact. Living in the past achieves nothing.

Please respect my opinion/s.

Regards ...Tech.
 
