BRN Discussion Ongoing

7für7

Top 20
Someone must have bought 800 shares
801 to be precise 😑☝️
And predict a total volume of 801 1/2239
 
  • Haha
Reactions: 2 users

Tothemoon24

Top 20
Interesting like on Steve Brightfield’s post 🔜💰

IMG_1971.jpeg

IMG_1972.jpeg

IMG_1973.jpeg

IMG_1974.jpeg
 
  • Like
  • Fire
  • Love
Reactions: 11 users

Diogenese

Top 20
Hi TTM,

Just a reminder, this article describes the BRN exhibits for CES2026:

https://www.design-reuse.com/news/2...nding-ahead-of-ces-to-power-next-gen-edge-ai/

BrainChip Announces $25 Million (USD) Funding Ahead of CES to Power Next-Gen Edge AI​


Dec. 12, 2025 –

Funding accelerates commercialization of BrainChip’s on-device solutions with Akida 2 and Akida GenAI delivering smarter and faster solutions​

LAGUNA HILLS, Calif.-- BrainChip Holdings Ltd. (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of ultra-low-power, fully digital, event-based neuromorphic AI, secured a capital raise of $25 million to fuel the development and commercialization of its neuromorphic AI technology and expansion of its product offerings in chips and modules.

...
At CES, BrainChip will put its latest AI innovations on display, highlighting how the funding enables robust capabilities and partnerships such as:

  • AKD1500 modules to enable AI for rugged deployments with Industrial PCs
  • Always-on enablement with Pico evaluations on Akida Cloud
  • AI-powered cybersecurity with Akida with Quantum Ventura partnership
  • BrainChip 1.2 B LLM goes on-device in mobile and embedded devices
...
BrainChip will present cutting-edge demonstrations of the AKD1500 and AKD1000 in collaboration with various partners.

HaiLa Technologies will present ultra-low power Bluetooth and Wi-Fi integration with the AKD1500 for wearable visual classification.

Deep Perception will demonstrate a full visual compute pipeline using the AKD1000 for drones and mobile devices.

Quantum Ventura will highlight its Neuro RT cybersecurity model running on the Akida Edge AI Box, showing how it can protect small office networks.
...

The QV CyberNeuro RT is encouraging as it means that the models are built and ready to run. The Edge Box will sell for $US1500, and I reckon the Neuro RT software will add a chunk to that. It is being pitched as business software. I would suspect that QV would be responsible for the software maintenance, although it may utilize federated learning to distribute any on-chip ML updates, so BRN may have a role in the maintenance.

PS: I'm still hoping for a USB stick for individual devices.
 
  • Like
  • Fire
  • Love
Reactions: 18 users

Sirod69

bavarian girl ;-)
1767028399504.png
 
  • Like
  • Love
  • Fire
Reactions: 19 users

FJ-215

Regular
Everything old????

BrainChip Signs Commercial Marketing Agreement with Global IP Supplier

ASX PRESS RELEASE
______________________________________________________________________________

5 August 2016
BrainChip Holdings Ltd (“BrainChip” or the “Company”) (ASX: BRN) is pleased to announce that it has signed a global commercial marketing agreement with T2M UG (T2M) of Munich, Germany.

T2M is a well-established global business that provides a “feet on the ground” presence in the leading semiconductor business centers around the world and is highly experienced in integrating new hi-tech IP into foundry and client infrastructures.

BrainChip’s stated aim of developing strong relationships with key industry participants is further enhanced by forming this business relationship. The T2M team is poised to embark on a global rollout of the BrainChip products and IP to its client base over the coming months, and we expect to see some tangible benefits in the near term.

Nigel Dixon, CEO of T2M, said: “We are delighted to represent BrainChip and SNAP to our global clients. The SNAP engine is a ground-breaking technology that has the potential to cause industry-wide change that we look forward to being a part of.”
 
  • Thinking
  • Fire
  • Haha
Reactions: 6 users
FF
Brainchip’s main competitor Intel facing further challenges:

 
  • Like
  • Wow
Reactions: 6 users

Frangipani

Top 20
(…) I came across the name Fernando Sevilla Martínez before, in connection with Raúl Parada Medina, whom I first noticed liking BrainChip LinkedIn posts more than a year ago (and there have been many more since… 😊).

Given that Raúl Parada Medina describes himself as an “IoT research specialist within the connected car project at a Spanish automobile manufacturer”, I had already suggested a connection to the Volkswagen Group via SEAT or CUPRA at the time.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-424590

View attachment 88422
View attachment 88424





View attachment 88425

Extremely likely the same Raúl Parada Medina whom you recently spotted asking for help with Akida in the DeGirum Community - very disappointingly, no one from our company appears to have been willing to help solve this problem for more than 3 months!

Why promote DeGirum for developers wanting to work with Akida and then not give assistance when needed? Not a good look, if we are to believe shashi from the DeGirum team, who wrote on February 12 he would forward Parada’s request to the BrainChip team, but apparently never got a reply.

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-461608

View attachment 88428

The issue continued, until it was eventually solved on 27 May by another DeGirum team member, stephan-degirum (presumably Stephan Sokolov, who recently demonstrated running the DeGirum PySDK directly on BrainChip hardware at the 2025 Embedded Vision Summit - see the video here: https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-469037)





raul.parada.medina
May 27

Hi @alex and @shashi, thanks for your reply; it looks like there is no update from BrainChip in this sense. Please, could you tell me how to upload this model to the platform? Age estimation (regression) example — Akida Examples documentation. Thanks!

1 Reply



shashiDeGirum Team
May 27

@stephan-degirum
Can you please help @raul.parada.medina ?




stephan-degirum
May 27

Hello @raul.parada.medina, conversion of a model from BrainChip’s model zoo into our format is straightforward:
Once you have an Akida model object, like Step 4 in the example:
model_akida = convert(model_quantized_keras)

You’ll need to map the model to your device and then convert it to a compatible binary:


from akida import devices

# Map model onto your Akida device
dev = devices()[0]
try:
    model_akida.map(dev, hw_only=True)
except RuntimeError:
    model_akida.map(dev, hw_only=False)

# Extract the C++-compatible program blob
blob = model_akida.sequences[0].program
with open("model_cxx.fbz", "wb") as f:
    f.write(blob)

print("C++-compatible model written to model_cxx.fbz")

Note: You want to be sure that the model is supported on your Akida device. There are many models on the BrainChip model zoo that are not compatible with their “version 1 IP” devices.
If your device is a v1 device, you’ll need to add a set_akida_version guard:

from cnn2snn import convert, set_akida_version, AkidaVersion
# Convert the model
with set_akida_version(AkidaVersion.v1):
    model_akida = convert(model_quantized_keras)
model_akida.summary()

from akida import devices
# Map model onto your Akida device
# ... (see above)

for more information on v1/v2 model compatibility please see their docs: Akida models zoo — Akida Examples documentation

Once you have a model binary blob created:

Create a model JSON file adjacent to the blob by following Model JSON Structure | DeGirum Docs or by looking at existing BrainChip models on our AI Hub for reference: https://hub.degirum.com/degirum/brainchip
  • ModelPath is your binary model file
  • RuntimeAgent is AKIDA
  • DeviceType is the middle part of the akida devices output, in all caps. For example, if akida devices shows PCIe/NSoC_v2/0, you put NSOC_V2.
Your JSON + binary model blob are now compatible with PySDK. Try running the inference on your device locally by specifying the full path to the JSON as a zoo_url, see: PySDK Package | DeGirum Docs
“For local AI hardware inferences you specify the zoo_url parameter as either a path to a local model zoo directory, or a path to the model’s .json configuration file.”
You can then zip them up and upload them to your model zoo in our AI Hub.
Let me know if this helped.
P.S. we currently have v1 hardware in our cloud farm, and this model is the age estimation model for NSoC_v2:
https://hub.degirum.com/degirum/brainchip/vgg_regress_age_utkface--32x32_quant_akida_NSoC_1
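Putting the steps above together, here is a minimal sketch in Python (the field names ModelPath, RuntimeAgent and DeviceType are as listed above; the file name and device string are placeholders, and the full JSON schema is in the linked Model JSON Structure docs):

```python
import json

# Derive DeviceType: the middle part of the `akida devices` output, upper-cased.
device_str = "PCIe/NSoC_v2/0"        # placeholder: example `akida devices` output
device_type = device_str.split("/")[1].upper()

# Minimal model JSON placed next to the binary blob; only the three fields
# named above are shown (see the Model JSON Structure docs for the rest).
model_config = {
    "ModelPath": "model_cxx.fbz",    # the binary model blob written earlier
    "RuntimeAgent": "AKIDA",
    "DeviceType": device_type,       # e.g. "NSOC_V2"
}

with open("model_cxx.json", "w") as f:
    json.dump(model_config, f, indent=2)

print(device_type)  # NSOC_V2
```

Zipping model_cxx.json together with model_cxx.fbz should then give an archive ready for upload to the AI Hub model zoo, as per the last step.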


Anyway, as you had already noticed in your first post on this DeGirum enquiry, Raúl Parada Medina (assuming it is the same person, which I have no doubt about) and Fernando Sevilla Martínez are both co-authors of a paper on autonomous driving:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-450543

View attachment 88429

In fact, they have co-published two papers on autonomous driving, together with another researcher: Jordi Casas-Roma. He is director of the Master in Data Science at the Barcelona-based private online university Universitat Oberta de Catalunya, the same department where Fernando Sevilla Martínez got his Master’s degree in 2022 before moving to Wolfsburg the following year, where he now works as a data scientist at the headquarters of the Volkswagen Group.


View attachment 88430


View attachment 88426 View attachment 88427
Speaking of Raúl Parada Medina:

On 13 February I took this screenshot, but never followed up on it:

View attachment 88435

View attachment 88434

It appears the planned WISSA workshop, as well as some others, was eventually cancelled, as only three of the scheduled workshops actually took place in late June:

https://www.ie2025.fraunhofer.de/workshops/

View attachment 88436

Nevertheless, it is evidence that Raúl Parada Medina’s work is also relevant in the field of smart agriculture.


Prior to that, he was part of the 5GMED project that ran from September 2020 to August 2024 (sorry, don’t have the time right now to look up the individual links - all the following screenshots were taken in mid-February).


View attachment 88437 View attachment 88438 View attachment 88439 View attachment 88440 View attachment 88441 View attachment 88442
6d1b6b4f-ec16-481d-97a0-eb80b55872eb-jpeg.88443

Keep in mind that Raúl Parada Medina has a telecommunications background and works as a Senior Researcher for CTTC in Castelldefels near Barcelona, the Centre Tecnològic de Telecomunicacions de Catalunya.

So when he describes himself as an “IoT research specialist within the connected car project at a Spanish automobile manufacturer”, the emphasis is on “connected” rather than on “automobile”. Therefore, any upcoming research projects may not involve cars at all.




Elsevier

Internet of Things

Available online 29 December 2025, 101862

Neuromorphic Solar Edge AI for Sustainable Wildfire Detection​


Author
Raúl Parada
Centre Tecnològic de Telecomunicacions de Catalunya (CTTC/CERCA), Castelldefels, 08860, Catalonia, Spain



https://doi.org/10.1016/j.iot.2025.101862

Highlights​

  • Neuromorphic edge AI drones achieve 87% solar autonomy for wildfire detection
  • BrainChip Akida enables 4,200 patrol hours/year, 3× longer than CPU-based systems
  • Fleet scaling reduces detection latency from 18 hours (1 drone) to 2.2 hours (8 drones)

Abstract​

This paper presents a feasibility study of a solar-autonomous wildfire detection system using neuromorphic edge AI on fixed-wing drones. Through a comprehensive year-long simulation over Parc del Garraf (Catalonia), we evaluate three edge computing platforms, Raspberry Pi 4, Google Coral TPU, and BrainChip Akida, integrated into solar-optimized eBee X drones.

Results show that the BrainChip Akida achieves 4,200 patrol hours per year, nearly three times that of traditional CPU systems, while maintaining 87% solar energy autonomy. The Google Coral TPU and Raspberry Pi 4 reach 66% and 52% autonomy, respectively. Fleet scaling analysis demonstrates that increasing drone count from one to eight reduces median wildfire detection time from 18 to 2.2 hours, surpassing critical response thresholds. Seasonal analysis reveals Akida-based systems can operate fully on solar energy during summer and most of spring and fall, minimizing grid dependency. These findings establish neuromorphic computing as a foundational technology for sustainable, perpetual environmental monitoring within the Internet of Robotic Things (IoRT).
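As a quick back-of-envelope check of the fleet-scaling figures quoted in the abstract, assuming (my assumption, not a model stated in the abstract) that median detection latency scales roughly inversely with drone count:

```python
# Assumption: median detection latency ~ (single-drone latency) / (fleet size).
single_drone_latency_h = 18.0  # median detection time with one drone, per the abstract

for n in (1, 2, 4, 8):
    print(f"{n} drone(s): ~{single_drone_latency_h / n:.1f} h")
```

The one-drone and eight-drone endpoints reproduce the 18 h and ~2.2 h figures from the abstract, so the reported scaling is at least consistent with a simple inverse law.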

Graphical abstract​

1-s2.0-S2542660525003762-ga1.jpg


Introduction​

Wildfires are now responsible for billions of dollars in annual economic losses and hundreds of lives globally. In the Mediterranean basin, over 500,000 hectares are burned yearly, often with delayed detection times exceeding 12 hours. These delays critically hinder suppression efforts and exacerbate damage. As climate change intensifies droughts and heatwaves, the urgency for faster, autonomous detection systems grows [1]. Mediterranean ecosystems, in particular, face growing risk as traditional fire detection systems, relying on satellite imagery, human patrols, or fixed ground sensors, struggle with temporal latency, limited spatial coverage, and infrastructure dependency. Other proposals incorporate artificial intelligence (AI) and fifth generation (5G) for wildfire control [2], but such approaches still rely heavily on communication infrastructure.

Recent advances in unmanned aerial vehicles (UAVs) have opened new possibilities for continuous, real-time environmental monitoring. However, most current drone-based detection systems are hindered by significant energy limitations, requiring frequent manual recharging or access to grid-based infrastructure. This severely restricts their autonomy and scalability, especially in remote or high-risk natural environments where rapid response is critical.

The Internet of robotic things (IoRT) envisions fully autonomous, intelligent agents operating over long durations without human intervention. Achieving this vision in the context of wildfire monitoring demands breakthroughs in two main areas: (i) onboard decision-making through efficient Edge AI [3], and (ii) self-sustaining energy systems. While Edge AI reduces reliance on cloud connectivity and improves latency, traditional computing platforms such as CPUs or even typical TPUs consume too much power for extended operation. As such, they cannot meet the demands of uninterrupted surveillance missions without significant energy support.

Neuromorphic computing offers a compelling solution. By mimicking the efficiency of biological brains [4], neuromorphic processors such as BrainChip Akida enable ultra-low-power inference, operating at a fraction of the energy cost of conventional architectures. Coupled with solar energy harvesting, this opens the door to continuous aerial monitoring, without external energy input, for the first time to the best of our knowledge.


This work addresses the critical challenge of enabling truly autonomous environmental monitoring drones by combining neuromorphic Edge AI and solar energy systems. The core objectives of our study are:
  1. To quantify and compare the energy sustainability potential of neuromorphic and traditional edge computing platforms when deployed on UAVs in realistic wildfire surveillance scenarios.
  2. To simulate year-round operations using real solar irradiance and environmental data, modelling energy harvesting and consumption dynamics in detail.
  3. To establish practical benchmarks for operational availability, solar autonomy, and wildfire detection performance across different hardware configurations.
The rest of this paper is structured as follows: Section 2 reviews related work in UAV-based wildfire detection and sustainable edge AI. Section 3 details the simulation framework, hardware platforms, energy modeling, and fleet scaling strategies. Section 4 presents and analyzes the simulation results, including mission time allocation, solar autonomy, seasonal variations, and cost-effectiveness. Section 5 discusses the broader implications of these results for sustainable IoRT systems. Finally, Section 6 concludes the paper and outlines future research directions.






Section snippets


Related Work​

Recent years have witnessed significant progress in UAV-based wildfire detection, AI-enabled edge processing, and neuromorphic computing for environmental monitoring. Here, we provide a structured comparison of leading approaches and clarify the main research gaps that remain. …

Methodology​

This section presents the complete simulation methodology designed to evaluate the energy autonomy, detection capacity, and mission viability of solar-powered drones equipped with different edge AI platforms, including neuromorphic computing. The simulation is implemented using Python and is verifiable via the accompanying codebase. It models the performance of drones operating autonomously over a full year, incorporating solar harvesting dynamics, energy consumption profiles, patrol logic, and…

Results and Analysis​

This section presents the key simulation outcomes comparing the Raspberry Pi 4, Google Coral TPU, and BrainChip Akida. The results show that the Akida platform achieves 87% solar autonomy and 4,200 patrol hours per year—three times that of CPU-based systems. Analysis includes patrol time distribution, detection efficiency, seasonal performance, and cost-effectiveness. A fleet scaling study shows exponential reductions in wildfire detection latency with increasing drone count, achieving optimal…

Discussion and Implications​

This section reflects on the broader implications of the simulation results, emphasizing the technological, environmental, and operational significance of solar-autonomous UAV systems. By analyzing how neuromorphic computing shifts the boundaries of edge AI deployment, we explore how energy efficiency, fleet scalability, and seasonal synergy contribute to a new paradigm in sustainable wildfire monitoring. Key findings are contextualized within the Internet of Robotic Things (IoRT) framework,…

Conclusions​

This research demonstrates the first viable solar-autonomous wildfire detection system using neuromorphic edge AI, achieving 87% energy self-sufficiency through breakthrough power efficiency improvements. The comprehensive year-long simulation reveals that hardware architecture selection—specifically neuromorphic versus traditional computing—enables qualitatively different operational capabilities in autonomous systems. Key findings include a neuromorphic computing system enabling practical…

CRediT authorship contribution statement​

Raúl Parada: Writing – review & editing, Writing – original draft, Visualization, Validation, Supervision, Software, Resources, Project administration, Methodology, Investigation, Formal analysis, Data curation, Conceptualization. …

Declaration of competing interest​

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. …


References (84)​

 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 34 users

7für7

Top 20


“And I would like to say….To all my friends and enemies….. and particularly to Brad…where is Brad? Good guy… one of the best guys I met…
Let’s make the next year one of the most intelligent years you will ever see…. Because this year was DUM(P)… So dumb I can tell ya! Happy new year!”

Donald Trump Idk GIF by Election 2016



By the way… I don’t think we will hear anything huge from CES2026… because everything that happens in Vegas stays in Vegas! Very smart move from the organisers!
🥸
 
Last edited:
  • Haha
  • Like
  • Thinking
Reactions: 3 users

Frangipani

Top 20
FF
AiLabs, a BrainChip partner, has upgraded their website since I last took a look.
The following link takes you to “News”. They only have two news items, and both relate to BrainChip.
The first one will be well known to genuine shareholders and investors, but I had not seen the second, which has involved quite a bit of work by AiLabs to present:

https://ailabsinc.com/news-event/details2

Not at all.
It was just a simple “copy & paste” job by someone at Ai Labs…

EEAEA927-DC1D-471C-A0EE-AAFD7417A769.jpeg


583BA53F-284E-4C2A-99C0-04B273EBCACB.jpeg





495D0C00-BBC2-4FE2-AC13-87D7A541999A.jpeg
 
  • Fire
  • Like
  • Love
Reactions: 10 users

manny100

Top 20
FF over on the crapper posted a link to a paper titled “PROVENANCE NETWORKS: END-TO-END EXEMPLAR BASED EXPLAINABILITY”.
FF said it was huge, and it really is.
Basically, provenance networks give BrainChip’s AI the ability to show exactly which past examples influenced each decision, delivering built‑in transparency and trust that competitors cannot match.
Provenance networks = AI trust = brand power.
It’s software.
The detail in the paper indicates that work on this project is likely fairly well advanced, but it was published now for strategic reasons.
These will include:
Signals to clients - it’s coming - get on board now.
BrainChip offers safe and explainable AI. You can nail a decision/reason down to a millisecond.
It will be a requirement for defence, auto and health diagnostics, if not now - soon.
Furthers BrainChip’s intellectual and innovative leadership - first to publish.
Work like this attracts top talent.
It appears none of the major AI‑hardware companies (NVIDIA, Intel, Qualcomm, AMD, Google, Apple, etc.) offer anything like provenance networks. What BRN describes in the paper seems unique, and that’s why it’s strategically valuable.
Competitive advantage.
 
  • Like
  • Love
  • Fire
Reactions: 28 users
Any thoughts on this technology at the edge, and whether it could be competition for BRN in the near future?
 

Attachments

  • Screenshot_20251230_162239_LinkedIn.jpg
    Screenshot_20251230_162239_LinkedIn.jpg
    321.5 KB · Views: 53
Last edited:
  • Like
Reactions: 1 users

Rach2512

Regular
Prophesee


 
Top Bottom