(…) I came across the name Fernando Sevilla Martínez before, in connection with Raúl Parada Medina, whom I first noticed liking BrainChip LinkedIn posts more than a year ago (and there have been many more since…).
Given that Raúl Parada Medina describes himself as an “IoT research specialist within the connected car project at a Spanish automobile manufacturer”, I had already suggested a connection to the Volkswagen Group via SEAT or CUPRA at the time.
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-424590
View attachment 88422
View attachment 88424
IoT research specialist within the connected car project at a Spanish automobile… · Experience: CTTC · Education: Universitat Pompeu Fabra - Barcelona · Location: Castelldefels · 500+ connections on LinkedIn. View Raúl Parada Medina, PhD’s profile on LinkedIn, a professional community of 1...
www.linkedin.com
colabscatalunya.cat
View attachment 88425
Extremely likely the same Raúl Parada Medina whom you recently spotted asking for help with Akida in the DeGirum Community - very disappointingly, no one from our company appears to have been willing to help solve this problem for more than 3 months!
Why promote DeGirum for developers wanting to work with Akida and then not give assistance when needed? Not a good look, if we are to believe shashi from the DeGirum team, who wrote on February 12 he would forward Parada’s request to the BrainChip team, but apparently never got a reply.
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-461608
View attachment 88428
The issue remained unresolved until 27 May, when it was finally solved by another DeGirum team member, stephan-degirum (presumably Stephan Sokolov, who recently demonstrated running the DeGirum PySDK directly on BrainChip hardware at the 2025 Embedded Vision Summit - see the video here:
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-469037)
Hi, I’m interested in the neuromorphic hardware available, Akida. How can I create my own code to be launched in the platform? Thanks
community.degirum.com
raul.parada.medina
May 27
Hi @alex and @shashi, thanks for your reply; it looks like there is no update from BrainChip in this sense. Please, could you tell me how to upload this model in the platform? Age estimation (regression) example — Akida Examples documentation. Thanks!
shashi (DeGirum Team)
May 27
@stephan-degirum
Can you please help @raul.parada.medina ?
stephan-degirum
May 27
Hello @raul.parada.medina, conversion of a model from BrainChip’s model zoo into our format is straightforward:
Once you have an Akida model object, like Step 4 in the example:
model_akida = convert(model_quantized_keras)
You’ll need to map the model to your device and then convert it to a compatible binary:
from akida import devices

# Map model onto your Akida device
dev = devices()[0]
try:
    model_akida.map(dev, hw_only=True)
except RuntimeError:
    model_akida.map(dev, hw_only=False)

# Extract the C++-compatible program blob
blob = model_akida.sequences[0].program
with open("model_cxx.fbz", "wb") as f:
    f.write(blob)
print("C++-compatible model written to model_cxx.fbz")
Note: You want to be sure that the model is supported on your Akida device. There are many models on the BrainChip model zoo that are not compatible with their “version 1 IP” devices.
If your device is a v1 device, you’ll need to add a set_akida_version guard:
from cnn2snn import convert, set_akida_version, AkidaVersion

# Convert the model
with set_akida_version(AkidaVersion.v1):
    model_akida = convert(model_quantized_keras)
model_akida.summary()

from akida import devices
# Map model onto your Akida device
# ... (see above)
For more information on v1/v2 model compatibility, please see their docs: Akida models zoo — Akida Examples documentation
Once you have a model binary blob created:
Create a model JSON file adjacent to the blob by following Model JSON Structure | DeGirum Docs or by looking at existing BrainChip models on our AI Hub for reference: https://hub.degirum.com/degirum/brainchip
- ModelPath is your binary model file
- RuntimeAgent is AKIDA
- DeviceType is the middle segment of the akida devices output, in all caps. For example, if akida devices shows PCIe/NSoC_v2/0, you put NSOC_V2.
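The DeviceType rule and the three fields above can be sketched in a few lines of Python. This is a minimal illustration only: the flat key layout below is my assumption, and the real schema is defined in DeGirum's Model JSON Structure docs, so check there before uploading.

```python
import json

def device_type(akida_device_str: str) -> str:
    """Middle segment of e.g. 'PCIe/NSoC_v2/0', upper-cased."""
    return akida_device_str.split("/")[1].upper()

# Hypothetical flat layout; consult the Model JSON Structure
# docs for the actual schema.
model_json = {
    "ModelPath": "model_cxx.fbz",
    "RuntimeAgent": "AKIDA",
    "DeviceType": device_type("PCIe/NSoC_v2/0"),
}

# Write the JSON adjacent to the model blob
with open("model_cxx.json", "w") as f:
    json.dump(model_json, f, indent=2)

print(model_json["DeviceType"])  # NSOC_V2
```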
Your JSON + binary model blob are now compatible with PySDK. Try running the inference on your device locally by specifying the full path to the JSON as a zoo_url, see: PySDK Package | DeGirum Docs
“For local AI hardware inferences you specify zoo_url parameter as either a path to a local model zoo directory, or a path to model’s .json configuration file.”
You can then zip them up and upload them to your model zoo in our AI Hub.
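The zipping step needs nothing beyond the standard library; the file names below are just placeholders for your actual blob + JSON pair, and the dummy files exist only so the sketch runs standalone.

```python
import zipfile

# Placeholder names; substitute your actual blob + JSON pair
files = ["model_cxx.fbz", "model_cxx.json"]

# Create dummy files so this sketch is self-contained
for name in files:
    with open(name, "wb") as f:
        f.write(b"demo")

# Bundle both files into one archive ready for upload
with zipfile.ZipFile("model_cxx.zip", "w") as zf:
    for name in files:
        zf.write(name)

print(zipfile.ZipFile("model_cxx.zip").namelist())
# ['model_cxx.fbz', 'model_cxx.json']
```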
Let me know if this helped.
P.S. We currently have v1 hardware in our cloud farm, and this model is the age estimation model for NSoC_v2: https://hub.degirum.com/degirum/brainchip/vgg_regress_age_utkface--32x32_quant_akida_NSoC_1
Anyway, as you had already noticed in your first post on this DeGirum enquiry, Raúl Parada Medina (assuming it is the same person, which I have no doubt about) and Fernando Sevilla Martínez are both co-authors of a paper on autonomous driving:
https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-450543
View attachment 88429
In fact, they have co-published two papers on autonomous driving, together with another researcher: Jordi Casas-Roma. He is director of the Master in Data Science at the Barcelona-based private online university Universitat Oberta de Catalunya, the same department where Fernando Sevilla Martínez earned his Master’s degree in 2022. The following year, Sevilla Martínez moved to Wolfsburg, where he now works as a data scientist at the headquarters of the Volkswagen Group.
View attachment 88430
View attachment 88426
View attachment 88427
Speaking of Raúl Parada Medina:
On 13 February I took this screenshot, but never followed up on it:
View attachment 88435
View attachment 88434
It appears the planned WISSA workshop, along with some others, was eventually cancelled, as only three of the scheduled workshops actually took place in late June:
https://www.ie2025.fraunhofer.de/workshops/
View attachment 88436
Nevertheless, it is evidence that Raúl Parada Medina’s work is also relevant to the field of smart agriculture.
Prior to that, he was part of the 5GMED project that ran from September 2020 to August 2024 (sorry, don’t have the time right now to look up the individual links - all the following screenshots were taken in mid-February).
View attachment 88437
View attachment 88438
View attachment 88439
View attachment 88440
View attachment 88441
View attachment 88442
Keep in mind that Raúl Parada Medina has a telecommunications background and works as a Senior Researcher for CTTC in Castelldefels near Barcelona, the Centre Tecnològic de Telecomunicacions de Catalunya.
So when he describes himself as an “IoT research specialist within the connected car project at a Spanish automobile manufacturer”, the emphasis is on “connected” rather than on “automobile”. Therefore, any upcoming research projects may not involve cars at all.
Neuromorphic Solar Edge AI for Sustainable Wildfire Detection
Raúl Parada, Centre Tecnològic de Telecomunicacions de Catalunya (CTTC/CERCA), Castelldefels, 08860, Catalonia, Spain
Available online 29 December 2025, Article 101862
https://doi.org/10.1016/j.iot.2025.101862
Highlights
- Neuromorphic edge AI drones achieve 87% solar autonomy for wildfire detection
- BrainChip Akida enables 4,200 patrol hours/year, 3× longer than CPU-based systems
- Fleet scaling reduces detection latency from 18 hours (1 drone) to 2.2 hours (8 drones)
Abstract
This paper presents a feasibility study of a solar-autonomous wildfire detection system using neuromorphic edge AI on fixed-wing drones. Through a comprehensive year-long simulation over Parc del Garraf (Catalonia), we evaluate three edge computing platforms, Raspberry Pi 4, Google Coral TPU, and BrainChip Akida, integrated into solar-optimized eBee X drones.
Results show that the BrainChip Akida achieves 4,200 patrol hours per year, nearly three times that of traditional CPU systems, while maintaining 87% solar energy autonomy. The Google Coral TPU and Raspberry Pi 4 reach 66% and 52% autonomy, respectively. Fleet scaling analysis demonstrates that increasing drone count from one to eight reduces median wildfire detection time from 18 to 2.2 hours, surpassing critical response thresholds. Seasonal analysis reveals Akida-based systems can operate fully on solar energy during summer and most of spring and fall, minimizing grid dependency. These findings establish neuromorphic computing as a foundational technology for sustainable, perpetual environmental monitoring within the Internet of Robotic Things (IoRT).
Introduction
Wildfires are now responsible for billions of dollars in annual economic losses and hundreds of lives globally. In the Mediterranean basin, over 500,000 hectares are burned yearly, often with delayed detection times exceeding 12 hours. These delays critically hinder suppression efforts and exacerbate damage. As climate change intensifies droughts and heatwaves, the urgency for faster, autonomous detection systems grows [1]. Mediterranean ecosystems, in particular, face growing risk as traditional fire detection systems, relying on satellite imagery, human patrols, or fixed ground sensors, struggle with temporal latency, limited spatial coverage, and infrastructure dependency. Other proposals incorporate artificial intelligence (AI) and fifth generation (5G) for wildfire control [2], but such approaches still rely heavily on communication infrastructure.
Recent advances in unmanned aerial vehicles (UAVs) have opened new possibilities for continuous, real-time environmental monitoring. However, most current drone-based detection systems are hindered by significant energy limitations, requiring frequent manual recharging or access to grid-based infrastructure. This severely restricts their autonomy and scalability, especially in remote or high-risk natural environments where rapid response is critical.
The Internet of robotic things (IoRT) envisions fully autonomous, intelligent agents operating over long durations without human intervention. Achieving this vision in the context of wildfire monitoring demands breakthroughs in two main areas: (i) onboard decision-making through efficient Edge AI [3], and (ii) self-sustaining energy systems. While Edge AI reduces reliance on cloud connectivity and improves latency, traditional computing platforms such as CPUs or even typical TPUs consume too much power for extended operation. As such, they cannot meet the demands of uninterrupted surveillance missions without significant energy support.
Neuromorphic computing offers a compelling solution. By mimicking the efficiency of biological brains [4], neuromorphic processors such as BrainChip Akida enable ultra-low-power inference, operating at a fraction of the energy cost of conventional architectures. Coupled with solar energy harvesting, this opens the door to continuous aerial monitoring, without external energy input, for the first time to the best of our knowledge.
This work addresses the critical challenge of enabling truly autonomous environmental monitoring drones by combining neuromorphic Edge AI and solar energy systems. The core objectives of our study are:
1. To quantify and compare the energy sustainability potential of neuromorphic and traditional edge computing platforms when deployed on UAVs in realistic wildfire surveillance scenarios.
2. To simulate year-round operations using real solar irradiance and environmental data, modelling energy harvesting and consumption dynamics in detail.
3. To establish practical benchmarks for operational availability, solar autonomy, and wildfire detection performance across different hardware configurations.
The rest of this paper is structured as follows: Section 2 reviews related work in UAV-based wildfire detection and sustainable edge AI. Section 3 details the simulation framework, hardware platforms, energy modeling, and fleet scaling strategies. Section 4 presents and analyzes the simulation results, including mission time allocation, solar autonomy, seasonal variations, and cost-effectiveness. Section 5 discusses the broader implications of these results for sustainable IoRT systems. Finally, Section 6 concludes the paper and outlines future research directions.
Section snippets
Related Work
Recent years have witnessed significant progress in UAV-based wildfire detection, AI-enabled edge processing, and neuromorphic computing for environmental monitoring. Here, we provide a structured comparison of leading approaches and clarify the main research gaps that remain. …
Methodology
This section presents the complete simulation methodology designed to evaluate the energy autonomy, detection capacity, and mission viability of solar-powered drones equipped with different edge AI platforms, including neuromorphic computing. The simulation is implemented using Python and is verifiable via the accompanying codebase. It models the performance of drones operating autonomously over a full year, incorporating solar harvesting dynamics, energy consumption profiles, patrol logic, and…
Results and Analysis
This section presents the key simulation outcomes comparing the Raspberry Pi 4, Google Coral TPU, and BrainChip Akida. The results show that the Akida platform achieves 87% solar autonomy and 4,200 patrol hours per year—three times that of CPU-based systems. Analysis includes patrol time distribution, detection efficiency, seasonal performance, and cost-effectiveness. A fleet scaling study shows exponential reductions in wildfire detection latency with increasing drone count, achieving optimal…
Discussion and Implications
This section reflects on the broader implications of the simulation results, emphasizing the technological, environmental, and operational significance of solar-autonomous UAV systems. By analyzing how neuromorphic computing shifts the boundaries of edge AI deployment, we explore how energy efficiency, fleet scalability, and seasonal synergy contribute to a new paradigm in sustainable wildfire monitoring. Key findings are contextualized within the Internet of Robotic Things (IoRT) framework,…
Conclusions
This research demonstrates the first viable solar-autonomous wildfire detection system using neuromorphic edge AI, achieving 87% energy self-sufficiency through breakthrough power efficiency improvements. The comprehensive year-long simulation reveals that hardware architecture selection—specifically neuromorphic versus traditional computing—enables qualitatively different operational capabilities in autonomous systems. Key findings include a neuromorphic computing system enabling practical…
CRediT authorship contribution statement
Raúl Parada: Writing – review & editing, Writing – original draft, Visualization, Validation, Supervision, Software, Resources, Project administration, Methodology, Investigation, Formal analysis, Data curation, Conceptualization. …
Declaration of competing interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper. …
References (84)
- F.S. Martínez, Engineering Applications of Artificial Intelligence (2024)
- R. Bailon-Ruiz, Robotics and Autonomous Systems (2022)
- M.N.A. Ramadan, Internet of Things (2024)
- C. Carrillo, Journal of Computational Science (2025)
- Q. Su, Neural Networks (2024)
- S. Dubey, Energy Procedia (2013)
- S. Chander, Energy Reports (2015)
- S.A.M. Said, Applied Energy (1990)
- S.C.S. Costa, Renewable and Sustainable Energy Reviews (2016)
- M. Morey, Renewable Energy Focus (2023)