BRN Discussion Ongoing

Frangipani

Top 20
Nurjana Technologies uploaded a video earlier today that highlights several of their product offerings, including one called NEBULA.

In a publication for the Italian Pavilion at the International Astronautical Congress (IAC) 2025 in Sydney (29 September to 3 October 2025), which represented 20 of Italy’s most innovative companies in the space sector, NEBULA is described as
“an edge AI solution using neuromorphic processors and advanced optical sensors, such as event-based cameras, to track small, fast and low-visibility space objects with exceptional efficiency, even in harsh conditions”.



View attachment 93018



View attachment 93019 View attachment 93020


A short new video by Pietro Andronico, CEO & Founder of Nurjana Technologies, reveals what NEBULA stands for: Nurjana Event Based Unit for Light Acquisition.

Check out the video on LinkedIn - below are some still pictures.

Sounds as if Nurjana’s Spacetech division has ambitious plans that - should their dreams come true - will take them to the Moon and beyond.

And possibly Akida, too?


[Attached: still images from the video]


Presumably not yet on 6 December, though… 😉

[Attached: screenshot]


I assume it was NEBULA that Cecilia Pisano had been working on while still with Nurjana Technologies:






View attachment 93022



The video also mentions another technology they developed called NAIS that involves a swarm of drones:


View attachment 93021

Bingo!

Although the initial focus of the NEBULA project - as described in the following September 2024 presentation - rather appears to have been Space Situational Awareness (SSA):👇🏻


NT Nebula Spoke3 - KO Presentation 23-09-2024.pptx

[Attached: presentation slides]
 
  • Like
  • Love
  • Fire
Reactions: 20 users

Frangipani

Top 20
Speaking of Cecilia Pisano, who left Nurjana Technologies in July.

She has just started a new job in Luxembourg, as a Trustworthy AI Scientist at LIST, the Luxembourg Institute of Science and Technology:


[Attached: screenshots]



But she’s not the only one there with hands-on Akida experience!

Flor Ortiz, who last headed the TelecomAI lab at Uni Luxembourg’s SnT*, started to work for LIST as a Senior Research & Technology Scientist in the Distributed & Intelligent Connectivity (DISCO) Research Group back in September:
*Interdisciplinary Centre for Security, Reliability and Trust


[Attached: screenshots]




Her former colleague Geoffrey Eappen, who also worked with Akida while conducting research at Uni Luxembourg’s SnT, switched employers within Luxembourg more than a year ago and has since been working for OQ Technology as a Telecommunication Engineer.



[Attached: screenshots]


All three researchers had very positive things to say about Akida - we should definitely keep an eye on them…
 
  • Like
  • Love
  • Fire
Reactions: 22 users

SERA2g

Founding Member
Speaking of Cecilia Pisano, who left Nurjana Technologies in July.

She has just started a new job in Luxembourg, as a Trustworthy AI Scientist at LIST, the Luxembourg Institute of Science and Technology:


View attachment 93581


View attachment 93582



View attachment 93583


But she’s not the only one there with hands-on Akida experience!

Flor Ortiz, who last headed the TelecomAI lab at Uni Luxembourg’s SnT*, started to work for LIST as a Senior Research & Technology Scientist in the Distributed & Intelligent Connectivity (DISCO) Research Group back in September:
*Interdisciplinary Centre for Security, Reliability and Trust


View attachment 93584
View attachment 93585



Her former colleague Geoffrey Eappen, who also worked with Akida while conducting research at Uni Luxembourg’s SnT, switched employers within Luxembourg more than a year ago and has since been working for OQ Technology as a Telecommunication Engineer.



View attachment 93586

View attachment 93587



View attachment 93588

View attachment 93589

All three researchers had very positive things to say about Akida - we should definitely keep an eye on them…
Hi Frang

This post and your previous one re Nurjana are excellent finds :)

Cheers
 
  • Like
  • Love
  • Fire
Reactions: 23 users

Esq.111

Fascinatingly Intuitive.
  • Haha
  • Like
Reactions: 10 users

Andy38

The hope of potential generational wealth is real
  • Like
  • Love
Reactions: 12 users

TECH

Regular
Oversubscription, well I was quietly surprised, maybe others were as well.
0.17 was attainable, 0.165 if you happened to be heading the pack in front of 5 million other hopefuls, though not many went through at that price.

So how are we positioned moving into 2026? Very, very nicely in my opinion... a clear roadmap, chip developments, happy partners, progress within the company. The landscape is so much more advanced than a few years back. Do we now have a real launch pad? I believe that we do - what do you lot think?

Is 2026 our year? Or are we still dreaming? What's neuromorphic, what's Edge AI, what's an Event-Based Processor, what does SNN mean, what's, what's, what's... We have advanced, and so have the companies who matter.

Stay focused, Akida will deliver in my opinion
bots GIF
 
  • Like
  • Love
  • Fire
Reactions: 22 users

Frangipani

Top 20

Gregor Lenz, until recently CTO of our partner Neurobus (https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-456183) and co-author of Low-power Ship Detection in Satellite Images Using Neuromorphic Hardware alongside Douglas McLelland (https://arxiv.org/pdf/2406.11319), has joined the London-based startup Paddington Robotics (https://paddington-robotics.com/ - the website doesn’t yet have any information other than “Paddington Robotics - Embodied AI in Action”):

View attachment 81384



View attachment 81386



Some further info I was able to find about the London-based startup founded late last year, whose co-founder and CEO is Zehan Wang:

View attachment 81419


https://www.northdata.de/Paddington%20Robotics%20Ltd·,%20London/Companies%20House%2016015385

View attachment 81420 View attachment 81421 View attachment 81422 View attachment 81423

While Paddington Robotics aka P9R7 continue to be rather secretive about what they do on their minimalistic website (https://paddington-robotics.com), the London-based startup somewhat opened up about their work (initially cobots in supermarkets/grocery stores) on LinkedIn in recent weeks, where they also just introduced a few of their employees (according to the company profile, their team currently still consists of 10 people max).

From a BRN shareholder’s perspective, their most interesting team member is of course Gregor Lenz (see my tagged post above), who joined Paddington Robotics in April. With his PhD in Neuromorphic Computing, his SynSense background and hands-on experience with both Loihi and Akida, plus his deep tech-startup experience as Co-Founder and former CTO of Neurobus, he is the perfect guy to promote neuromorphic computing within his company in his role as P9R7’s perception stack lead.

“We are building robots which help people do more, not replace them.”
“We’re building for real world impact, not tech for technology’s sake.”
Zehan Wang, Founder and CEO of Paddington Robotics (P9R7)



[Attached: screenshots]
 
  • Like
  • Love
  • Fire
Reactions: 13 users
Maybe one step closer though.

Just up on GitHub.

Suggest readers absorb the whole post to understand the intent of this repository.

Especially terms such as federated learning, scalable, V2X, MQTT, prototype level and distributed.

From what I can find, if correct, the author is as below. That doesn't necessarily mean VW is involved, but I suspect they would be aware of this work in some division/department.



Fernando Sevilla Martínez

SevillaFe/SNN_Akida_RPI5 (public)

Eco-Efficient Deployment of Spiking Neural Networks on Low-Cost Edge Hardware


SNN_Akida_RPI5​

Eco-Efficient Deployment of Spiking Neural Networks on Low-Cost Edge Hardware

This work presents a practical and energy-aware framework for deploying Spiking Neural Networks on low-cost hardware for edge computing. We detail a reproducible pipeline that integrates neuromorphic processing with secure remote access and distributed intelligence. Using Raspberry Pi and the BrainChip Akida PCIe accelerator, we demonstrate a lightweight deployment process including model training, quantization, and conversion. Our experiments validate the eco-efficiency and networking potential of neuromorphic AI systems, providing key insights for sustainable distributed intelligence. This letter offers a blueprint for scalable and secure neuromorphic deployments across edge networks.

1. Hardware and Software Setup​

The proposed deployment platform integrates two key hardware components: the RPI5 and the Akida board. Together, they enable a power-efficient, cost-effective neuromorphic system (N-S) suitable for real-world edge AI applications.

2. Enabling Secure Remote Access and Distributed Neuromorphic Edge Networks​

The deployment of low-power neuromorphic hardware (N-H) in networked environments requires reliable, secure, and lightweight communication frameworks. Our system enables full remote operability of the RPI5 and Akida board via SSH, complemented by protocol layers (Message Queuing Telemetry Transport (MQTT), WebSockets, Vehicle-to-Everything (V2X)) that support real-time, event-driven intelligence across edge networks.

3. Training and Running Spiking Neural Networks​

The training pipeline begins with building an ANN using TensorFlow 2.x, which will later be mapped to a spike-compatible format for neuromorphic inference. Because the Akida board runs models using low-bitwidth integer arithmetic (4–8 bits), it is critical to align the training phase with these constraints to avoid significant post-training performance degradation.
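The practical effect of that low-bitwidth constraint can be previewed without any Akida hardware. The sketch below is a generic, hypothetical illustration of uniform symmetric 4-bit weight quantization in NumPy - it is not the repository's code, and the real pipeline uses BrainChip's own conversion tooling:

```python
import numpy as np

def quantize_uniform(w, bits=4):
    """Uniform symmetric quantization of a weight tensor to `bits`-bit signed
    integers, returned in de-quantized (float) form so the error is inspectable."""
    qmax = 2 ** (bits - 1) - 1          # 7 for 4-bit signed weights
    scale = float(np.max(np.abs(w))) / qmax
    if scale == 0.0:
        scale = 1.0                     # all-zero tensor: any scale works
    q = np.clip(np.round(w / scale), -qmax - 1, qmax)
    return q * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(64, 64)).astype(np.float32)   # stand-in for one ANN layer
w4 = quantize_uniform(w, bits=4)                   # at most 16 distinct values
print(f"4-bit quantization MSE: {np.mean((w - w4) ** 2):.5f}")
```

Training with these constraints in mind (e.g. quantization-aware training) is what keeps this rounding error from compounding into the "significant post-training performance degradation" the README warns about.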

4. Use case validation: Networked neuromorphic AI for distributed intelligence​

4.1 Use Case: If multiple RPI5 nodes or remote clients need to receive the classification results in real-time, MQTT can be used to broadcast inference outputs​

MQTT-Based Akida Inference Broadcasting​

This project demonstrates how to perform real-time classification broadcasting using BrainChip Akida on Raspberry Pi 5 with MQTT.

Project Structure​

mqtt-akida-inference/
├── config/ # MQTT broker and topic configuration
├── scripts/ # MQTT publisher/subscriber scripts
├── sample_data/ # Sample input data for inference
├── requirements.txt # Required Python packages


Usage​

  1. Install Mosquitto on RPI5
sudo apt update
sudo apt install mosquitto mosquitto-clients -y
sudo systemctl enable mosquitto
sudo systemctl start mosquitto

  2. Run Publisher (on RPI5)
python3 scripts/mqtt_publisher.py

  3. Run Subscriber (on remote device)
python3 scripts/mqtt_subscriber.py

  4. Optional: Monitor from CLI
mosquitto_sub -h <BROKER_IP> -t "akida/inference" -v

Akida Compatibility

outputs = model_akida.predict(sample_image)


Real-Time Edge AI

This use case supports event-based edge AI and real-time feedback in smart environments, such as surveillance, mobility, and robotics.

Configuration

Set your broker IP and topic in config/config.py
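The publisher script itself isn't reproduced in the post, but the broadcasting step can be sketched in a few lines. This assumes the paho-mqtt package and reuses the akida/inference topic from the CLI example above; the payload fields and function names are illustrative, not the repository's actual code:

```python
import json
import time

try:
    import paho.mqtt.client as mqtt     # pip install paho-mqtt
except ImportError:                     # payload logic stays testable without a broker
    mqtt = None

BROKER_IP = "127.0.0.1"                 # set from config/config.py in the real project
TOPIC = "akida/inference"

def make_payload(label, confidence):
    """One classification result as the JSON message subscribers receive."""
    return json.dumps({"label": label, "confidence": confidence, "ts": time.time()})

def broadcast(label, confidence):
    """Publish a single inference result to every subscriber on TOPIC."""
    client = mqtt.Client()
    client.connect(BROKER_IP, 1883)
    client.publish(TOPIC, make_payload(label, confidence), qos=1)
    client.disconnect()

# Example (requires a running Mosquitto broker, see step 1 above):
# broadcast("pedestrian", 0.97)
```

In the real pipeline the label and confidence would come from the model_akida.predict(...) call shown under Akida Compatibility.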

4.2 Use Case: If the Akida accelerator is deployed in an autonomous driving system, V2X communication allows other vehicles or infrastructure to receive AI alerts based on neuromorphic-based vision​

This use case simulates a lightweight V2X (Vehicle-to-Everything) communication system using Python. It demonstrates how neuromorphic AI event results, such as pedestrian detection, can be broadcast over a network and received by nearby infrastructure or vehicles.

Folder Structure​

V2X/
├── config.py # V2X settings
├── v2x_transmitter.py # Simulated Akida alert broadcaster
├── v2x_receiver.py # Listens for incoming V2X alerts
└── README.md


Use Case​

If the Akida accelerator is deployed in an autonomous driving system, this setup allows:

  • Broadcasting high-confidence AI alerts (e.g., "pedestrian detected")
  • Receiving alerts on nearby systems for real-time awareness

Usage​

1. Start the V2X Receiver (on vehicle or infrastructure node)​

python3 receiver/v2x_receiver.py

2. Run the Alert Transmitter (on an RPI5 + Akida node)​

python3 transmitter/v2x_transmitter.py

Notes​

  • Ensure that devices are on the same LAN or wireless network
  • UDP broadcast mode is used for simplicity
  • This is a prototype for real-time event-based message sharing between intelligent nodes
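Under those assumptions (plain UDP broadcast on a shared LAN), the alert path can be sketched as below. The message fields and port number are illustrative, not the repository's actual format:

```python
import json
import socket

V2X_PORT = 5005                          # illustrative; the repo sets this in config.py

def encode_alert(event, confidence):
    """Pack a neuromorphic detection event into a broadcastable datagram."""
    return json.dumps({"src": "rpi5-akida", "event": event,
                       "confidence": confidence}).encode("utf-8")

def decode_alert(datagram):
    """Inverse of encode_alert, as run on the receiving vehicle or roadside node."""
    return json.loads(datagram.decode("utf-8"))

def broadcast_alert(datagram, port=V2X_PORT):
    """Send one alert to every listener on the local network segment."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(datagram, ("255.255.255.255", port))

# Example (requires a v2x_receiver listening on the same LAN):
# broadcast_alert(encode_alert("pedestrian detected", 0.98))
```

SO_BROADCAST is what makes the single sendto reach every node on the segment, which matches the "UDP broadcast mode is used for simplicity" note above.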

4.3 Use Case: If multiple RPI5-Akida nodes are deployed for federated learning, updates to neuromorphic models must be synchronized between devices​

Federated Learning Setup with Akida on Raspberry Pi 5​

This repository demonstrates a lightweight Federated Learning (FL) setup using neuromorphic AI models deployed on BrainChip Akida PCIe accelerators paired with Raspberry Pi 5 devices. It provides scripts for a centralized Flask server to receive model weight updates and a client script to upload Akida model weights via HTTP.

Overview​

Neuromorphic models trained on individual RPI5-Akida nodes can contribute updates to a shared model hosted on a central server. This setup simulates a federated learning architecture for edge AI applications that require privacy, low latency, and energy efficiency.

Repository Structure​

federated_learning/
├── federated_learning_server.py # Flask server to receive model weights
├── federated_learning_client.py # Client script to upload Akida model weights
├── model_utils.py # (Optional) Placeholder for weight handling utilities
├── model_training.py # (Optional) Placeholder for training-related code
└── README.md


Requirements​

  • Python 3.7+
  • Flask
  • NumPy
  • Requests
  • Akida Python SDK (required on client device)
Install the dependencies using:

pip install flask numpy requests

Getting Started​

1. Launch the Federated Learning Server​

On a device intended to act as the central server:

python3 federated_learning_server.py

The server will listen for HTTP POST requests on port 5000 and respond to updates sent to the /upload endpoint.
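The server script itself isn't shown in the post. A minimal sketch of such an endpoint - assuming Flask, as listed in the requirements, with illustrative route-handler and helper names - could look like this:

```python
import json

try:
    from flask import Flask, request, jsonify   # pip install flask
except ImportError:                             # validation logic stays testable without it
    Flask = None

def validate_weights(payload):
    """Basic format check before accepting a client update (see Security Considerations)."""
    try:
        weights = json.loads(payload)
    except (TypeError, ValueError):
        return None
    if not isinstance(weights, list) or not weights:
        return None
    return weights

if Flask is not None:
    app = Flask(__name__)
    received_updates = []                       # per-client updates awaiting aggregation

    @app.route("/upload", methods=["POST"])
    def upload():
        weights = validate_weights(request.get_data(as_text=True))
        if weights is None:
            return jsonify(error="malformed weights"), 400
        received_updates.append(weights)
        return "Model weights uploaded successfully.", 200

    # app.run(host="0.0.0.0", port=5000)        # uncomment on the server node
```

Rejecting malformed bodies at the endpoint is the first of the hardening steps listed under Security Considerations below; HTTPS and client authentication would wrap around this same handler.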

2. Configure and Run the Client​

On each RPI5-Akida node:

  • Ensure the Akida model has been trained.
  • Replace the SERVER_IP variable inside federated_learning_client.py with the IP address of the server.
  • Run the script:
python3 federated_learning_client.py

This will extract the weights from the Akida model and transmit them to the server in JSON format.

Example Response​

After a successful POST:

Model weights uploaded successfully.


If an error occurs (e.g., connection refused or malformed weights), you will see an appropriate status message.

Security Considerations​

This is a prototype-level setup for research. For real-world deployment:

  • Use HTTPS instead of HTTP.
  • Authenticate clients using tokens or API keys.
  • Validate the format and shape of model weights before acceptance.
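The post doesn't show how the server combines the uploaded weights. A common choice - sketched here as an assumption, not the repository's actual logic - is plain federated averaging of the per-layer arrays, alongside the JSON serialization the client sends:

```python
import json

import numpy as np

def weights_to_json(weights):
    """Serialize per-layer weight arrays into the JSON body posted to /upload."""
    return json.dumps([np.asarray(w).tolist() for w in weights])

def average_updates(updates):
    """FedAvg-style element-wise mean across clients, layer by layer."""
    return [np.mean([np.asarray(w) for w in layer], axis=0)
            for layer in zip(*updates)]   # zip groups layer i from every client

# The client upload in step 2 above is then roughly:
# requests.post(f"http://{SERVER_IP}:5000/upload",
#               data=weights_to_json(model_weights),
#               headers={"Content-Type": "application/json"})
```

Averaging raw weights keeps each node's training data local, which is the privacy property the Overview section claims for this setup.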

Acknowledgements​

This implementation is part of a broader effort to demonstrate low-cost, energy-efficient neuromorphic AI for distributed and networked edge environments, particularly leveraging the BrainChip Akida PCIe board and Raspberry Pi 5 hardware.
It appears Fernando Sevilla Martínez (as per the previous post; has links to VW) has just updated a GitHub repository again a few hours ago for their current paper.

Snipped the overview and key findings etc below but link worth a read through.

Some real positives in here imo.


SevillaFe/EcoEdgeAI-akida-mac (public)

A comprehensive workflow for comparing energy efficiency between conventional hardware (Mac M-series GPU/CPU) and neuromorphic hardware (Akida on Raspberry Pi 5) for autonomous driving steering angle prediction.


EcoEdgeAI: Neuromorphic Computing for Sustainable Autonomous Driving​

License: MIT · Python 3.10+ · DOI

Comparative Eco-Efficiency Evaluation of CNN Deployment on Conventional vs. Neuromorphic Hardware for Autonomous Driving

This repository contains the complete experimental workflow, trained models, and analysis scripts for our paper:

"Sustainable Neuromorphic Edge Intelligence for Autonomous Driving: A Comparative Eco-Efficiency Evaluation"
Authors: F. Sevilla Martínez, Jordi Casas-Roma, Laia Subirats, Raúl Parada
Conference/Journal: In Review · Year: 2025

Overview​

This work provides the first comprehensive hardware-measured evaluation of energy efficiency, accuracy trade-offs, and eco-efficiency scaling for CNN-based steering angle prediction deployed on:

  • Conventional Hardware: MacBook Pro M1 (Apple Silicon)
  • Neuromorphic Hardware: Raspberry Pi 5 + BrainChip Akida v1.0 NPU

Research Questions​

  1. Energy Efficiency: How much energy do NPUs save compared to conventional hardware?
  2. Accuracy Trade-offs: What is the accuracy cost of 4-bit quantization on neuromorphic hardware?
  3. Eco-Efficiency Scaling: Which CNN architectures achieve optimal energy-accuracy balance?

Key Contributions​

✅ Hardware-Measured Energy: Direct measurement via TC66 USB power meter (neuromorphic) and CodeCarbon (conventional)
✅ Three CNN Architectures: PilotNet (1.54M params), LaksNet (768K params), MiniNet (245K params)
✅ Complete Quantization Pipeline: Float32 → PTQ 4-bit → QAT refinement → Akida deployment
✅ Reproducible Workflow: End-to-end scripts from training to statistical analysis
✅ Energy-Error Rate (EER) Metric: Unified eco-efficiency evaluation framework


🔬 Key Findings​

Energy Efficiency (RQ1)​

  • 7.15× to 13.17× energy reduction (615-1,217% savings) on neuromorphic hardware
  • Lighter architectures achieve superior gains: MiniNet 13.17×, PilotNet 7.15×
  • Energy advantage driven 60-70% by throughput improvement (3.4×-7.3× faster)
  • 0.73 W sustained power for MiniNet vs. 3.86 W on Mac M1

Accuracy Trade-offs (RQ2)​

  • Inverse-U quantization pattern: Medium architectures suffer most (LaksNet +113% MSE)
  • Lighter architectures show best tolerance: MiniNet +51% MSE, PilotNet +93% MSE
  • QAT provides zero accuracy benefit across all architectures (4-bit capacity ceiling)
  • Practical impact: 3.4°-7.2° additional steering error

Eco-Efficiency Scaling (RQ3)​

  • 8.5× eco-efficiency improvement for optimal configuration (MiniNet)
  • Super-linear scaling with architectural simplification (2.0×-8.5× EER)
  • 11.7× carbon footprint reduction (12.9 μg → 1.1 μg CO₂ per 1,000 inferences)
  • 233 Hz control loop capability (4.30 ms latency) enables real-time operation
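The headline ratios can be cross-checked against the raw numbers quoted in the same lists:

```python
# Quick arithmetic check of the quoted eco-efficiency figures.
carbon_reduction = 12.9 / 1.1       # µg CO2 per 1,000 inferences, conventional vs Akida
control_rate_hz = 1.0 / 0.00430     # control-loop rate implied by the 4.30 ms latency
power_ratio = 3.86 / 0.73           # Mac M1 vs MiniNet sustained power

print(f"{carbon_reduction:.1f}x carbon, {control_rate_hz:.0f} Hz, {power_ratio:.2f}x power")
# → 11.7x carbon, 233 Hz, 5.29x power
```

The ~5.3× power ratio is lower than the 13.17× energy figure because energy per inference also scales with the 3.4×-7.3× throughput gain - consistent with the claim above that 60-70% of the advantage comes from throughput.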

Project Status​

  • ✅ Training Pipeline: Complete
  • ✅ Quantization & Conversion: Complete
  • ✅ Benchmarking: Complete
  • ✅ Statistical Analysis: Complete
  • ✅ Paper Submission: In Review
  • ✅ Documentation: In Progress
  • ✅ Pre-trained Models: Complete
Last Updated: December 2025
 
  • Like
  • Fire
  • Love
Reactions: 17 users
Oversubscription, well I was quietly surprised, maybe others were as well.
0.17 was attainable, 0.165 if you happened to be heading the pack in front of 5 million other hopefuls, though not many went through at that price.

So how are we positioned moving into 2026? Very, very nicely in my opinion... a clear roadmap, chip developments, happy partners, progress within the company. The landscape is so much more advanced than a few years back. Do we now have a real launch pad? I believe that we do - what do you lot think?

Is 2026 our year? Or are we still dreaming? What's neuromorphic, what's Edge AI, what's an Event-Based Processor, what does SNN mean, what's, what's, what's... We have advanced, and so have the companies who matter.

Stay focused, Akida will deliver in my opinion
bots GIF
Clearly 2026 is BrainChip's year, with Onsor and many others bringing Akida to market.
We're going to need a bigger bank 🏦
 
  • Like
  • Love
Reactions: 10 users

7für7

Top 20
Clearly 2026 is BrainChip's year, with Onsor and many others bringing Akida to market.
We're going to need a bigger bank 🏦
Exactly 👍 and if not… I would guess 2027
Wohooooo
 
  • Like
Reactions: 2 users

manny100

Top 20
While Paddington Robotics aka P9R7 continue to be rather secretive about what they do on their minimalistic website (https://paddington-robotics.com), the London-based startup somewhat opened up about their work (initially cobots in supermarkets/grocery stores) on LinkedIn in recent weeks, where they also just introduced a few of their employees (according to the company profile, their team currently still consists of 10 people max).

From a BRN shareholder’s perspective, their most interesting team member is of course Gregor Lenz (see my tagged post above), who joined Paddington Robotics in April. With his PhD in Neuromorphic Computing, his SynSense background and hands-on experience with both Loihi and Akida, plus his deep tech-startup experience as Co-Founder and former CTO of Neurobus, he is the perfect guy to promote neuromorphic computing within his company in his role as P9R7’s perception stack lead.

“We are building robots which help people do more, not replace them.”
“We’re building for real world impact, not tech for technology’s sake.”
Zehan Wang, Founder and CEO of Paddington Robotics (P9R7)



View attachment 93596



View attachment 93601



View attachment 93597



View attachment 93598



View attachment 93599 View attachment 93600
I guess if you can afford to buy a robot for house duties in, say, 2030/2 plus, and if it can clean, wash clothes, do dishes, maybe cook meals, watch the kids and help them with homework, plus several other tasks, it could have a reasonable payback period.
Like TVs, once they sell in volume every home will have one or two.
 
  • Like
Reactions: 1 user

7für7

Top 20
I guess if you can afford to buy a robot for house duties in, say, 2030/2 plus, and if it can clean, wash clothes, do dishes, maybe cook meals, watch the kids and help them with homework, plus several other tasks, it could have a reasonable payback period.
Like TVs, once they sell in volume every home will have one or two.
Robots will have a huge impact on our lives—whether in households, industry, or medicine—equipped with sensors that can carry out a range of measurements within seconds to deliver initial diagnoses. But I’m still skeptical about them being used for unsupervised childcare, and I also doubt they’ll replace jobs on a massive scale. Jobs mean income… and that income generates taxes and social security contributions (at least in Germany). If all of that disappears, the system collapses—unless we somehow start paying robots. 🫤

So I see robots and AI more as support rather than a replacement for human labor, etc.

Just my opinion.
 
  • Like
  • Thinking
Reactions: 5 users
Clearly 2026 is BrainChip's year, with Onsor and many others bringing Akida to market.
We're going to need a bigger bank 🏦
Hopefully bigger than our current 😂

[Attached: GIF]
 
  • Haha
  • Like
Reactions: 3 users

manny100

Top 20
Robots will have a huge impact on our lives—whether in households, industry, or medicine—equipped with sensors that can carry out a range of measurements within seconds to deliver initial diagnoses. But I’m still skeptical about them being used for unsupervised childcare, and I also doubt they’ll replace jobs on a massive scale. Jobs mean income… and that income generates taxes and social security contributions (at least in Germany). If all of that disappears, the system collapses—unless we somehow start paying robots. 🫤

So I see robots and AI more as support rather than a replacement for human labor, etc.

Just my opinion.
I think robotics will replace jobs no one really wants to do because it's hard work.
The problem, especially for young people: no work = boredom = trouble.
So maybe reduced hours?
Everyone thought when computers came in that mass jobs would be lost - but that never happened.
It will all sort itself out.
I think robotics with a 'hotline' to mum and dad would work with kids, because robots run by rules and, even though they learn and adapt, would never let kids get away with anything. Plus you do not have the worry of leaving your kids with 'humans' unsupervised. I mean the humans unsupervised.
 
  • Like
Reactions: 4 users

Diogenese

Top 20
I think robotics will replace jobs no one really wants to do because it's hard work.
The problem, especially for young people: no work = boredom = trouble.
So maybe reduced hours?
Everyone thought when computers came in that mass jobs would be lost - but that never happened.
It will all sort itself out.
I think robotics with a 'hotline' to mum and dad would work with kids, because robots run by rules and, even though they learn and adapt, would never let kids get away with anything. Plus you do not have the worry of leaving your kids with 'humans' unsupervised. I mean the humans unsupervised.
Never letting kids get away with anything sounds depressing.
 
  • Love
  • Like
Reactions: 3 users
Oversubscription, well I was quietly surprised, maybe others were as well.
0.17 was attainable, 0.165 if you happened to be heading the pack in front of 5 million other hopefuls, though not many went through at that price.

So how are we positioned moving into 2026? Very, very nicely in my opinion... a clear roadmap, chip developments, happy partners, progress within the company. The landscape is so much more advanced than a few years back. Do we now have a real launch pad? I believe that we do - what do you lot think?

Is 2026 our year? Or are we still dreaming? What's neuromorphic, what's Edge AI, what's an Event-Based Processor, what does SNN mean, what's, what's, what's... We have advanced, and so have the companies who matter.

Stay focused, Akida will deliver in my opinion
bots GIF
Me personally, I feel it'll be a steady race. But those glasses that predict a seizure coming on... will BrainChip be recognised for these game-breaking products? Hmmm, or is it all about the revenue? If so, industry will only know the truth, and hopefully they jump on board. A bit like NASA - could you imagine the news headlines around the world: a little-known company, BrainChip, is playing a major role in the space age going forward, plus in the military.
 
  • Like
Reactions: 1 user
Robots will have a huge impact on our lives—whether in households, industry, or medicine—equipped with sensors that can carry out a range of measurements within seconds to deliver initial diagnoses. But I’m still skeptical about them being used for unsupervised childcare, and I also doubt they’ll replace jobs on a massive scale. Jobs mean income… and that income generates taxes and social security contributions (at least in Germany). If all of that disappears, the system collapses—unless we somehow start paying robots. 🫤

So I see robots and AI more as support rather than a replacement for human labor, etc.

Just my opinion.
The only problem with that logic is that one company will introduce something and then their competitors will introduce something similar to compete. Greed is another motivator, because a Billionaire will want to be a Trillionaire and then a Squillionaire. It will be someone else's problem to fix. Too late by then. Bit like nuclear weapons. Surely none would want them. One has them, they all want them.

SC
 
I think robotics will replace jobs no one really wants to do because its hard work.
The problem is especially for young people no work = boredom = trouble.
So maybe reduced hours?
Everyone thought when computers came in that mass jobs would be lost - but that never happened.
It will all sort itself out.
I think robotics with a 'hotline' to mum and dad would work with kids because robots run by rules and even though they learn and adapt would never let kids get away with anything plus you do not have the worry of leaving your kids with 'humans' unsupervised. I mean the humans unsupervised.
There was the case a few years ago where a robot at Tesla allegedly had a human worker by the throat.

SC
 