BRN Discussion Ongoing

Yes sure.. would be nice ..
All I want to say is that lately there have been a lot of inaccurate reports about BrainChip in papers like these. You shouldn’t take anything at face value without doing your own research or getting a verified “yes” from the company. I’d personally be happy if Sony had invested in BrainChip and was trying to build something with our products…but I can’t find anything credible, and I’m not a lemming who blindly follows others and takes everything as fact... If BrainChip verifies it in the near future, great.. until then I’m marking it down as a misinterpretation or a mix-up.
Fair enough. Needs verification

SC
 
  • Like
Reactions: 3 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I'll see your 7 billion humanoid robots and raise you 21 billion cybersecurity chips.


21 billion cyberchips?

OK, then I'll raise you!

I underestimated my robotic chip numbers.

According to Elon Musk, we're talking a trillion-dollar robot army!!!!!!!!

  • 1 humanoid robot per unit (obviously)
  • Assume average chip ASP (average selling price) = $5–$10 (licensing/IP model, not silicon)
  • $1 trillion / $10 per chip = 100 billion chips at the low end (if one chip per robot)
  • Each robot might require 5 Akida chips (for multi-modal processing: vision, audio, sensor fusion, motor control)
  • So realistically 5 chips per robot x 7 billion robots = 35 billion chips
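For anyone who wants to poke at the joke maths, the estimate above can be sketched in a few lines (all inputs are the post's own tongue-in-cheek assumptions, not forecasts):

```python
# Back-of-envelope chip-count estimate from the post's (tongue-in-cheek) assumptions.
market_usd = 1_000_000_000_000   # Musk's "trillion-dollar robot army"
asp_low, asp_high = 5, 10        # assumed average selling price per chip (USD)

# Low-end bound: one chip per robot at the high ASP.
chips_low_end = market_usd // asp_high          # 100 billion chips

# "Realistic" scenario: 5 chips per robot across 7 billion robots.
chips_per_robot = 5
robots = 7_000_000_000
chips_realistic = chips_per_robot * robots      # 35 billion chips

print(f"{chips_low_end:,} chips (one per robot at ${asp_high})")
print(f"{chips_realistic:,} chips ({chips_per_robot} per robot x 7B robots)")
```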

He-he-he! 😝


Screenshot 2025-10-31 at 3.29.00 pm.png




tenor.gif
 
Last edited:
  • Haha
  • Fire
  • Like
Reactions: 12 users

Diogenese

Top 20
  • Haha
  • Wow
  • Fire
Reactions: 8 users

Tothemoon24

Top 20
Apologies if already posted; nice to see this from Tata.
IMG_1689.jpeg

Abstract

Car accidents due to driver drowsiness are a very serious worldwide problem. At present there are a few AI-based drowsiness detection and alert systems, but they are not suitable for cars due to their large memory, power, and latency requirements and their dependency on the cloud. Mammalian-brain-inspired spiking neural networks, coupled with the neuromorphic computing paradigm, can bring a very low-power, low-latency solution to the same. In this work, we have designed one such system and tested it on BrainChip Akida neuromorphic hardware. We found that the system is highly accurate (92.01%) in identifying the drowsiness features of a driver. Moreover, the latency and energy consumption of the system are found to be 46.5 ms/frame and as low as 16.2 mJ/frame respectively on hardware, which is pretty promising for deployment, especially in battery-driven cars. In addition, we also showcase the on-chip learning capabilities of the system that let us learn new classes and subjects with minimal data and personalise our drowsiness detection system for a specific driver.
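A quick derived figure, not stated in the abstract itself: dividing the quoted energy per frame by the quoted latency per frame gives the implied average power draw while processing.

```python
# Implied average power from the abstract's per-frame figures.
energy_mj_per_frame = 16.2    # mJ/frame, from the abstract
latency_ms_per_frame = 46.5   # ms/frame, from the abstract

# Average power = energy / time; mJ divided by ms conveniently equals watts.
avg_power_w = energy_mj_per_frame / latency_ms_per_frame
fps = 1000.0 / latency_ms_per_frame   # implied throughput if frames run back-to-back

print(f"~{avg_power_w * 1000:.0f} mW average draw, ~{fps:.1f} frames/s")
```

That works out to roughly a third of a watt while inferencing, which is the kind of budget that makes in-cabin, battery-friendly deployment plausible.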

 
  • Like
  • Fire
  • Love
Reactions: 21 users

Tothemoon24

Top 20

IMG_1690.jpeg

Abstract

While the exponential growth of the space sector and new operative concepts ask for higher spacecraft autonomy, the development of AI-assisted space systems was so far hindered by the low availability of power and energy typical of space applications. In this context, Spiking Neural Networks (SNN) are highly attractive because of their theoretically superior energy efficiency due to their inherently sparse activity induced by neurons communicating by means of binary spikes. Nevertheless, the ability of SNN to reach such efficiency on real world tasks is still to be demonstrated in practice. To evaluate the feasibility of utilizing SNN onboard spacecraft, this work presents a numerical analysis and comparison of different SNN techniques applied to scene classification for the EuroSAT dataset. Such tasks are of primary importance for space applications and constitute a valuable test case given the abundance of competitive methods available to establish a benchmark. Particular emphasis is placed on models based on temporal coding, where crucial information is encoded in the timing of neuron spikes. These models promise even greater efficiency of resulting networks, as they maximize the sparsity properties inherent in SNN. A reliable metric capable of comparing different architectures in a hardware-agnostic way is developed to establish a clear theoretical dependence between architecture parameters and the energy consumption that can be expected onboard the spacecraft. The potential of this novel method and its flexibility to describe specific hardware platforms is demonstrated by its application to predicting the energy consumption of a BrainChip Akida AKD1000 neuromorphic processor.
42064_2024_256_Fig1_HTML.jpg
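The paper's own hardware-agnostic metric isn't reproduced in the abstract, but the general idea behind such SNN energy models can be sketched with a common simplification (my assumption, not the authors' exact formula): energy scales with the number of spike-triggered synaptic operations, which is why sparser activity means lower consumption.

```python
# Simplified SNN energy estimate: energy ≈ (synaptic ops) × (energy per op).
# All numbers below are illustrative placeholders, not measured Akida values.
def snn_energy_joules(spikes_per_layer, fanout_per_layer, e_per_synop=1e-11):
    """Total energy given spike counts and average fan-out per layer."""
    synops = sum(s * f for s, f in zip(spikes_per_layer, fanout_per_layer))
    return synops * e_per_synop

# A sparser network (fewer spikes, same topology) costs proportionally less,
# which is the efficiency argument the abstract makes for temporal coding.
dense  = snn_energy_joules([50_000, 20_000], [128, 64])
sparse = snn_energy_joules([5_000, 2_000], [128, 64])
print(dense, sparse)
```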
 
  • Like
  • Fire
  • Love
Reactions: 17 users

Rskiff

Regular
21 billion cyberchips?

OK, then I'll raise you!

I underestimated my robotic chip numbers.

According to Elon Musk, we're talking a trillion-dollar robot army!!!!!!!!

  • 1 humanoid robot per unit (obviously)
  • Assume average chip ASP (average selling price) = $5–$10 (licensing/IP model, not silicon)
  • $1 trillion / $10 per chip = 100 billion chips at the low end (if one chip per robot)
  • Each robot might require 5 Akida chips (for multi-modal processing: vision, audio, sensor fusion, motor control)
  • So realistically 5 chips per robot x 7 billion robots = 35 billion chips

He-he-he! 😝


View attachment 92611



View attachment 92610
Hate to burst the bubble, but Musk wants a $1 trillion pay package from Tesla to be able to develop and keep control of the robots.
 
  • Wow
  • Fire
  • Sad
Reactions: 5 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Last edited:
  • Haha
  • Love
Reactions: 11 users

Rskiff

Regular
Bummer...

You've really Trumped me with that one.

In fact, I think you might have just Mar-a-Lago’d my whole thesis.
Not a bad gig if you can get it. I personally hope that he doesn't get it for many reasons.
 
  • Like
Reactions: 2 users

Rskiff

Regular
Not a bad gig if you can get it. I personally hope that he doesn't get it for many reasons.
Also he has said everyone on Earth will eventually have 8 bots each......yeah right!
 
  • Like
Reactions: 1 users

Gazzafish

Regular
Looks like the Investor Relations email address is incorrect or down??
 

Attachments

  • IMG_6560.jpeg
    IMG_6560.jpeg
    216.5 KB · Views: 29
  • Sad
  • Like
Reactions: 2 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Also he has said everyone on Earth will eventually have 8 bots each......yeah right!

  • One bot to fold my fitted sheets.
  • One bot to empty my cat's litter tray.
  • One bot to laugh at my jokes.
  • One bot to massage my bunions.
  • One bot to fight my other bots when they try to unionise.
  • One bot to match all my tupperware lids to their containers.
  • One bot to listen to me while I vent about the BRN share price.
  • And one bot to cut my hair.

PS: I was going to add one bot to pressure-wash my muddy driveway, but I don't want to get a dirty botty.🤭

ClW2.gif
 
  • Haha
  • Wow
  • Love
Reactions: 14 users
  • One bot to fold my fitted sheets.
  • One bot to empty my cat's litter tray.
  • One bot to laugh at my jokes.
  • One bot to massage my bunions.
  • One bot to fight my other bots when they try to unionise.
  • One bot to match all my tupperware lids to their containers.
  • One bot to listen to me while I vent about the BRN share price.
  • And one bot to cut my hair.

PS: I was going to add one bot to pressure-wash my muddy driveway, but I don't want to get a dirty botty.🤭

View attachment 92615
1761890100758.gif

Don’t forget this bot
 
  • Haha
Reactions: 8 users
No known link; however, they did pop up when I asked GPT which companies might first bring money to BrainChip next quarter. They're a Swiss company.

www.renata.com

1761891030217.png

1761891030265.png

1761891030313.png



Renata is a Swiss company, a subsidiary of the Swatch Group, that specializes in producing batteries and other electronic components. While they don't sell wearable devices themselves, their high-quality batteries and other components are used in a wide variety of wearable and connected products, including smartwatches, medical devices, and sensors.



Renata, a manufacturer of micro-batteries, would likely leverage BrainChip's Akida neuromorphic processor for on-device artificial intelligence (AI) capabilities within their battery-powered devices.[1] This would enable features such as predictive maintenance, enhanced power management, and intelligent sensor data processing directly at the edge, without constant cloud connectivity.[2] For instance, Akida could analyze battery discharge patterns to predict end-of-life more accurately or optimize power consumption for specific applications running on Renata's batteries.[3]
 
Last edited:
  • Love
  • Like
  • Thinking
Reactions: 6 users
And let’s not forget my favourite bot, the drunk bot.

1761891520383.gif
 
  • Haha
Reactions: 9 users

manny100

Top 20
The recent Parsons news was good, and there are also expectations that those who have been trialling for a number of years will come up soon.
We also see the longer-term Bollinger Band (20-week) width (red verticals below) at pretty close to, if not lower than, historic levels.
We also see the Bollinger Bands as tight as...
Ask your chatbot what this could result in.
Then ask the same question again, i.e. a very low 20-week Bollinger Band width, mentioning the good news of late and expectations of more.
The 20-week Bollinger Bands are a longish-term indicator.
DYOR.
BRN WK.png
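For anyone wanting to check the squeeze themselves, here is a minimal sketch of the 20-period band-width calculation (synthetic prices for illustration, not BRN data — DYOR):

```python
# Sketch of a 20-period Bollinger Band width (the "squeeze" described above).
from statistics import mean, stdev

def bollinger_width(closes, n=20, k=2.0):
    """Band width (upper - lower) / middle over the most recent n closes."""
    window = closes[-n:]
    mid = mean(window)
    sd = stdev(window)
    upper, lower = mid + k * sd, mid - k * sd
    return (upper - lower) / mid

quiet    = [1.00 + 0.01 * (i % 2) for i in range(20)]  # tight range -> narrow bands
volatile = [1.00 + 0.10 * (i % 2) for i in range(20)]  # wide swings -> wide bands

print(bollinger_width(quiet), bollinger_width(volatile))
```

A width that is narrow relative to its own history is commonly read as a volatility squeeze that tends to precede a larger move in either direction, which is the setup being described here.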
 
  • Like
  • Love
Reactions: 9 users

manny100

Top 20
BrainChip gets a mention below concerning Sony. It looks more like they benchmark against us and intend to go with their own?????
It's not clear.
Read the link. It could all be crap anyway. Are they suggesting that they are better than AKIDA? As far as I know, Sony has not 'invented' its own neuromorphic edge AI with on-chip learning.
Certainly, if Sony's chip needs a brain, it needs AKIDA.

"More on the Technology

Sony’s Event-Based Vision Sensors (EVS) deliver <1 µs latency and <10 mW power, processing only dynamic events for 10x efficiency over traditional CMOS sensors, ideal for high-speed applications like autonomous driving and robotics. The IMX500 AI chip provides 40 TOPS, enabling on-device intelligence with 5x lower power than GPUs for tasks like object detection. Compared to BrainChip’s Akida (40 TOPS/W), Sony’s EVS excels in vision-specific tasks, achieving 1,000 fps equivalent processing. SPAD sensors enhance low-light performance, detecting single photons for neuromorphic vision, making them suitable for healthcare imaging and industrial automation."
 
Last edited:
  • Thinking
  • Like
  • Wow
Reactions: 8 users
  • One bot to fold my fitted sheets.
  • One bot to empty my cat's litter tray.
  • One bot to laugh at my jokes.
  • One bot to massage my bunions.
  • One bot to fight my other bots when they try to unionise.
  • One bot to match all my tupperware lids to their containers
  • One bot to listen to me while I vent about the BRN share price.
  • And one bot to cut my hair.

PS: I was going to add one bot to pressure-wash my muddy drive-way, but I don't want to get a dirty botty.🤭

View attachment 92615
What about one bot to take off your elf boots and clean them?

SC
 
  • Haha
Reactions: 3 users

7für7

Top 20
BrainChip gets a mention below concerning Sony. It looks more like they benchmark against us and intend to go with their own?????
It's not clear.
Read the link. It could all be crap anyway. Are they suggesting that they are better than AKIDA? As far as I know, Sony has not 'invented' its own neuromorphic edge AI with on-chip learning.
Certainly, if Sony's chip needs a brain, it needs AKIDA.

"More on the Technology

Sony’s Event-Based Vision Sensors (EVS) deliver <1 µs latency and <10 mW power, processing only dynamic events for 10x efficiency over traditional CMOS sensors, ideal for high-speed applications like autonomous driving and robotics. The IMX500 AI chip provides 40 TOPS, enabling on-device intelligence with 5x lower power than GPUs for tasks like object detection. Compared to BrainChip’s Akida (40 TOPS/W), Sony’s EVS excels in vision-specific tasks, achieving 1,000 fps equivalent processing. SPAD sensors enhance low-light performance, detecting single photons for neuromorphic vision, making them suitable for healthcare imaging and industrial automation."


Yeah great 😂😂
We went from “Sony invested in BrainChip” to “Sony is better than BrainChip”.
 
  • Haha
  • Like
Reactions: 3 users

Diogenese

Top 20
BrainChip gets a mention below concerning Sony. It looks more like they benchmark against us and intend to go with their own?????
It's not clear.
Read the link. It could all be crap anyway. Are they suggesting that they are better than AKIDA? As far as I know, Sony has not 'invented' its own neuromorphic edge AI with on-chip learning.
Certainly, if Sony's chip needs a brain, it needs AKIDA.

"More on the Technology

Sony’s Event-Based Vision Sensors (EVS) deliver <1 µs latency and <10 mW power, processing only dynamic events for 10x efficiency over traditional CMOS sensors, ideal for high-speed applications like autonomous driving and robotics. The IMX500 AI chip provides 40 TOPS, enabling on-device intelligence with 5x lower power than GPUs for tasks like object detection. Compared to BrainChip’s Akida (40 TOPS/W), Sony’s EVS excels in vision-specific tasks, achieving 1,000 fps equivalent processing. SPAD sensors enhance low-light performance, detecting single photons for neuromorphic vision, making them suitable for healthcare imaging and industrial automation."

This is the Sony patent for layered pixel/processor (No mention of Akida):

US2024089577A1 IMAGING DEVICE, IMAGING SYSTEM, IMAGING METHOD, AND COMPUTER PROGRAM 20210129

1761899825462.png


An imaging device includes: an imaging section that has a pixel region where a plurality of pixels is arrayed, a readout unit control section that controls readout units each set as a part of the pixel region, a readout control section that controls readout of pixel signals from the pixels included in the pixel region for each of the readout units set by the readout unit control section, a recognition section that has a machine learning model trained on the basis of learning data, and a determination basis calculation section that calculates a determination basis of a recognition process performed by the recognition section. The recognition section performs the recognition process for each of the readout units. The determination basis calculation section calculates a determination basis for each of the readout units.

[0136] Described in Paragraph C herein will be an outline of a recognition process using a DNN (Deep Neural Network) applicable to the present disclosure. It is assumed in the present disclosure that a recognition process for image data (hereinafter simply referred to as an “image recognition process”) is performed using a CNN (Convolutional Neural Network) and an RNN (Recurrent Neural Network) included in the DNN.
 
  • Like
  • Sad
Reactions: 2 users