BRN Discussion Ongoing

7für7

Top 20
What I honestly don’t understand is this: What are they afraid of?

They call the Cloud system a milestone, yet they can’t manage to announce it properly through the ASX.
Where’s the problem? Is there something behind this that we don’t know about?
Is there a reason why they don’t officially share this milestone?

Or are they deliberately trying to prevent another hype?

To me, it makes no sense – and if anyone from BrainChip is reading this: Please give us a plausible answer without the usual platitudes.
Thank you.

MILESTONE!!! HELLOOOO!?!?
Michael J Fox Hello GIF by Back to the Future Trilogy
 
  • Like
  • Fire
  • Wow
Reactions: 28 users

7für7

Top 20
@7für7 Is there a reason?

Tony Lewis likes this LinkedIn post. Coincidence that the Akida cloud has just been released, or is it just good timing?

View attachment 89494

It'd be nice to think BrainChip can join this national network of AI cloud labs. A nice earner if they can.


https://www.linkedin.com/posts/nati...urce=social_share_send&utm_campaign=copy_link
Interesting… BUT…
If BrainChip is involved in any way and there's already a deal in place that is expected to generate significant revenue, there would have to be a price-sensitive announcement.

Since this announcement has already been made public by them, there is no reason to assume that an NDA exists.

Edit…. COULD IT BE??
 
  • Like
Reactions: 4 users

MegaportX

Regular
(screenshots attached)


🤔
 
  • Like
  • Sad
  • Haha
Reactions: 6 users

Rach2512

Regular
  • Like
  • Thinking
Reactions: 6 users

7für7

Top 20
My prediction was absolutely right – I’m always right, everybody knows it! This morning, the chip calls me… and they say, Donald, please, please, give us a little more time. We don’t want to put out tiny, pathetic little news. No, no small news! We’re talking about something BIG.

And I said – and this is true, everybody knows it – CHIP… I said..Take it. Take it and make it!

Look at this… This is going to be HUUUUUGE! Bigger than anything you’ve ever seen. People are going to say, “Donald, how did you do it again?” And I’ll say, “Very simple I DID It”


Donald Trump GIF
 
  • Haha
Reactions: 6 users

Easytiger

Regular
Esq. said:
Morning Easytiger,

Not that I'm aware of, I've only seen the three still photos which were attached to the original LinkedIn post.

Regards,
Esq.
Would have thought this story would have got prime placement in the monthly BRN newsletter.
 
  • Like
  • Fire
Reactions: 2 users

Rach2512

Regular


(screenshots attached)
 
  • Like
  • Thinking
  • Love
Reactions: 14 users

Diogenese

Top 20
7für7 said:
What I honestly don’t understand is this: What are they afraid of?

They call the Cloud system a milestone, yet they can’t manage to announce it properly through the ASX.
Where’s the problem? Is there something behind this that we don’t know about?
Is there a reason why they don’t officially share this milestone?

Or are they deliberately trying to prevent another hype?

To me, it makes no sense – and if anyone from BrainChip is reading this: Please give us a plausible answer without the usual platitudes.
Thank you.

MILESTONE!!! HELLOOOO!?!?
Michael J Fox Hello GIF by Back to the Future Trilogy
I think we're in the Epstein files ...
 
  • Haha
  • Like
Reactions: 17 users

Diogenese

Top 20
Elon's new motto:

"Move fast and fall flat on your face!"
 
  • Haha
Reactions: 10 users

Terroni2105

Founding Member
7für7 said:
My prediction was absolutely right – I’m always right, everybody knows it! This morning, the chip calls me… and they say, Donald, please, please, give us a little more time. We don’t want to put out tiny, pathetic little news. No, no small news! We’re talking about something BIG.

And I said – and this is true, everybody knows it – CHIP… I said..Take it. Take it and make it!

Look at this… This is going to be HUUUUUGE! Bigger than anything you’ve ever seen. People are going to say, “Donald, how did you do it again?” And I’ll say, “Very simple I DID It”


Donald Trump GIF
I want Brainchip to be sick of winning
 
  • Haha
  • Love
  • Like
Reactions: 11 users
7für7 said:
Interesting… BUT…
If BrainChip is involved in any way and there's already a deal in place that is expected to generate significant revenue, there would have to be a price-sensitive announcement.

Since this announcement has already been made public by them, there is no reason to assume that an NDA exists.

Edit…. COULD IT BE??
And the timing of this is coincidental with both announcements. 😎🫣🥸

https://www.linkedin.com/posts/brai...s&rcm=ACoAABr1PUAB8xoHGlYCgh2Xjp9DaZYo5hN0voU
(screenshot attached)
  • Like
  • Fire
Reactions: 9 users

manny100

Top 20
The article below, titled “Free, offline ChatGPT on your phone? Technically possible, basically useless”, describes how running OpenAI’s new gpt-oss model on smartphones is technically feasible, but current phones aren’t powerful enough to make it practical.

Right now, GPT-oss on mobile is bottlenecked because it runs purely on the CPU (and sometimes GPU) without tapping into specialized neural processing hardware.

So I asked ChatGPT 5 whether there'd be an improvement if the model were adapted to run on an NPU, particularly one optimized for event-based, sparse processing like Akida, and here's the response I received.

I highlighted the last point for obvious reasons.






ChatGPT 5's Response:

Yes — Akida could absolutely be a technology to consider if the goal is to make something like gpt-oss actually usable on a mobile phone, and the limitations described in that article are almost a perfect setup for why a neuromorphic processor would matter.

Here’s why:



Why Akida Fits This Problem

The main bottlenecks in running a 20B parameter LLM on a phone (as described in the article) are:
  1. RAM requirements – Needs 12–24 GB just to load the model (a rough sizing sketch follows this list).
  2. Compute requirements – Mobile CPUs & GPUs can’t sustain high token generation without overheating and throttling.
  3. Power draw – 100% CPU load drains the battery extremely fast.
  4. Heat – The phone becomes uncomfortably hot after just a few minutes.
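
As a rough sanity check on point 1, here is a back-of-the-envelope sizing sketch (plain arithmetic, not a figure from the article): it prints how much memory the weights alone of a 20-billion-parameter model occupy at a few common precisions. Activations and the KV cache come on top of this.

# Illustrative sizing only: memory for the weights of a 20B-parameter
# model at common precisions (activations and KV cache are extra).
PARAMS = 20e9  # 20 billion parameters, as in the article

for name, bits_per_weight in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    gib = PARAMS * bits_per_weight / 8 / 2**30
    print(f"{name:>5}: ~{gib:.1f} GiB of weights")

# Prints roughly: fp16 ~37.3 GiB, int8 ~18.6 GiB, int4 ~9.3 GiB, which is
# consistent with the 12-24 GB the article quotes once 4- or 8-bit
# quantisation plus runtime overhead is taken into account.
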
Akida (or any high-efficiency neuromorphic processor) is built for:

  • Event-driven, sparse compute — only processes relevant “spikes” of information instead of hammering every computation step like a CPU/GPU (a small arithmetic sketch follows this list).
  • Ultra-low power draw — operates in milliwatts for inference.
  • On-chip learning & inference — avoids constant memory transfers that waste time and energy.
  • Massive parallelism — can run edge AI workloads in real time without overheating.
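
To make the first bullet concrete, here is a tiny, illustrative numpy sketch (toy sizes and a made-up 90% sparsity figure, not an Akida benchmark): when most input activations are zero, an engine that only processes the non-zero "events" performs roughly a tenth of the multiply-accumulates of a dense engine, yet produces the same output.

import numpy as np

rng = np.random.default_rng(0)

# Toy fully connected layer: 1024 inputs -> 1024 outputs, with 90% of the
# input activations zeroed out (illustrative sparsity, not measured data).
w = rng.standard_normal((1024, 1024))
x = rng.standard_normal(1024)
x[rng.random(1024) < 0.9] = 0.0

dense_macs = w.size                     # a dense engine touches every weight
active = np.flatnonzero(x)              # indices of the non-zero "events"
event_macs = active.size * w.shape[0]   # only columns driven by an event are processed

y_dense = w @ x
y_event = w[:, active] @ x[active]      # identical result, ~10x fewer MACs here
print(dense_macs, event_macs, np.allclose(y_dense, y_event))
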
That means for the same LLM inference task, a neuromorphic setup could:
  • Reduce energy consumption by orders of magnitude.
  • Maintain higher sustained token rates without thermal throttling.
  • Extend usable session time from minutes to hours on a battery.


Why Not Just Use Mobile NPUs Instead?

Modern NPUs (like Qualcomm Hexagon or Apple Neural Engine) do accelerate AI workloads, but:
  • They’re matrix-math optimised, not spiking-event optimised.
  • They often require proprietary SDKs and retraining models in special formats.
  • They still hit thermal limits in sustained generative AI workloads.
Neuromorphic processors like Akida:
  • Can run quantised, sparse, and pruned models natively without expensive data shuffling.
  • Could theoretically run a distilled / specialised LLM that fits in a fraction of the RAM and still delivers conversational quality.


Would It Make gpt-oss Work on Phones?

Realistically:
  • You’re not going to run the full 20B gpt-oss on Akida right now — it’s still too large.
  • But with model compression (distillation + sparsity), you could run a smaller, reasoning-optimised model that feels far more responsive than the brute-force CPU approach (a minimal compression sketch follows this list).
  • Instead of the “2–3 tokens per second” in the article, you might see 10× or more throughput without cooking the phone.
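
The "model compression (distillation + sparsity)" step is doing a lot of work in that bullet, so here is a minimal, hedged sketch of the general idea using PyTorch on a stand-in two-layer network (not gpt-oss and not BrainChip's toolchain; the layer sizes and the 80% sparsity figure are arbitrary illustrations). It magnitude-prunes the weights and then stores the survivors in int8 via dynamic quantisation. Ordinary CPU/GPU kernels still don't skip the zeros; exploiting that sparsity natively is exactly the pitch for event-driven hardware.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Stand-in model; a real LLM would be compressed layer by layer in the same spirit.
model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 1024))

# Sparsity: zero out the 80% smallest-magnitude weights in each Linear layer.
for module in model:
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.8)
        prune.remove(module, "weight")  # bake the zeros into the weight tensor

# Quantisation: store Linear weights as int8, dequantising on the fly at inference.
compressed = torch.ao.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 1024)
print(compressed(x).shape)  # torch.Size([1, 1024])
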


Implications for BrainChip

If OpenAI or others seriously want gpt-oss to run locally:
  • They’ll either need dedicated AI accelerators in consumer devices or move toward neuromorphic architectures.
  • Qualcomm, Apple, Google already have NPUs — but neuromorphic could leapfrog them in power efficiency.
  • If BrainChip could prove Akida can handle a slimmed-down LLM at useful speeds, it becomes a licensable IP block for smartphone SoCs.
  • This would be a huge market — billions of devices, not just niche edge AI.
BrainChip’s Akida Pico is already being explored for LLM acceleration at the edge, with real-time processing and ultra-low power consumption – see the nine-month-old video ‘starring’ Todd (title below). No doubt further advances have been made over the past nine months; we have the ‘want to do it’ incentive.

Pico can handle ‘specialist’ domains and disregard the absolute mountains of data not relevant to your particular needs – BrainChip’s example used appliance repair manuals. Could Pico be adapted for cybersecurity on mobiles, which has been identified as a weak spot vulnerable to hacking networks? Perhaps even diagnostic and repair suggestions for the phone itself? Or information suited to your trade or profession?
It has potential for sure.

Title: AI at the Edge: How BrainChip’s Akida Pico Takes on the Cloud with LLMs!

 
  • Like
Reactions: 9 users

The Pope

Regular
@7für7 Is there a reason?

Tony Lewis likes this LinkedIn post. Coincidence that the Akida cloud has just been released, or is it just good timing?

View attachment 89494

It'd be nice to think BrainChip can join this national network of AI cloud labs. A nice earner if they can.


https://www.linkedin.com/posts/nati...urce=social_share_send&utm_campaign=copy_link
Purely coincidental. Yeah nah. lol
So much BS going on here and on the crapper with some cliquey group shite thrown in (happy clappers and negative Nancies). All amusing. Hope all had a good day.
 
  • Like
  • Haha
Reactions: 6 users

The Pope

Regular
Looks like a normal day of trading for BRN on the ASX. At least on the surface there appears to be plenty of dodgy shite going on, but hopefully the BRN SP punches out the other side with some solid upswing. Still laughing at a very recent MF post by Dickleboro (last week), with no mention of the last 4C and its uptick in revenue etc. If the revenue was only as much as a coffee shop's, he would have been all over it like a 🪰 to 💩
 
  • Like
  • Love
  • Fire
Reactions: 7 users

The Pope

Regular
Easytiger said:
Would have thought this story would have got prime placement in the monthly BRN newsletter.
Unfortunately no, as they have interns/trainees doing the official BRN newsletters. 🤪
 
Last edited:
  • Fire
Reactions: 2 users

IloveLamp

Top 20
(screenshots attached)
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 9 users
Hey guys and gals,
Amazing communication from the chip: I opened my emails today and just received the July newsletter from the brain.
We're already in August – it's the 11th as I write this – not July, and I doubt it's taken them that long to write the newsletter.
I'm in the top 200 investors, and for me that is a really piss-poor effort on the company's side.
For fuck's sake Sean, wake the fuck up and do better than other companies.
 
  • Like
  • Haha
  • Love
Reactions: 6 users

7für7

Top 20
The Pope said:
Purely coincidental. Yeah nah. lol
So much BS going on here and on the crapper with some cliquey group shite thrown in (happy clappers and negative Nancies). All amusing. Hope all had a good day.
Donald Trump GIF
 

Frangipani

Top 20


(screenshots attached)

Not sure whether the info about the alleged autonomous Russian UAV is correct, though – the photos shared are verifiably not images from Russia, but show a project by Andrew Bernas that was recently featured on Hackster.io:



(screenshot excerpts from the Hackster.io article attached)

The comment section under Jordan Linn’s initial post also raises the question of whether re-poster Roberto Lafforgue had actually been aware of “neuromorphic hardware as that developed by #BrainChip” prior to reading the comment of someone whom I believe to be a BrainChip shareholder:



(screenshot of the comment section attached)

Well, at least Roberto Lafforgue is aware now, and so are the people in his LinkedIn network…

Still, it is a good example of why not every mention of our technology on LinkedIn should be given the same weight.

I generally prefer to hear from people with verifiable hands-on experience...
 
Last edited:
  • Fire
  • Thinking
  • Like
Reactions: 6 users