BRN Discussion Ongoing

CHIPS

Regular


Something fascinating is happening inside the world’s data centers.
For more than a decade, GPUs have powered the AI revolution.

But as models grow to trillions of parameters and power costs surge, we’re hitting physical limits — and the race is on to reimagine what computing hardware should look like.

What’s emerging is nothing short of a quiet revolution — new architectures that might finally challenge the GPU’s monopoly on AI compute.



Cerebras Systems

Building wafer-scale engines — chips the size of dinner plates that can train massive models without the pain of GPU clusters. One chip, one model, no fragmentation.

Groq

Taking a radical approach with a deterministic Language Processing Unit (LPU) designed for ultra-low-latency inference. Already powering sovereign AI projects in the Middle East.

Graphcore & SambaNova

Re-architecting dataflow itself — bringing compute and memory closer together for huge efficiency gains in model training.

Lightmatter

Computing with light instead of electricity.
Its photonic chips use optical interconnects to link processors at fiber-optic speeds — a potential game-changer for hyperscale AI clusters.

BrainChip

Taking inspiration from biology, with neuromorphic chips that mimic spiking neurons to run AI at milliwatt power levels — perfect for edge AI.


Etched

Going all-in on specialization — a transformer-only ASIC that, if its claims hold, could replace hundreds of GPUs for a fraction of the cost.



Each of these players is betting on a different principle —
wafer-scale integration, deterministic execution, optical communication, or brain-like computation —
but they share one goal: break free from the GPU bottleneck.

It’s becoming clear that the future of AI compute will be heterogeneous.
GPUs will stay the workhorses, but they’ll soon be joined by optical, neuromorphic, and specialized accelerators — turning tomorrow’s data centers into orchestras of silicon.
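As a rough illustration of what the "spiking neurons" in the BrainChip blurb above actually mean: a leaky integrate-and-fire (LIF) neuron accumulates input, and only produces output when its membrane potential crosses a threshold, so nothing is computed between spikes. That event-driven sparsity is the source of the milliwatt power claims. A minimal sketch, with illustrative parameter values (not Akida's):

```python
def lif_spikes(input_current, v_thresh=1.0, v_reset=0.0, leak=0.9):
    """Simulate one leaky integrate-and-fire neuron over a list of inputs.

    Each step the membrane potential decays by `leak`, integrates the
    input, and emits a spike (1) when it crosses `v_thresh`, after
    which the potential resets. Parameters are illustrative only.
    """
    v = 0.0
    spikes = []
    for i in input_current:
        v = leak * v + i          # leaky integration
        if v >= v_thresh:
            spikes.append(1)      # fire
            v = v_reset           # reset after the spike
        else:
            spikes.append(0)      # silent: no downstream work needed
    return spikes

# Constant weak input: the neuron fires only occasionally.
print(lif_spikes([0.3] * 10))  # → [0, 0, 0, 1, 0, 0, 0, 1, 0, 0]
```

In hardware, the long runs of zeros are what matter: downstream circuits stay idle until a spike arrives, unlike a GPU, which multiplies every weight on every cycle.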

The Silicambrian Explosion

When BrainChip (BRN) suddenly hits mass adoption, it will spark a "Silicambrian Explosion"—a silicon/IP-fueled burst of neuromorphic innovation, mirroring the ancient Cambrian Explosion's rapid diversification of life, but this time evolving edge AI from niche experiments to everywhere-embedded intelligence. Just as the original event birthed complex multicellular life (including early brains), BrainChip's Akida chips and IP will explode into trillions of low-power, brain-like processors across IoT, autos, and beyond, turning sci-fi into silicon reality.

Buckle up: Darwin would approve of this evolutionary upgrade.

Frangipani

Top 20
Thank you @Frangipani , but I do not understand the results.
Can you explain further, or is there a page missing? The last page, "Experimental Results", does not state which chip it is.

Good evening, CHIPS,

the abbreviations DPU, TPU and NPU refer to three different hardware platforms, cf. the preceding presentation slide titled “AI Model Overview” or the following excerpt from the May 2025 post of mine I had tagged above:

https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-462394

“The revised GIASaaS (Global Instant Satellite as a Service, formerly Greek Infrastructure for Satellite as a Service) concept website by OHB Hellas is finally back online - and Akida now no longer shows up as UAV1 (see my post above) but as one of three SATs! 🛰

AKIDA, the envisioned SAT with the AKD1000 PCIe Card onboard, is slated to handle both Object Detection as well as Satellite Detection, while the planned KRIA and CORAL satellites (equipped with a Xilinx KRIA KV260 Vision AI Starter Kit and a Google Coral TPU Dev Board, respectively) are tasked with handling Vessel Detection, Cloud Detection and Fire Detection (for some reason OHB Hellas removed Flood Detection from the list of applications).

Please note that this is currently a concept website only.”


Flood detection was originally also assigned to both KRIA (DPU) and CORAL (TPU) (see here as well as under Experimental Results), but it is no longer listed as a selectable application on the updated GIASaaS website https://giasaas.eu/.


Generally speaking, it would of course be helpful to also have the video recordings to go along with these conference presentations…
 

7für7

Top 20
Good morning chips!

Will we finally start to rise, or will we continue to ride the same rollercoaster?


FJ-215

Regular
Good morning chips!

Will we finally start to rise, or will we continue to ride the same rollercoaster?

Refreshing to see some green this morning.
Hoping it continues, but without a revenue announcement I think we are defying gravity.
Come on Sean......59 days to Christmas!!!
 

HarryCool1

Regular
@Esq.111 is it time to get a little more excited about the buy side increase or am I just interpreting my organic herbal tea leaves wrong??
 

7für7

Top 20
@Esq.111 is it time to get a little more excited about the buy side increase or am I just interpreting my organic herbal tea leaves wrong??

Judging by all the posts shared over the weekend… each one sounding like at least 10 price-sensitive announcements… then we should easily hit a dollar this week 🤪 … and that's without even drinking any herbal tea!
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
Following on from the previous post on Anduril's EagleEye headset, if you take a look back at the BrainChip Technology Roadmap video (below), Jonathan Tapson makes the following comments.


26:28

"The next step is to integrate some cameras into the headset, which will allow it to interpret the scene both in front of and behind the person. You may have seen movies where special forces operate silently and use hand signals to communicate. I mentioned to one of our associates in this area that there’s no reason not to recognise hand gestures made by someone behind you, so you can literally have eyes in the back of your head. They were actually speechless at that possibility because apparently that’s a huge problem when operating silently; you can’t say, ‘Hey, look over here,’ but you still need to be able to signal to each other. That idea was apparently just mind-blowing, and you can see how we can grow the solution from that point.”



So I was very interested to have stumbled over this Instagram pic in my research this morning. 🧐



View attachment 92216





And this...


View attachment 92217

View attachment 92214





More on Anduril's EagleEye helmet as per the above posts.

Over the weekend I watched the BrainChip Technology Roadmap presentation again. There was quite a lot of information that I hadn't really registered or fully appreciated on my first viewing.



25:36 - Jonathan Tapson discussing Akida 3.0

"So this is an outcome that we've actually proposed to the US defense industry and there's a very high probability of us getting a very positive outcome there. So every soldier now wears a headset and it includes a radio and basic hearing protection. And we actually want to take that headset to the next level. So the things we can already do; we can clear up speech and noise extremely well and we can already answer simple questions, so questions like I have a casualty who is bleeding from the head - what do I do? It's very useful if the soldier can just say that into the headset and get some kind of helpful answer straight away. Or how do I get this vehicle that I've never driven before into gear and moving. It's able to actually give those kind of responses. AND WE HAVE THOSE PARTS WORKING ON FPGA FOR DEMONSTRATION ALREADY."


So, the parts were already working on FPGA for demonstration and the slide from the presentation shows that AKD 3 FPGA is due Q2 2026.


Screenshot 2025-10-27 at 10.05.00 am.png







Bearing this timing in mind, this Defense Scoop article (see below) says " Anduril Industries will deliver roughly 100 units of its new AI-powered helmet and digitally-enhanced eyewear system — EagleEye — to select U.S. Army personnel during the second quarter of the upcoming calendar year, the company’s founder told reporters."





Screenshot 2025-10-27 at 10.17.01 am.png







Seems like 100 units would be samples only, like a small engineering run, which would typically be used to evaluate new sensor, compute, or firmware combinations.

So, my point being: the Akida 3.0 FPGA parts are working for demonstration right now, and the AKD 3 FPGA due date aligns with the Q2 2026 delivery outlined for Anduril's samples.

Is this just remarkably coincidental timing, or something else?

Sounds like perfect timing for a funded prototype that feeds into a 2026–27 soldier evaluation followed by production if tests succeed.

Still speculation at this stage obviously, as BrainChip hasn't been publicly named as a partner, but Luckey told reporters that Anduril "plans to announce more partners over the next year". So we'll just have to keep an eye out for any new announcements to see if anything solid is confirmed.
 