BRN Discussion Ongoing

🤣
So I guess the author is saying BrainChip doesn't have a credible path to profitability...

We've got a path to that.. And some! 😛

Nobody can say for sure how long that will take, but to say it's not there, considering the partners we have and the ecosystem we've developed to this point, is just plain ignorance.
Atlassian is still losing over $600 million US, yet has a market cap of $68 billion US.
 
  • Like
  • Fire
Reactions: 5 users

Potato

Regular
1659875197019.png
 
  • Haha
  • Like
Reactions: 28 users

Townyj

Ermahgerd

View attachment 13532

In order to meet the Navy’s need for a spiking neural network testing platform, ChromoLogic proposes to develop a Spiking Neural Network Modeler (SpiNNMo) capable of simulating a variety of neuromorphic hardware platforms. SpiNNMo is able to extract relevant performance parameters from a neuromorphic chip and then predict the chip’s performance on new networks and data. In this way SpiNNMo can predict accuracy, latency and energy usage for a wide variety of hardware platforms on a given neural network and dataset. This will allow the Navy to test the performance of new spiking neural network architectures and chipsets before the chips are widely available and therefore speed neuromorphic adoption.
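For what it's worth, here's a toy sketch of the kind of chip-characterization model the abstract describes: extract a few per-chip performance parameters, then predict energy and latency for a given network on each chip. Every name and number below is invented for illustration; this is not ChromoLogic's actual SpiNNMo model.

```python
from dataclasses import dataclass

@dataclass
class ChipProfile:
    """Hypothetical per-chip parameters a modeler like SpiNNMo might extract."""
    energy_per_synaptic_op_nj: float  # nanojoules per synaptic event
    latency_per_layer_us: float       # microseconds of pipeline delay per layer

def predict(profile: ChipProfile, num_layers: int, synaptic_ops: int):
    """Predict energy (mJ) and latency (ms) for one inference on this chip."""
    energy_mj = synaptic_ops * profile.energy_per_synaptic_op_nj * 1e-6
    latency_ms = num_layers * profile.latency_per_layer_us * 1e-3
    return energy_mj, latency_ms

# Two invented chip profiles, compared on the same network workload.
chip_a = ChipProfile(energy_per_synaptic_op_nj=0.02, latency_per_layer_us=50.0)
chip_b = ChipProfile(energy_per_synaptic_op_nj=0.10, latency_per_layer_us=20.0)

for name, chip in [("chip_a", chip_a), ("chip_b", chip_b)]:
    e, l = predict(chip, num_layers=16, synaptic_ops=2_000_000)
    print(f"{name}: {e:.2f} mJ, {l:.2f} ms")
```

The point of such a model is exactly what the abstract claims: once the per-chip constants are measured, you can rank hardware on a new network without owning the silicon.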


I wonder if they are talking about Akida 2000 or even 3000


Ok... This is interesting.. The company doing the Neural Network Modeler is involved with cyber security and biomedical.


2022-08-07_20-30-29.jpg
 
  • Like
  • Fire
Reactions: 9 users

Bravo

If ARM were an arm, BRN would be its biceps 💪!
  • Love
  • Like
  • Haha
Reactions: 11 users

robsmark

Regular
This is true. It can only go right!
So long as it goes right to $20 a share…
 
  • Like
  • Haha
  • Fire
Reactions: 12 users
Atlassian is still losing over $600 million US, yet has a market cap of $68 billion US.
Just checked out their stats..
A lot of big numbers, but still burning big money too.

Man.. I don't think a lot of us, even those with lofty targets, are going to be prepared for where we're headed..

We're going to take ARM's IP revenue model and show the world what it can really do.

Like Diogenese said, we will stand on the shoulders of those tech companies that have laid the foundations for our dominance of the Edge A.I. markets.

Okay, okay.. Barf bags are in aisle 3, for those that need them...
 
  • Like
  • Fire
  • Haha
Reactions: 37 users

Xhosa12345

Regular

My fave song
So many sausage fests running companies.
And also look at the world and why it sucks. Sausage after sausage...
I have a sausage, but the world needs sausage-less leaders. And the sausageless need to stand up and not be so dominated.. it's slowly happening, very slowly unfortunately.
 
  • Like
  • Thinking
  • Wow
Reactions: 9 users

goodvibes

Regular

JWPM CONSULTING​

BrainChip ASX: Akida neuromorphic processor. AI at the edge.​



Don't know if it's already been posted… but it's really a very good article about BrainChip and its future…

Akida™ will soon be the Model-T of neuromorphic processors​

 
  • Like
  • Love
  • Fire
Reactions: 52 users

JWPM CONSULTING​

BrainChip ASX: Akida neuromorphic processor. AI at the edge.​



Don't know if it's already been posted… but it's really a very good article about BrainChip and its future…

Akida™ will soon be the Model-T of neuromorphic processors​

It should have been posted before; it's over 2 years old! 😛

But I'd never seen it, and it really is a great article, quite in-depth..

Thanks for posting 👍
 
  • Like
  • Fire
Reactions: 17 users

Dhm

Regular

JWPM CONSULTING​

BrainChip ASX: Akida neuromorphic processor. AI at the edge.​



Don't know if it's already been posted… but it's really a very good article about BrainChip and its future…

Akida™ will soon be the Model-T of neuromorphic processors​

It is a great article, in spite of its age. But what motivated JWPM Consulting to do such an in-depth article in the first place?
 
Last edited:
  • Like
Reactions: 7 users

Nantrix

Emerged
  • Like
  • Fire
  • Love
Reactions: 10 users

Deleted member 118

Guest
 
  • Like
  • Haha
Reactions: 6 users

MDhere

Top 20
Ooh La La..
lol
Apologies if already posted
I like it, I like it a lot, thanks for the post Nantrix. Though the bit about it making an attractive takeover candidate for a global player can hold its horses a bit; ain't no takeover happening in single $ digits, try high double $ digits.
But I do like the bit about: "It is no coincidence that the Akida chip was recently accepted into the AI Partner Program by ARM. Mercedes-Benz also relies on the novel technology for its EQXX." Nice article and a good positive blog.
 
  • Like
  • Love
  • Haha
Reactions: 14 users

stuart888

Regular
Lots of smart people here, thanks for helping us newbies at the code level. I have been trying to wrap my brain around the Renesas 2-node license. Dio might have mentioned 2 nodes and 8 NPUs. Perhaps the term "node" is not working for me. The NPUs are Neural Processing Units, I assume. The nodes, layers, and NPUs get foggy for those of us JavaScript/SQL programmers!

BrainChip's big advantage is tangled up in the SNN architecture. I also realize that the output of Akida is often fed to another receiving function, meaning we are part of a solution, one link in a chain to the final message to the user. Like NASA, for example: Akida, after being dormant (no spikes! nothing over threshold!) for months, awakes to send a signal that an asteroid is approaching, which might fire up the higher-energy next step.

The more everyone talks about the architecture details, the better, in my opinion. Layers, nodes, activation functions, all of it. Keep it coming, very helpful.
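On the node/NPU vocabulary, my (unofficial) mental model is: a node is a hardware tile containing several NPUs, and the runtime maps the network's layers onto them. A toy round-robin sketch in plain Python; the counts and the mapping policy are assumptions for illustration, not BrainChip's documented scheduler:

```python
# Toy illustration of the node/NPU/layer vocabulary. The numbers and the
# mapping policy here are assumptions, not BrainChip specifications.
NODES = 2          # e.g. a "2-node" license
NPUS_PER_NODE = 4  # 2 nodes x 4 NPUs = 8 NPUs total

layers = ["conv_0", "separable_1", "separable_2",
          "separable_3", "separable_4", "separable_5"]

# Naive round-robin: assign each layer's work to the next NPU in turn.
assignment = {}
for i, layer in enumerate(layers):
    npu = i % (NODES * NPUS_PER_NODE)   # which NPU (0..7)
    node = npu // NPUS_PER_NODE         # which node that NPU lives on
    assignment[layer] = (node, npu)

for layer, (node, npu) in assignment.items():
    print(f"{layer}: node {node}, NPU {npu}")
```

The real mapping is surely smarter (it would account for layer size and data movement), but the vocabulary separation is the point: layers belong to the network, nodes and NPUs belong to the hardware.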


1659904692833.png
 
  • Like
  • Fire
  • Wow
Reactions: 15 users

uiux

Regular
Lots of smart people here, thanks for helping us newbies at the code level. I have been trying to wrap my brain around the Renesas 2-node license. Dio might have mentioned 2 nodes and 8 NPUs. Perhaps the term "node" is not working for me. The NPUs are Neural Processing Units, I assume. The nodes, layers, and NPUs get foggy for those of us JavaScript/SQL programmers!

BrainChip's big advantage is tangled up in the SNN architecture. I also realize that the output of Akida is often fed to another receiving function, meaning we are part of a solution, one link in a chain to the final message to the user. Like NASA, for example: Akida, after being dormant (no spikes! nothing over threshold!) for months, awakes to send a signal that an asteroid is approaching, which might fire up the higher-energy next step.

The more everyone talks about the architecture details, the better, in my opinion. Layers, nodes, activation functions, all of it. Keep it coming, very helpful.


View attachment 13563


Hi Stuart if you are a coder you might appreciate the working examples I've put together using Akida:



To understand layers of the network, this is an example of the edge learning enabled mobilenet model:

Model Summary
________________________________________________
Input shape    Output shape   Sequences   Layers
================================================
[224, 224, 3]  [1, 1, 10]     1           16
________________________________________________

SW/conv_0-akida_edge_layer (Software)
________________________________________________________________
Layer (type)                  Output shape     Kernel shape
================================================================
conv_0 (InputConv.)           [112, 112, 16]   (3, 3, 3, 16)
________________________________________________________________
separable_1 (Sep.Conv.)       [112, 112, 32]   (3, 3, 16, 1)
                                               (1, 1, 16, 32)
________________________________________________________________
separable_2 (Sep.Conv.)       [56, 56, 64]     (3, 3, 32, 1)
                                               (1, 1, 32, 64)
________________________________________________________________
separable_3 (Sep.Conv.)       [56, 56, 64]     (3, 3, 64, 1)
                                               (1, 1, 64, 64)
________________________________________________________________
separable_4 (Sep.Conv.)       [28, 28, 128]    (3, 3, 64, 1)
                                               (1, 1, 64, 128)
________________________________________________________________
separable_5 (Sep.Conv.)       [28, 28, 128]    (3, 3, 128, 1)
                                               (1, 1, 128, 128)
________________________________________________________________
separable_6 (Sep.Conv.)       [14, 14, 256]    (3, 3, 128, 1)
                                               (1, 1, 128, 256)
________________________________________________________________
separable_7 (Sep.Conv.)       [14, 14, 256]    (3, 3, 256, 1)
                                               (1, 1, 256, 256)
________________________________________________________________
separable_8 (Sep.Conv.)       [14, 14, 256]    (3, 3, 256, 1)
                                               (1, 1, 256, 256)
________________________________________________________________
separable_9 (Sep.Conv.)       [14, 14, 256]    (3, 3, 256, 1)
                                               (1, 1, 256, 256)
________________________________________________________________
separable_10 (Sep.Conv.)      [14, 14, 256]    (3, 3, 256, 1)
                                               (1, 1, 256, 256)
________________________________________________________________
separable_11 (Sep.Conv.)      [14, 14, 256]    (3, 3, 256, 1)
                                               (1, 1, 256, 256)
________________________________________________________________
separable_12 (Sep.Conv.)      [7, 7, 512]      (3, 3, 256, 1)
                                               (1, 1, 256, 512)
________________________________________________________________
separable_13 (Sep.Conv.)      [1, 1, 512]      (3, 3, 512, 1)
                                               (1, 1, 512, 512)
________________________________________________________________
spike_generator (Sep.Conv.)   [1, 1, 2048]     (3, 3, 512, 1)
                                               (1, 1, 512, 2048)
________________________________________________________________
akida_edge_layer (Fully.)     [1, 1, 10]       (1, 1, 2048, 10)
________________________________________________________________

Learning Summary
____________________________________________
Learning Layer     # Input Conn.   # Weights
============================================
akida_edge_layer   2048            350
____________________________________________
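One thing the summary shows nicely: each Sep.Conv. layer splits a standard convolution into a depthwise kernel (3, 3, C_in, 1) plus a pointwise kernel (1, 1, C_in, C_out). A quick sanity check of the weight savings, in plain Python, using the shapes from the summary:

```python
def separable_params(c_in, c_out, k=3):
    """Weights in a depthwise-separable conv:
    (k, k, c_in, 1) depthwise + (1, 1, c_in, c_out) pointwise."""
    return k * k * c_in + c_in * c_out

def standard_params(c_in, c_out, k=3):
    """Weights in a standard conv of the same shape: (k, k, c_in, c_out)."""
    return k * k * c_in * c_out

# separable_12 from the summary: 256 channels in, 512 out
c_in, c_out = 256, 512
sep = separable_params(c_in, c_out)   # 3*3*256 + 256*512 = 133,376
std = standard_params(c_in, c_out)    # 3*3*256*512      = 1,179,648
print(f"separable: {sep:,} weights vs standard: {std:,} ({std / sep:.1f}x more)")
```

Roughly a 9x reduction in weights for that layer, which is a big part of why MobileNet-style models fit on small edge hardware in the first place.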
 
  • Like
  • Wow
  • Fire
Reactions: 33 users

stuart888

Regular
Higher-level thinking about the architecture of the SNN model. No loops? I was sitting here thinking about no-loop computing. Wow, that is the best architecture ever for a zillion use cases. Go spike go.

Ain't no loops. Spikes = no loops = energy efficiency = everyone wants it!

There are use cases for loops! Akida = no loops? Very interesting, this BrainChip; it has me captured.
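The "no loops" intuition is really about event-driven computation: work scales with *change* (spikes), not with frames. A toy sketch of the idea in plain Python; all the numbers are invented and none of this is Akida code, it just contrasts dense frame processing with only-on-change processing:

```python
import random

random.seed(0)

def frame_based(frames):
    """Dense processing: every value of every frame costs one op."""
    ops = 0
    for frame in frames:
        ops += len(frame)  # process all pixels regardless of change
    return ops

def event_based(frames, threshold=0.1):
    """Event-driven: only values that changed beyond a threshold cost an op."""
    ops = len(frames[0])  # first frame processed in full
    prev = frames[0]
    for frame in frames[1:]:
        ops += sum(1 for a, b in zip(prev, frame) if abs(a - b) > threshold)
        prev = frame
    return ops

# A mostly-static scene: 100 frames of 1,000 "pixels", one change per frame.
base = [random.random() for _ in range(1000)]
frames = [base[:] for _ in range(100)]
for f in frames[1:]:
    f[random.randrange(1000)] += 1.0  # one big change per frame

print("dense ops:", frame_based(frames))
print("event ops:", event_based(frames))
```

On a mostly-static input the event-driven path does orders of magnitude less work, which is the energy story behind spiking hardware: a quiet sensor costs almost nothing.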
 
  • Like
  • Fire
  • Love
Reactions: 25 users

Slade

Top 20
001 If akida go to 003
002 Abort and seek help from team brn
003 You win proceed with confidence
004 ……
 
  • Like
  • Haha
  • Thinking
Reactions: 17 users

Deadpool

Did someone say KFC
Not a lot of content this morning. Is everyone in awe of uiux & Stuart888's vast intellect? 👨‍🏫🧠o_O
crickets GIF
 
  • Haha
  • Like
Reactions: 15 users

jtardif999

Regular

JWPM CONSULTING​

BrainChip ASX: Akida neuromorphic processor. AI at the edge.​



Don't know if it's already been posted… but it's really a very good article about BrainChip and its future…

Akida™ will soon be the Model-T of neuromorphic processors​

The article is from July 2020. I particularly like the following paragraph:
“Brainchip's first chip is now in production and technology company's have been working on applications for over 12 months.”
Akida IP was released in May 2019, so basically some (perhaps many) of our early access partners have been working on applications for more than 3 years now. Time to market for ARM customers turning designs into products has been described as about 3 to 4 years. Seems to me things should start to explode over the next 12 months for Akida-related products. AIMO.
 
  • Like
  • Fire
  • Love
Reactions: 24 users

AusEire

Founding Member.
No one here this morning
Confused Wile E Coyote GIF by Looney Tunes
 
  • Haha
  • Like
  • Love
Reactions: 24 users