BRN Discussion Ongoing

IloveLamp

Top 20
Screenshot_20230703_010020_LinkedIn.jpg
 
  • Like
  • Fire
Reactions: 26 users

IloveLamp

Top 20
Screenshot_20230703_010329_LinkedIn.jpg
 
  • Like
  • Fire
Reactions: 12 users

schuey

Regular
 
  • Like
Reactions: 2 users

equanimous

Norse clairvoyant shapeshifter goddess
Good Morning Chippers,

Just looking, or should I say trying, to look at BrainChip's Twitter page...

It pops up with the "BrainChip tapes out Akida 1500" article, then locks up and disappears?????

I am not a Twitter subscriber, hence cannot log in to see the article.

Could get a little spicy today.

🍾 possibly for breakfast.

😃.

* If a savvy Chipper who has a Twitter account could have a quick look to confirm or not, it would be appreciated. If it is incorrect I shall delete this post ASAP, but the champagne's staying in the fridge.


Regards,
Esq.
Hi ESQ,

Just checked but nothing there
 
  • Like
  • Fire
Reactions: 5 users

Esq.111

Fascinatingly Intuitive.
Hi ESQ,

Just checked but nothing there
Morning Equanimous,

Cheers for checking.

Normally I can speak into my phone, "Brainchip Twitter", and it takes me directly to our Twitter page, where I can peruse to a limited degree.

Today, for whatever reason, this article popped up first, and when clicked on it went to Twitter, then kept locking up.

Thanks once again, I'll leave the sleuthing to the professionals.

Regards,
Esq.
 
  • Like
  • Thinking
  • Fire
Reactions: 5 users

Shadow59

Regular
We are now in Q3 2023.

"BrainChip Introduces Second-Generation Akida Platform.....
General availability will follow in Q3’ 2023."


Hoping to see something soon.🤞
 
  • Like
  • Love
  • Fire
Reactions: 39 users

hotty4040

Regular
Morning Equanimous,

Cheers for checking.

Normally I can speak into my phone, "Brainchip Twitter", and it takes me directly to our Twitter page, where I can peruse to a limited degree.

Today, for whatever reason, this article popped up first, and when clicked on it went to Twitter, then kept locking up.

Thanks once again, I'll leave the sleuthing to the professionals.

Regards,
Esq.

We're a bit light on discussion this morning. Any thoughts on the new/fresh patent that we've been informed of? Surely this is good news and would warrant some sort of announcement, I would have thought, or at least some soothing comments etc. from holders.

Ah well, it is "Monday morning", so that might explain things to a degree.

GLTAH for the commencement of this new week in the market; maybe things will improve as we progress some more.

Akida Ballista comrades.

hotty...
 
  • Like
Reactions: 12 users
We're a bit light on discussion this morning. Any thoughts on the new/fresh patent that we've been informed of? Surely this is good news and would warrant some sort of announcement, I would have thought, or at least some soothing comments etc. from holders.

Ah well, it is "Monday morning", so that might explain things to a degree.

GLTAH for the commencement of this new week in the market; maybe things will improve as we progress some more.

Akida Ballista comrades.

hotty...
I think everyone's trying to regather themselves from all the EOFY shenanigans, haha. A new FY, looking forward to what lies ahead :)
 
  • Like
Reactions: 11 users

robsmark

Regular
We're a bit light on discussion this morning. Any thoughts on the new/fresh patent that we've been informed of? Surely this is good news and would warrant some sort of announcement, I would have thought, or at least some soothing comments etc. from holders.

Ah well, it is "Monday morning", so that might explain things to a degree.

GLTAH for the commencement of this new week in the market; maybe things will improve as we progress some more.

Akida Ballista comrades.

hotty...
I think this is a huge milestone for reducing future competition and possibly forcing the hand of Akida adoption, but I don't expect any short-term benefit or rerating of the SP.
 
Last edited:
  • Like
  • Thinking
Reactions: 6 users
Good to see power performance discussed.

 
  • Like
  • Fire
Reactions: 14 users

Boab

I wish I could paint like Vincent
We are now in Q3 2023.

"BrainChip Introduces Second-Generation Akida Platform.....
General availability will follow in Q3’ 2023."


Hoping to see something soon.🤞
Segmentation.jpg
 
  • Like
  • Love
  • Fire
Reactions: 16 users

MrNick

Regular
 
  • Like
  • Fire
  • Love
Reactions: 18 users

buena suerte :-)

BOB Bank of Brainchip
  • Like
Reactions: 11 users

Kachoo

Regular
We're a bit light on discussion this morning. Any thoughts on the new/fresh patent that we've been informed of? Surely this is good news and would warrant some sort of announcement, I would have thought, or at least some soothing comments etc. from holders.

Ah well, it is "Monday morning", so that might explain things to a degree.

GLTAH for the commencement of this new week in the market; maybe things will improve as we progress some more.

Akida Ballista comrades.

hotty...
It's been stated before that the board obviously sees value in all patents, but not all of them are announcement-worthy. It may just be added armour to one already issued.

I believe this was stated in one of Sean's recent Q&As.
 
  • Like
Reactions: 5 users

Kachoo

Regular
It's been stated before that the board obviously sees value in all patents, but not all of them are announcement-worthy. It may just be added armour to one already issued.

I believe this was stated in one of Sean's recent Q&As.
Then again it would require board approval and today is only Sunday in the US so we may have to wait till Monday.
 
  • Like
Reactions: 3 users

Diogenese

Top 20
Nice pick up Mr Romper,

The full documentation is not up on Espacenet yet (published 20230629).

Inventors:
MCLELLAND DOUGLAS [FR]; CARLSON KRISTOFOR D [US]; JOHNSON KEITH WILLIAM [AU]; JOSHI MILIND [AU]

No PvdM as inventor.

Milind Joshi is our patent attorney in Perth.



US2023206066A1 SPIKING NEURAL NETWORK

Disclosed herein are system, method, and computer program embodiments for an improved spiking neural network (SNN) configured to learn and perform unsupervised, semi-supervised, and supervised extraction of features from an input dataset. An embodiment operates by receiving a modification request to modify a base neural network, having N layers and a plurality of spiking neurons, trained using a primary training dataset. The base neural network is modified to include supplementary spiking neurons in the Nth or N + 1th layer of the base neural network. The embodiment includes receiving a secondary training dataset and determining membrane potential values of one or more supplementary spiking neurons in the Nth or Nth + 1 layer which learn features based on secondary training data set to select a supplementary/winning spiking neuron. The embodiment performs a learning function for the modified neural network based on the winning spiking neuron.


View attachment 39151



View attachment 39152
Looks like the supplementary spiking neurons are 1002, 1004.

One thing it is designed to do is to adjust the multi-bit weights, so I guess that's why they need the ALU.

This patent, which was filed in December 2021, addresses supervised learning, semi-supervised learning, and autonomous learning in multi-bit (4-bit) weights and activations.

Autonomous learning is a significant feature.

Akida 1000 has 4-bit weights and activations, so this must be an alternative improved technique.

The patent provides for alternative ways of recalculating the weights and activations apart from the ALU, but the ALUs are shown at 152 in figure 1.
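For anyone wanting a feel for what the abstract describes, here's a toy sketch of the mechanism: append supplementary spiking neurons to the final layer, pick the winner by membrane potential, and nudge the winner's 4-bit weights toward the input pattern. Every name, range and update rule below is my own invention for illustration, not BrainChip's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize4(w):
    """Clamp and round weights to signed 4-bit integers (multi-bit weights, Akida-style)."""
    return np.clip(np.round(w), -8, 7)

class FinalLayer:
    def __init__(self, n_inputs, n_neurons):
        # One row of 4-bit integer synaptic weights per spiking neuron.
        self.weights = quantize4(rng.integers(-8, 8, size=(n_neurons, n_inputs)))

    def add_supplementary(self, n_new):
        """Append supplementary spiking neurons for a secondary training set."""
        new_w = quantize4(rng.integers(-8, 8, size=(n_new, self.weights.shape[1])))
        self.weights = np.vstack([self.weights, new_w])
        return range(self.weights.shape[0] - n_new, self.weights.shape[0])

    def membrane_potentials(self, spikes):
        # Integer dot product of the input spike vector with each neuron's weights.
        return self.weights @ spikes

    def learn(self, spikes, supplementary):
        """Winner-take-all: the supplementary neuron with the highest membrane
        potential moves its weights one step toward the input pattern."""
        pots = self.membrane_potentials(spikes)
        winner = max(supplementary, key=lambda i: pots[i])
        self.weights[winner] = quantize4(
            self.weights[winner] + np.sign(spikes - self.weights[winner])
        )
        return winner

layer = FinalLayer(n_inputs=16, n_neurons=4)
new_ids = layer.add_supplementary(2)    # neurons that learn the secondary data set
pattern = rng.integers(0, 2, size=16)   # binary input spikes
winner = layer.learn(pattern, new_ids)
```

That's the gist of "determining membrane potential values ... to select a supplementary/winning spiking neuron" and "performing a learning function ... based on the winning spiking neuron", stripped of all the hardware detail.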

 
  • Like
  • Love
  • Fire
Reactions: 26 users

Xray1

Regular
Yes @Mt09 I'm feeling it too. Sean did say his Sunday afternoon/evening hookup with all the company sales and marketing team will hopefully have something very positive to announce.
Can't remember if he said it at the AGM or the interview with Noah the next day?
He has said a lot of things at the 2 AGMs and in podcasts, but imo nothing really seems to eventuate from it ... that's why I suppose the Co is currently in a "Strike One" zone.
 
  • Like
Reactions: 4 users

Xray1

Regular
Nice pick up Mr Romper,

The full documentation is not up on Espacenet yet (published 20230629).

Inventors:
MCLELLAND DOUGLAS [FR]; CARLSON KRISTOFOR D [US]; JOHNSON KEITH WILLIAM [AU]; JOSHI MILIND [AU]

No PvdM as inventor.

Milind Joshi is our patent attorney in Perth.



US2023206066A1 SPIKING NEURAL NETWORK

Disclosed herein are system, method, and computer program embodiments for an improved spiking neural network (SNN) configured to learn and perform unsupervised, semi-supervised, and supervised extraction of features from an input dataset. An embodiment operates by receiving a modification request to modify a base neural network, having N layers and a plurality of spiking neurons, trained using a primary training dataset. The base neural network is modified to include supplementary spiking neurons in the Nth or N + 1th layer of the base neural network. The embodiment includes receiving a secondary training dataset and determining membrane potential values of one or more supplementary spiking neurons in the Nth or Nth + 1 layer which learn features based on secondary training data set to select a supplementary/winning spiking neuron. The embodiment performs a learning function for the modified neural network based on the winning spiking neuron.


View attachment 39151



View attachment 39152
Looks like the supplementary spiking neurons are 1002, 1004.

One thing it is designed to do is to adjust the multi-bit weights, so I guess that's why they need the ALU.
Diogenese ........... Interesting point that there is "No PvdM" as inventor, nor Anil.
Could there be a conflict somewhere for such non-inclusion?
 
  • Haha
  • Thinking
Reactions: 2 users

Diogenese

Top 20
This patent, which was filed in December 2021, addresses supervised learning, semi-supervised learning, and autonomous learning in multi-bit (4-bit) weights and activations.

Autonomous learning is a significant feature.

Akida 1000 has 4-bit weights and activations, so this must be an alternative improved technique.

The patent provides for alternative ways of recalculating the weights and activations apart from the ALU, but the ALUs are shown at 152 in figure 1.

View attachment 39177


View attachment 39176

The claims set out the specifics of the invention. In this case the invention is directed to machine learning, and it does this by adding NPUs to the final layer of a NN. The final layer is where the learning takes place. The "secondary training data set" spikes could be the activation event spikes from the sensor (camera/microphone/...), the primary training data set having been provided by the model library data used in initial configuration.

Supplementary spiking neurons (NPUs) are added to the final layer (Fig 10, 1002, 1004) where Akida does its learning, presumably to incorporate newly learned features. The ALUs of Figure 1 would be involved in the step of "performing a learning function ... by performing a synaptic weight value variation ...", bearing in mind that this is for multi-bit weights and activations.
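As a toy illustration of that "synaptic weight value variation" step with multi-bit weights: the ALU's job boils down to a saturating integer update, so the weight can't escape its bit width. This helper is purely hypothetical, just to show the arithmetic:

```python
def vary_weight(w, delta, bits=4):
    """Saturating integer update of one synaptic weight, as an ALU might apply it.
    Hypothetical helper: clamps the result to the signed range of `bits`."""
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    return max(lo, min(hi, w + delta))

print(vary_weight(6, 3))    # saturates at the 4-bit max, 7
print(vary_weight(-5, -9))  # saturates at the 4-bit min, -8
```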


 
  • Like
  • Love
  • Fire
Reactions: 18 users

Diogenese

Top 20
The claims set out the specifics of the invention. In this case the invention is directed to machine learning, and it does this by adding NPUs to the final layer of a NN. The final layer is where the learning takes place. The "secondary training data set" spikes could be the activation event spikes from the sensor (camera/microphone/...), the primary training data set having been provided by the model library data used in initial configuration.

Supplementary spiking neurons (NPUs) are added to the final layer (Fig 10, 1002, 1004) where Akida does its learning, presumably to incorporate newly learned features. The ALUs of Figure 1 would be involved in the step of "performing a learning function ... by performing a synaptic weight value variation ...", bearing in mind that this is for multi-bit weights and activations.


View attachment 39178
Aha! Penny's dropped. Remember when 8-bit weights were announced?

This change may be to accommodate 8-bit weights/activations.

The ALUs may be more efficient at handling the multi-bit "spikes" than the original Akida configuration.

I found this Sanskrit engraving on Eric von Dunnycan's tomb:

https://doc.brainchipinc.com/_modules/akida_models/imagenet/model_mobilenet.html
...
weight_quantization (int, optional): sets all weights in the model to have a particular quantization bitwidth except for the weights in the first layer.
Defaults to 0.

* '0' implements floating point 32-bit weights.
* '2' through '8' implements n-bit weights where n is from 2-8 bits.
activ_quantization (int, optional): sets all activations in the model to have a particular activation quantization bitwidth.
Defaults to 0.
...
input_scaling (tuple, optional): scale factor and offset to apply to inputs. Defaults to (128, -1). Note that following Akida convention, the scale factor is an integer used as a divider.
...
© Copyright 2022, BrainChip Holdings Ltd. All Rights Reserved.

If I recall correctly, it is only the weights that are 8-bit, and only for the purpose of compatibility with 3rd party model libraries.

If there are 8-bit weights and 4-bit activations, an 8*4 matrix would be used.
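On the 8*4 point, here's a sketch of what an 8-bit-weight by 4-bit-activation multiply-accumulate looks like in plain integer arithmetic (my own toy example, not Akida's actual datapath). Each product needs at most 12 bits, so the accumulator has plenty of headroom:

```python
import numpy as np

# Hypothetical ranges: signed 8-bit weights, unsigned 4-bit activations.
W_MIN, W_MAX = -128, 127
A_MIN, A_MAX = 0, 15

def mac_8x4(weights, activations):
    """Multiply-accumulate 8-bit weights against 4-bit activations.
    Each product fits in 12 bits, so int32 accumulation is ample."""
    w = np.clip(np.asarray(weights), W_MIN, W_MAX).astype(np.int32)
    a = np.clip(np.asarray(activations), A_MIN, A_MAX).astype(np.int32)
    return int(w @ a)

# Worst case for a 16-input neuron: -128 * 15 * 16 = -30720, well inside int32.
print(mac_8x4([-128] * 16, [15] * 16))  # -> -30720
```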
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 72 users