BRN Discussion Ongoing

Hello Partner 😉


 
  • Like
  • Fire
Reactions: 26 users

Diogenese

Top 20
I'm guessing I'm not the only one wondering.

How Transformers Work
https://towardsdatascience.com/transformers-141e32e69591

___
"In this way, LSTMs can selectively remember or forget things that are important and not so important."
harrr again ❤️‍🔥
Hi cosors,

Thanks for posting this.

The Attention function requires a lot of multiplication:

"The first step in calculating self-attention is to create three vectors from each of the encoder’s input vectors (in this case, the embedding of each word). So for each word, we create a Query vector, a Key vector, and a Value vector. These vectors are created by multiplying the embedding by three matrices that we trained during the training process."
...
"The score is calculated by taking the dot product of the query vector with the key vector of the respective word we’re scoring."
...
"pass the result through a softmax operation. Softmax normalizes the scores so they’re all positive and add up to 1.
...
This softmax score determines how much how much each word will be expressed at this position. Clearly the word at this position will have the highest softmax score, but sometimes it’s useful to attend to another word that is relevant to the current word.
The fifth step is to multiply each value vector by the softmax score (in preparation to sum them up). The intuition here is to keep intact the values of the word(s) we want to focus on, and drown-out irrelevant words (by multiplying them by tiny numbers like 0.001, for example).
The sixth step is to sum up the weighted value vectors. This produces the output of the self-attention layer at this position (for the first word)
."

Given that Akida avoids MAC operations in its 4-bit mode, I wonder if Akida can be configured to avoid one or more of these multiplication steps.

As far as I can make out, self-attention involves selecting important words (subject (noun), verb, and object (noun/adverb/adjective)) ... and sentences can be much more complex.

This describes the treatment of a short sentence. To obtain the context, it may be necessary to process a larger chunk of text, such as a whole paragraph. The mind boggles!
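
For anyone who wants to see those six steps end to end, here is a minimal sketch in NumPy of single-head self-attention. The weight matrices Wq, Wk, Wv and the toy dimensions are mine, purely for illustration; real transformers add masking and multiple heads on top of this:

import numpy as np

def softmax(x):
    # Step 4: normalise the scores so they are all positive and sum to 1.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Step 1: multiply each embedding by three trained matrices to get
    # the Query, Key and Value vectors.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    # Steps 2-3: score every word against every other word with dot
    # products (scaled by sqrt(d_k), as in the original paper).
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Step 4: softmax turns the scores into attention weights.
    weights = softmax(scores)
    # Steps 5-6: multiply each value vector by its weight and sum them,
    # so irrelevant words are drowned out by tiny weights.
    return weights @ V

rng = np.random.default_rng(0)
X = rng.standard_normal((3, 8))             # 3 "words", 8-dim embeddings
Wq, Wk, Wv = (rng.standard_normal((8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (3, 8)

Every @ in that sketch is a block of multiply-accumulates, which is exactly where the question of dodging MACs bites.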
 
  • Like
  • Fire
  • Love
Reactions: 28 users

Earlyrelease

Regular
Hi cosors,

Thanks for posting this.

The Attention function requires a lot of multiplication:

"The first step in calculating self-attention is to create three vectors from each of the encoder’s input vectors (in this case, the embedding of each word). So for each word, we create a Query vector, a Key vector, and a Value vector. These vectors are created by multiplying the embedding by three matrices that we trained during the training process."
...
"The score is calculated by taking the dot product of the query vector with the key vector of the respective word we’re scoring."
...
"pass the result through a softmax operation. Softmax normalizes the scores so they’re all positive and add up to 1.
...
This softmax score determines how much how much each word will be expressed at this position. Clearly the word at this position will have the highest softmax score, but sometimes it’s useful to attend to another word that is relevant to the current word.
The fifth step is to multiply each value vector by the softmax score (in preparation to sum them up). The intuition here is to keep intact the values of the word(s) we want to focus on, and drown-out irrelevant words (by multiplying them by tiny numbers like 0.001, for example).
The sixth step is to sum up the weighted value vectors. This produces the output of the self-attention layer at this position (for the first word)
."

Given that Akida avoids MAC operations in its 4-bit mode, I wonder if Akida can be configured to avoid one or more of these multiplication steps.

As far as I can make out, self-attention involves selecting important words (subject (noun), verb, and object (noun/adverb/adjective)) ... and sentences can be much more complex.

This describes the treatment of a short sentence. To obtain the context, it may be necessary to process a larger chunk of text, such as a whole paragraph. The mind boggles!
If Dodgy Knee's mind boggles, I am stuffed.
 
  • Haha
  • Like
  • Love
Reactions: 25 users

toasty

Regular
As long-termers here will know, my posts are usually pretty upbeat. I have been heavily into this stock since 2016, so am obviously a patient person. However, I admit the continued manipulation of this stock is slowly wearing me down. I understand that our relationships are protected by NDAs and all that, but to see the share price where it is with so much good news flowing over the last few months is VERY frustrating.

I am a big believer in the technology and the team bringing it to the market but damn it if the manipulators aren't having a field day at our (ordinary retail holders) expense. I guess some of it boils down to the fact that the ASX is a technology backwater. If BRN was listed in the US I am confident that the SP would be some multiples of what it currently is.

And I'll tell you something I don't understand. If the new version of AKIDA is currently in pre-production why hasn't there been an announcement about successful initial testing? It is inconceivable that we would be proceeding to tape-out if Peter and Anil were not convinced that it is going to work!!!!

Rant over.....hoping for some commercial progress to be revealed in the upcoming 4C..........
 
  • Like
  • Love
  • Fire
Reactions: 42 users
The sound of the pitchforks sharpening in the distance.......
 
  • Haha
  • Like
Reactions: 18 users

alwaysgreen

Top 20
  • Like
  • Haha
  • Love
Reactions: 9 users

stuart888

Regular
Interesting how many results a search for "transformer neural network 4-bits" returns.

Likely, the Brainchip geniuses have a best-in-class low-power solution for this, since it is apparently not far from being officially revealed.


 
  • Like
  • Love
  • Fire
Reactions: 20 users
As long-termers here will know, my posts are usually pretty upbeat. I have been heavily into this stock since 2016, so am obviously a patient person. However, I admit the continued manipulation of this stock is slowly wearing me down. I understand that our relationships are protected by NDAs and all that, but to see the share price where it is with so much good news flowing over the last few months is VERY frustrating.

I am a big believer in the technology and the team bringing it to the market but damn it if the manipulators aren't having a field day at our (ordinary retail holders) expense. I guess some of it boils down to the fact that the ASX is a technology backwater. If BRN was listed in the US I am confident that the SP would be some multiples of what it currently is.

And I'll tell you something I don't understand. If the new version of AKIDA is currently in pre-production why hasn't there been an announcement about successful initial testing? It is inconceivable that we would be proceeding to tape-out if Peter and Anil were not convinced that it is going to work!!!!

Rant over.....hoping for some commercial progress to be revealed in the upcoming 4C..........
Hard work, hey @toasty, but at some stage there will be a tipping point. When? How long is a piece of string? I have been in since 2015 and, although like everyone else I would like the SP higher, the fundamentals are better than they have ever been, and a damn sight better than when we first came on board. Slow burner, this one, but damn, when it runs, it will really run. Just hope the shorters are the ones trampled in the stampede.

SC
 
  • Like
  • Love
  • Fire
Reactions: 43 users

stuart888

Regular
Interesting how many results a search for "transformer neural network 4-bits" returns.

Likely, the Brainchip geniuses have a best-in-class low-power solution for this, since it is apparently not far from being officially revealed.


Yeah, yeah: "4 Bits Are Enough" was the sixth item listed when searching Google for "transformer neural network 4-bits".
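
Out of curiosity, here is a toy sketch of what 4-bit weight quantisation looks like. The symmetric, per-tensor scheme below is a generic textbook one I picked for illustration, not BrainChip's actual method:

import numpy as np

def quantize_4bit(w):
    # Symmetric 4-bit quantisation: map floats onto the 16 integer
    # levels -8..7, storing only small integers plus one scale factor.
    scale = np.abs(w).max() / 7.0
    q = np.clip(np.round(w / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights to measure the rounding error.
    return q.astype(np.float32) * scale

w = np.random.default_rng(1).standard_normal(1000).astype(np.float32)
q, scale = quantize_4bit(w)
err = np.abs(w - dequantize(q, scale)).mean()
print(f"mean absolute error: {err:.4f}")

The point the "4 Bits Are Enough" result makes is that networks tolerate this kind of rounding surprisingly well, which is what makes low-power edge inference plausible.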

 
  • Like
  • Fire
  • Love
Reactions: 14 users

Cyw

Regular
As long-termers here will know, my posts are usually pretty upbeat. I have been heavily into this stock since 2016, so am obviously a patient person. However, I admit the continued manipulation of this stock is slowly wearing me down. I understand that our relationships are protected by NDAs and all that, but to see the share price where it is with so much good news flowing over the last few months is VERY frustrating.

I am a big believer in the technology and the team bringing it to the market but damn it if the manipulators aren't having a field day at our (ordinary retail holders) expense. I guess some of it boils down to the fact that the ASX is a technology backwater. If BRN was listed in the US I am confident that the SP would be some multiples of what it currently is.

And I'll tell you something I don't understand. If the new version of AKIDA is currently in pre-production why hasn't there been an announcement about successful initial testing? It is inconceivable that we would be proceeding to tape-out if Peter and Anil were not convinced that it is going to work!!!!

Rant over.....hoping for some commercial progress to be revealed in the upcoming 4C..........
I don't think the next 4C will be a blockbuster, as they are raising cash now. Why would you raise cash now if you knew you were overflowing with cash and the explosive 4C would take the SP to over $1? Perhaps the third-quarter 4C.

Wake me up when we get there.
 
  • Like
Reactions: 13 users

stuart888

Regular
  • Haha
  • Thinking
  • Like
Reactions: 11 users

Diogenese

Top 20
Socionext have posted an article on their website about what they got up to at CES 2023 (Brainchip has been included as well):


Hi SFB,

Great find:

Top right "NNA" = neural network accelerator - Akida baked into their processor!

... and immediately under NNA, "User defined Logic" allows customers to customize the processor.



Advanced AI Solutions for Automotive

Socionext has partnered with artificial intelligence provider BrainChip to develop optimized, intelligent sensor data solutions based on Brainchip’s Akida® processor IP.

BrainChip’s flexible AI processing fabric IP delivers neuromorphic, event-based computation, enabling ultimate performance while minimizing silicon footprint and power consumption. Sensor data can be analyzed in real-time with distributed, high-performance and low-power edge inferencing, resulting in improved response time and reduced energy consumption.

How good is that?!
 
  • Like
  • Love
  • Fire
Reactions: 67 users
As long-termers here will know, my posts are usually pretty upbeat. I have been heavily into this stock since 2016, so am obviously a patient person. However, I admit the continued manipulation of this stock is slowly wearing me down. I understand that our relationships are protected by NDAs and all that, but to see the share price where it is with so much good news flowing over the last few months is VERY frustrating.

I am a big believer in the technology and the team bringing it to the market but damn it if the manipulators aren't having a field day at our (ordinary retail holders) expense. I guess some of it boils down to the fact that the ASX is a technology backwater. If BRN was listed in the US I am confident that the SP would be some multiples of what it currently is.

And I'll tell you something I don't understand. If the new version of AKIDA is currently in pre-production why hasn't there been an announcement about successful initial testing? It is inconceivable that we would be proceeding to tape-out if Peter and Anil were not convinced that it is going to work!!!!

Rant over.....hoping for some commercial progress to be revealed in the upcoming 4C..........
Hey mate,

Appreciate the honesty. I’m in a similar time loop to you and can relate to the sentiment about the share price. I’m not sure a NASDAQ listing would make a material difference, though. An example is IONQ: they have the world’s most advanced known quantum computer for commercial use and have consistently grown their revenue, with a valuation of 900k USD. Their revenue might be on a scale we could achieve, but it is something to be mindful of.

On the shorters/day-traders note, one thing I’m 100% convinced about is that eventually there will be no escape, via NDA or other means, from disclosing the cash receipts in the 4C, and when that happens the shorters are going to get smoked.
 
  • Like
Reactions: 1 users

Diogenese

Top 20
  • Haha
  • Like
Reactions: 15 users
Any chance one of you can take a screenshot of the market depth, expanded to show individual orders, please?
 

Sam

Nothing changes if nothing changes

Attachments

(Four market-depth screenshots, PNG.)
  • Like
  • Love
Reactions: 7 users

Sam

Nothing changes if nothing changes
Lot of support at 65c
 
  • Love
  • Like
Reactions: 2 users

White Horse

Regular
Any chance one of you can take a screenshot of the market depth, expanded to show individual orders, please?

Can't post them all, too many characters.

Orders table

Buyers Market | Buyers Volume | Buyers Price $ | Sellers Price $ | Sellers Volume | Sellers Market
ASX | 744 | 0.660 | 0.665 | 25,469 | ASX
ASX | 11,000 | 0.660 | 0.665 | 5,353 | ASX
ASX | 7,575 | 0.660 | 0.665 | 4,680 | ASX
ASX | 144 | 0.660 | 0.665 | 687 | ASX
ASX | 15,000 | 0.660 | 0.665 | 1,892 | ASX
ASX | 302 | 0.660 | 0.665 | 25,000 | ASX
ASX | 20,000 | 0.660 | 0.665 | 1,021 | ASX
ASX | 7,424 | 0.660 | 0.665 | 4,097 | ASX
ASX | 1,515 | 0.660 | 0.665 | 327 | ASX
ASX | 5,017 | 0.660 | 0.665 | 949 | ASX
ASX | 3,984 | 0.660 | 0.665 | 511 | ASX
ASX | 10,000 | 0.660 | 0.665 | 266 | ASX
ASX | 4,217 | 0.660 | 0.665 | 104 | ASX
ASX | 11,000 | 0.660 | 0.665 | 658 | ASX
ASX | 15,151 | 0.660 | 0.665 | 10,000 | ASX
ASX | 50,000 | 0.660 | 0.665 | 515 | ASX
ASX | 15,121 | 0.660 | 0.665 | 2,125 | ASX
ASX | 1,515 | 0.660 | 0.665 | 2,046 | ASX
ASX | 4,600 | 0.660 | 0.665 | 22,222 | ASX
ASX | 10,000 | 0.660 | 0.665 | 4,326 | ASX
ASX | 10,000 | 0.660 | 0.665 | 1,130 | ASX
ASX | 9,930 | 0.660 | 0.665 | 25,103 | ASX
ASX | 3,000 | 0.660 | 0.665 | 1,834 | ASX
ASX | 15,151 | 0.660 | 0.665 | 2,070 | ASX
ASX | 500 | 0.660 | 0.665 | 20,000 | ASX
ASX | 760 | 0.660 | 0.665 | 1,455 | ASX
ASX | 15,000 | 0.660 | 0.665 | 4,218 | ASX
ASX | 454 | 0.660 | 0.665 | 480 | ASX
ASX | 23,662 | 0.660 | 0.665 | 2,433 | ASX
ASX | 15,151 | 0.660 | 0.665 | 1,380 | ASX
ASX | 30,000 | 0.660 | 0.665 | 61 | ASX
ASX | 20,000 | 0.660 | 0.665 | 1,048 | ASX
ASX | 10,000 | 0.660 | 0.665 | 16,025 | CXA
ASX | 31 | 0.660 | 0.665 | 450 | CXA
ASX | 10 | 0.660 | 0.665 | 8,000 | CXA
ASX | 15,090 | 0.660 | 0.665 | 3,905 | CXA
ASX | 71 | 0.660 | 0.665 | 3,517 | CXA
ASX | 4 | 0.660 | 0.665 | 1,865 | CXA
ASX | 1,200 | 0.660 | 0.665 | 273 | CXA
ASX | 10,000 | 0.660 | 0.665 | 753 | CXA
ASX | 3 | 0.660 | 0.665 | 129 | CXA
ASX | 3,000 | 0.660 | 0.665 | 378 | CXA
ASX | 10,000 | 0.660 | 0.665 | 203 | CXA
ASX | 31 | 0.660 | 0.665 | 106 | CXA
ASX | 3,000 | 0.660 | 0.665 | 261 | CXA
ASX | 10,000 | 0.660 | 0.665 | 204 | CXA
ASX | 10,000 | 0.660 | 0.665 | 862 | CXA
ASX | 7,575 | 0.660 | 0.665 | 4,241 | CXA
ASX | 10,000 | 0.660 | 0.665 | 846 | CXA
ASX | 5,000 | 0.660 | 0.665 | 1,723 | CXA
ASX | 1,500 | 0.660 | 0.670 | 63,261 | ASX
ASX | 3,030 | 0.660 | 0.670 | 4,216 | ASX
ASX | 10,000 | 0.660 | 0.670 | 16,500 | ASX
ASX | 1,300 | 0.660 | 0.670 | 10,000 | ASX
ASX | 1,400 | 0.660 | 0.670 | 4,009 | ASX
ASX | 500 | 0.660 | 0.670 | 13,000 | ASX
ASX | 3,575 | 0.660 | 0.670 | 4,742 | ASX
ASX | 25,000 | 0.660 | 0.670 | 30,000 | ASX
ASX | 2,572 | 0.660 | 0.670 | 32,000 | ASX
ASX | 20,000 | 0.660 | 0.670 | 25,000 | ASX
ASX | 50,000 | 0.660 | 0.670 | 9,930 | ASX
ASX | 12,500 | 0.660 | 0.670 | 1,150 | ASX
ASX | 450 | 0.660 | 0.670 | 13,000 | ASX
ASX | 25 | 0.660 | 0.670 | 4,605 | ASX
ASX | 601 | 0.660 | 0.670 | 4,144 | ASX
CXA | 712 | 0.660 | 0.670 | 4,000 | ASX
CXA | 91 | 0.660 | 0.670 | 12,781 | ASX
CXA | 32 | 0.660 | 0.670 | 1,499 | ASX
CXA | 805 | 0.660 | 0.670 | 599 | ASX
CXA | 3,431 | 0.660 | 0.670 | 767 | ASX
CXA | 3,953 | 0.660 | 0.670 | 26,888 | ASX
CXA | 12,000 | 0.660 | 0.670 | 30,000 | ASX
CXA | 20,000 | 0.660 | 0.670 | 1,400 | ASX
CXA | 4,042 | 0.660 | 0.670 | 30,000 | CXA
CXA | 592 | 0.660 | 0.670 | 4,042 | CXA
CXA | 60 | 0.660 | 0.670 | 2,000 | CXA
CXA | 88 | 0.660 | 0.670 | 20,000 | CXA
CXA | 7,382 | 0.660 | 0.670 | 3,155 | CXA
CXA | 20,000 | 0.660 | 0.670 | 4,009 | CXA
CXA | 6,913 | 0.660 | 0.670 | 70,000 | CXA
CXA | 4,648 | 0.660 | 0.670 | 25,000 | CXA
CXA | 20,000 | 0.660 | 0.670 | 3,435 | CXA
CXA | 368 | 0.660 | 0.670 | 40,000 | CXA
CXA | 1,024 | 0.660 | 0.670 | 3,959 | CXA
CXA | 734 | 0.660 | 0.670 | 3,941 | CXA
CXA | 4,500 | 0.660 | 0.670 | 3,000 | CXA
CXA | 15,000 | 0.660 | 0.670 | 238 | CXA
CXA | 239 | 0.660 | 0.670 | 305 | CXA
ASX | 910 | 0.655 | 0.670 | 10,000 | CXA
ASX | 17,800 | 0.655 | 0.670 | 50,000 | CXA
ASX | 2,360 | 0.655 | 0.670 | 6,571 | CXA
ASX | 20,000 | 0.655 | 0.675 | 25,000 | ASX
ASX | 5,000 | 0.655 | 0.675 | 4,443 | ASX
ASX | 13,000 | 0.655 | 0.675 | 2,565 | ASX
ASX | 6,102 | 0.655 | 0.675 | 16,500 | ASX
ASX | 10,000 | 0.655 | 0.675 | 3,920 | ASX
ASX | 10,000 | 0.655 | 0.675 | 17,910 | ASX
 
  • Like
  • Fire
  • Love
Reactions: 8 users
Cheers lads, much appreciated 👏
 
  • Like
Reactions: 3 users