BRN Discussion Ongoing

Esq.111

Fascinatingly Intuitive.
Good Morning Chippers,

This time last year we had great volumes & a price increase.

See how we go over the next few days....

I'm feeling lucky.

Regards,
Esq.
 

Attachments

  • 20230113_092705.jpg (4.4 MB)
  • Like
  • Love
  • Fire
Reactions: 39 users

Build-it

Regular

The Australian Rail Track Corporation (ARTC) said it was working with freight customers on a recovery plan.

The Office of the National Rail Safety Regulator has been notified and the Australian Transport Safety Bureau has taken control of the site.

The ARTC said it would provide further details and an update on when the interstate freight corridor would re-open once a full assessment of the area was completed.

The Australian Transport Safety Bureau's (ATSB) chief commissioner Angus Mitchell said a team of safety investigators had been sent to the site.

"They will also obtain and review any recorded data, weather information, witness reports, and relevant train and track operator records."


The regulators need to know the benefits of implementing Akida. I hope BRN can reach out to them and provide the solution they need.




Edge Compute.


Well, here we go again: another train incident dampening productivity, and more reports on how this occurred.

I'm positive that with the implementation of Akida this would not have occurred.

To quote PVDM,

The design is optimized for high-performance Machine Learning applications, resulting in efficient, low power consumption while performing thousands of operations simultaneously on each phase of the 300 MHz clock cycle. A unique feature of the Akida neural processor is the ability to learn in real time, allowing products to be conveniently configured in the field without cloud access.

And what about Socionext, as shares for brekky posted.

BrainChip’s flexible AI processing fabric IP delivers neuromorphic, event-based computation, enabling ultimate performance while minimizing silicon footprint and power consumption. Sensor data can be analyzed in real-time with distributed, high-performance and low-power edge inferencing, resulting in improved response time and reduced energy consumption.

Screenshot_20230113-094723_Samsung Internet.jpg


Edge Compute.
 
  • Like
  • Fire
Reactions: 19 users

Easytiger

Regular
Interesting how many use-case hits "brainchip" gets using Google Scholar Citations. 🧑‍🎓

I would assume that the number of true scholarly citations is going to escalate, with all the partnerships, Edge Impulse, and the Brainchip Spiking University kicking up fast.

One can search for Patents, Citations, or Both. Another tool for Akida implementation dot collecting! 🧩

https://scholar.google.com/scholar?as_ylo=2022&q="brainchip"&hl=en&as_sdt=0,10

View attachment 26911
Would be good to see a university research article page on the BRN website
 
  • Like
  • Love
  • Fire
Reactions: 12 users

Sam

Nothing changes if nothing changes
Just bought another 5k worth of shares because I like seeing zero dollars in my bank account 😂
Let’s go brain!!! I think the bus has morphed into a locomotive and once it starts it won’t stop, so get on whilst you can #2023 year of the Brainchip.
 
  • Like
  • Love
  • Fire
Reactions: 35 users

HopalongPetrovski

I'm Spartacus!
Would be good to see a university research article page on the BRN website
Yes, great idea.
Hey Toniiiiiiiii!
How about it? 🤣
 
  • Like
  • Fire
Reactions: 9 users

Easytiger

Regular
Well, here we go again: another train incident dampening productivity, and more reports on how this occurred.

I'm positive that with the implementation of Akida this would not have occurred.

To quote PVDM,

The design is optimized for high-performance Machine Learning applications, resulting in efficient, low power consumption while performing thousands of operations simultaneously on each phase of the 300 MHz clock cycle. A unique feature of the Akida neural processor is the ability to learn in real time, allowing products to be conveniently configured in the field without cloud access.

And what about Socionext, as shares for brekky posted.

BrainChip’s flexible AI processing fabric IP delivers neuromorphic, event-based computation, enabling ultimate performance while minimizing silicon footprint and power consumption. Sensor data can be analyzed in real-time with distributed, high-performance and low-power edge inferencing, resulting in improved response time and reduced energy consumption.

View attachment 26929

Edge Compute.

Vic Govt bridge maintenance program being developed and replaced; the answer is an edge solution
 

Easytiger

Regular
* As reported, the Vic Govt bridge maintenance program being developed has failed, and the answer here is an effective and economical edge solution.
 
  • Like
  • Fire
  • Sad
Reactions: 9 users

wilzy123

Founding Member
Just bought another 5k worth of shares because I like seeing zero dollars in my bank account 😂
Let’s go brain!!! I think the bus has morphed into a locomotive and once it starts it won’t stop, so get on whilst you can #2023 year of the Brainchip.
yeahhhp.gif
 
  • Haha
  • Like
Reactions: 9 users

Damo4

Regular
  • Like
  • Thinking
Reactions: 3 users

Sam

Nothing changes if nothing changes
  • Like
Reactions: 2 users
  • Like
  • Fire
Reactions: 26 users

Diogenese

Top 20
I'm guessing I'm not the only one wondering.

How Transformers Work
https://towardsdatascience.com/transformers-141e32e69591

___
"In this way, LSTMs can selectively remember or forget things that are important and not so important."
harrr again ❤️‍🔥
Hi cosors,

Thanks for posting this.

The Attention function requires a lot of multiplication:

"The first step in calculating self-attention is to create three vectors from each of the encoder’s input vectors (in this case, the embedding of each word). So for each word, we create a Query vector, a Key vector, and a Value vector. These vectors are created by multiplying the embedding by three matrices that we trained during the training process."
...
"The score is calculated by taking the dot product of the query vector with the key vector of the respective word we’re scoring."
...
"pass the result through a softmax operation. Softmax normalizes the scores so they’re all positive and add up to 1.
...
This softmax score determines how much each word will be expressed at this position. Clearly the word at this position will have the highest softmax score, but sometimes it’s useful to attend to another word that is relevant to the current word.
The fifth step is to multiply each value vector by the softmax score (in preparation to sum them up). The intuition here is to keep intact the values of the word(s) we want to focus on, and drown-out irrelevant words (by multiplying them by tiny numbers like 0.001, for example).
The sixth step is to sum up the weighted value vectors. This produces the output of the self-attention layer at this position (for the first word)."

Given that Akida avoids MAC operations in its 4-bit mode, I wonder if Akida can be configured to avoid one or more of these multiplication steps.

As far as I can make out, self-attention involves selecting important words (subject (noun), verb, and object (noun/adverb/adjective)) ... and sentences can be much more complex.

This describes the treatment of a short sentence. To obtain the context, it may be necessary to process a larger chunk of text, such as a whole paragraph. The mind boggles!
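
To make those steps concrete, here is a minimal numpy sketch of single-head scaled dot-product self-attention as the quoted article describes it. This is only an illustration of the standard algorithm, not BrainChip's implementation; the matrix names and sizes are made up, and the 1/√d_k scaling comes from the linked article rather than the excerpt above. The comments mark where the multiply-accumulate work lands.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    """Single-head scaled dot-product self-attention over a sequence X.

    X: (seq_len, d_model) word embeddings.
    W_q, W_k, W_v: (d_model, d_k) trained projection matrices.
    """
    # Step 1: project each embedding into Query, Key and Value vectors
    # (three dense matrix multiplications -- MAC-heavy).
    Q, K, V = X @ W_q, X @ W_k, X @ W_v

    # Steps 2-3: score every position against every other position via
    # dot products, scaled by sqrt(d_k) (more multiplications).
    scores = Q @ K.T / np.sqrt(K.shape[-1])

    # Step 4: softmax so each row of weights is positive and sums to 1.
    weights = softmax(scores, axis=-1)

    # Steps 5-6: weight the Value vectors and sum them up -- the final
    # multiply-accumulate stage the quoted article describes.
    return weights @ V

# Toy usage: a "sentence" of 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 8))
W_q, W_k, W_v = (rng.standard_normal((8, 8)) for _ in range(3))
out = self_attention(X, W_q, W_k, W_v)
print(out.shape)  # (4, 8) -- one output vector per input position
```

Every line marked MAC-heavy is a dense matrix multiply, which is presumably the kind of work a 4-bit, event-based approach would be trying to cut down.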
 
  • Like
  • Fire
  • Love
Reactions: 28 users

Earlyrelease

Regular
Hi cosors,

Thanks for posting this.

The Attention function requires a lot of multiplication:

"The first step in calculating self-attention is to create three vectors from each of the encoder’s input vectors (in this case, the embedding of each word). So for each word, we create a Query vector, a Key vector, and a Value vector. These vectors are created by multiplying the embedding by three matrices that we trained during the training process."
...
"The score is calculated by taking the dot product of the query vector with the key vector of the respective word we’re scoring."
...
"pass the result through a softmax operation. Softmax normalizes the scores so they’re all positive and add up to 1.
...
This softmax score determines how much each word will be expressed at this position. Clearly the word at this position will have the highest softmax score, but sometimes it’s useful to attend to another word that is relevant to the current word.
The fifth step is to multiply each value vector by the softmax score (in preparation to sum them up). The intuition here is to keep intact the values of the word(s) we want to focus on, and drown-out irrelevant words (by multiplying them by tiny numbers like 0.001, for example).
The sixth step is to sum up the weighted value vectors. This produces the output of the self-attention layer at this position (for the first word)."

Given that Akida avoids MAC operations in its 4-bit mode, I wonder if Akida can be configured to avoid one or more of these multiplication steps.

As far as I can make out, self-attention involves selecting important words (subject (noun), verb, and object (noun/adverb/adjective)) ... and sentences can be much more complex.

This describes the treatment of a short sentence. To obtain the context, it may be necessary to process a larger chunk of text, such as a whole paragraph. The mind boggles!
If Dodgy Knee's mind boggles, I am stuffed.
 
  • Haha
  • Like
  • Love
Reactions: 25 users

toasty

Regular
As long termers here will know, my posts are usually pretty upbeat. I have been heavily into this stock since 2016 so am obviously a patient person. However, I admit to the continued manipulation of this stock slowly wearing me down. I understand that our relationships are protected by NDAs and all that, but to see the share price where it is with so much good news flowing over the last few months is VERY frustrating.

I am a big believer in the technology and the team bringing it to the market but damn it if the manipulators aren't having a field day at our (ordinary retail holders) expense. I guess some of it boils down to the fact that the ASX is a technology backwater. If BRN was listed in the US I am confident that the SP would be some multiples of what it currently is.

And I'll tell you something I don't understand. If the new version of AKIDA is currently in pre-production why hasn't there been an announcement about successful initial testing? It is inconceivable that we would be proceeding to tape-out if Peter and Anil were not convinced that it is going to work!!!!

Rant over.....hoping for some commercial progress to be revealed in the upcoming 4C..........
 
  • Like
  • Love
  • Fire
Reactions: 42 users
The sound of the pitchforks sharpening in the distance.......
 
  • Haha
  • Like
Reactions: 18 users

alwaysgreen

Top 20
  • Like
  • Haha
  • Love
Reactions: 9 users

stuart888

Regular
Interesting how many results are returned for "transformer neural network 4-bits".

Likely, the Brainchip geniuses have a best-in-class low-power solution for this, since it is apparently not far from being officially revealed.


1673576347348.png
 
  • Like
  • Love
  • Fire
Reactions: 20 users
As long termers here will know, my posts are usually pretty upbeat. I have been heavily into this stock since 2016 so am obviously a patient person. However, I admit to the continued manipulation of this stock slowly wearing me down. I understand that our relationships are protected by NDAs and all that, but to see the share price where it is with so much good news flowing over the last few months is VERY frustrating.

I am a big believer in the technology and the team bringing it to the market but damn it if the manipulators aren't having a field day at our (ordinary retail holders) expense. I guess some of it boils down to the fact that the ASX is a technology backwater. If BRN was listed in the US I am confident that the SP would be some multiples of what it currently is.

And I'll tell you something I don't understand. If the new version of AKIDA is currently in pre-production why hasn't there been an announcement about successful initial testing? It is inconceivable that we would be proceeding to tape-out if Peter and Anil were not convinced that it is going to work!!!!

Rant over.....hoping for some commercial progress to be revealed in the upcoming 4C..........
Hard work hey @toasty, but at some stage there will be a tipping point. When? How long is a piece of string. I have been in since 2015 and, although like everyone else I would like the SP higher, the fundamentals are better than they have ever been, and a damn sight better than when we first came on board. Slow burner this one, but damn, when it runs, it will really run. Just hope the shorters are the ones that are trampled in the stampede.

SC
 
  • Like
  • Love
  • Fire
Reactions: 43 users

stuart888

Regular
Interesting how many results are returned for "transformer neural network 4-bits".

Likely, the Brainchip geniuses have a best-in-class low-power solution for this, since it is apparently not far from being officially revealed.


View attachment 26953
Yeah, yeah: "4 Bits Are Enough" was the 6th item listed when searching Google for "transformer neural network 4-bits".

1673576949609.png
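
To make the "4 bits are enough" idea concrete, here is a small numpy sketch of symmetric 4-bit weight quantization, the generic kind of scheme those search results are about. It is only an illustration under simple assumptions (one per-tensor scale, random weights), not Akida's scheme or any particular paper's method.

```python
import numpy as np

def quantize_4bit(W):
    """Symmetric 4-bit quantization of a weight matrix.

    Maps floats onto the 16 integer levels [-8, 7] with a single
    per-tensor scale, returning the integers plus that scale.
    """
    scale = np.abs(W).max() / 7.0                       # largest weight maps to +/-7
    q = np.clip(np.round(W / scale), -8, 7).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    # Recover approximate float weights from the 4-bit integers.
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
W = rng.standard_normal((64, 64)).astype(np.float32)   # toy weight matrix
x = rng.standard_normal(64).astype(np.float32)         # toy activation vector

q, scale = quantize_4bit(W)
y_full = W @ x                       # full-precision result
y_4bit = dequantize(q, scale) @ x    # result using 4-bit weights

rel_err = np.linalg.norm(y_full - y_4bit) / np.linalg.norm(y_full)
print(f"relative error with 4-bit weights: {rel_err:.3%}")
```

Even this crude per-tensor version stays close to the full-precision output on a toy example like this, which is the intuition behind the "4 bits are enough" line of results.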
 
  • Like
  • Fire
  • Love
Reactions: 14 users

Cyw

Regular
As long termers here will know, my posts are usually pretty upbeat. I have been heavily into this stock since 2016 so am obviously a patient person. However, I admit to the continued manipulation of this stock slowly wearing me down. I understand that our relationships are protected by NDAs and all that, but to see the share price where it is with so much good news flowing over the last few months is VERY frustrating.

I am a big believer in the technology and the team bringing it to the market but damn it if the manipulators aren't having a field day at our (ordinary retail holders) expense. I guess some of it boils down to the fact that the ASX is a technology backwater. If BRN was listed in the US I am confident that the SP would be some multiples of what it currently is.

And I'll tell you something I don't understand. If the new version of AKIDA is currently in pre-production why hasn't there been an announcement about successful initial testing? It is inconceivable that we would be proceeding to tape-out if Peter and Anil were not convinced that it is going to work!!!!

Rant over.....hoping for some commercial progress to be revealed in the upcoming 4C..........
I don't think the next 4C will be a blockbuster, as they are raising cash now. Why would you raise cash now if you knew you were overflowing with cash and the explosive 4C would take the SP to over $1? Perhaps the third-quarter 4C.

Wake me up when we get there.
 
  • Like
Reactions: 13 users