BRN Discussion Ongoing

Labsy

Regular
Interesting experience in a Miele store today. Went with the Mrs to have a look at some appliances for a new kitchen we plan on building... the sales assistant asked when we plan on renovating our house. I mentioned we'd probably begin late next year. She said "by then all appliances as seen in the showroom will change. They will look the same but be different". I said "let me guess... they will be intuitive, yes? Perhaps some AI". She said "yeeeesss, how did u know?" 😳 haha...
Just goes to show how we are definitely in the right place at the right time... I hope there are some Renesas chips in there...
 
  • Like
  • Fire
  • Wow
Reactions: 42 users

Taco77

Member
 

  • Like
  • Haha
Reactions: 2 users
From my informed lay perspective it is using watts of power, so on this basis it does not inhabit the same territory. It does not have incremental and one-shot learning.

What is unclear to me is whether it might have the potential to reduce the training parameters of the CNN used by AKIDA, but I am also not sure whether that would be an actual advantage.

Though it sends less data, it still requires a connection, which means that unlike AKIDA it could drop out in my carport like my iPhone and drive my car through the back wall.

It is being implemented in 40nm and they do not state whether it is able to scale down, but even if it can and chased AKIDA down to 4nm, it would not catch up.

Subject to Diogenese's view.
My opinion only DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 16 users

The only problem is that the German Government would most likely reject such a takeover.

I am also not sure why Mercedes would be interested: having shown how advanced its EQXX EV technology is, what would they actually get out of the deal?

My opinion only DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
Reactions: 9 users

Learning

Learning to the Top 🕵‍♂️
Hi all,

This is my opinion on the unusual volume at the closing auction today; nearly 20 million shares were traded after the market closed.

We all know today is the quarterly rebalance of the S&P/ASX indices.
Unusual volume happens when a rebalance occurs, as stocks enter or exit the ASX 200/300.

However, BRN didn't enter the ASX 200 or leave the ASX 300, so why the unusual volume?

My opinion is that some index funds had started to take a holding position in BRN in anticipation of it entering the ASX 200. In the end BRN didn't enter the ASX 200 this quarter, so those funds had to unload their BRN holdings today.

As to who the buyers were after the closing auction: could the shorts have taken this opportunity to close their short positions in anticipation of good news coming from BRN? (One can hope.)

It's great to be a shareholder.
 
  • Like
  • Thinking
  • Fire
Reactions: 14 users

Diogenese

Top 20
This is way above my pay grade - I've been flying by the seat of my pants before this, so I think this is in the Kristofor Karlson, Simon Thorpe, PvdM bailiwick.

However, if I had to guess, I'd say that our on-chip, one-shot learning would make this irrelevant for Akida.



Weight generation and storage generally happen off-chip, creating a power and latency bottleneck related to the movement of this data to and from external memory. Instead, the Hiddenite architecture features on-chip weight generation for re-generating weights through a random number generator, effectively eliminating the need to access the external memory. Beyond this, Hiddenite offers "on-chip supermask expansion", a feature that reduces the number of supermasks that need to be loaded by the accelerator.

Fabricated on TSMC's 40nm technology, the chip measures 3 mm x 3 mm and is capable of performing 4,096 multiply-and-accumulate operations simultaneously. The researchers further claim it achieves a maximum of 34.8 TOPS per watt, all while reducing the amount of model transfer to half that of binarized networks.
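The core trick described above (from the "lottery ticket" line of research) is that trained weights are never stored or transferred: they are re-generated on demand from a seeded random number generator, and a binary "supermask" picks out the subnetwork that actually computes. A minimal sketch of that idea, with a seeded NumPy generator standing in for the on-chip RNG (all names here are illustrative, not from the Hiddenite paper):

```python
import numpy as np

def hidden_layer(x, seed, mask, shape):
    """Recompute random weights from a seed instead of loading them
    from external memory, then apply a binary supermask to select
    the participating subnetwork."""
    rng = np.random.default_rng(seed)   # stands in for the on-chip RNG
    w = rng.standard_normal(shape)      # weights are re-generated, never stored
    return x @ (w * mask)               # mask zeroes out unselected weights

# Only the seed and the (compressible) mask cross the memory boundary.
x = np.ones(4)
mask = np.array([[1, 0], [0, 1], [1, 1], [0, 0]])
y1 = hidden_layer(x, seed=42, mask=mask, shape=(4, 2))
y2 = hidden_layer(x, seed=42, mask=mask, shape=(4, 2))
```

Because the generator is deterministic for a given seed, `y1` and `y2` are identical: the "weights" exist only transiently inside the chip.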



Akida
US2020143229A1 SPIKING NEURAL NETWORK


[0123] In some embodiments, when a logical AND operation is performed on a spike bit in the spike packet that is ‘1’ and a synaptic weight that is zero, the result is a zero. This can be referred to as an ‘unused spike.’ When a logical AND operation is performed on a spike bit in the spike packet that is ‘0’ and a synaptic weight that is ‘1’, the result is zero. This can be referred to as an ‘unused synaptic weight’. The learning circuit (e.g., weight swapper 113 ) can swap random selected unused synaptic weights where unused spikes occur.
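The mechanism in [0123] can be shown in a few lines: AND each spike bit against the corresponding weight bit, flag the mismatches as unused spikes or unused weights, then move an unused weight to a position where an unused spike occurred so that input contributes next time. A toy sketch (variable names are mine, not the patent's):

```python
spikes  = [1, 0, 1, 0]   # incoming spike packet (bits)
weights = [0, 1, 1, 0]   # binary synaptic weights

# Logical AND contributes only where both spike and weight are 1
used = [s & w for s, w in zip(spikes, weights)]

unused_spikes  = [i for i, (s, w) in enumerate(zip(spikes, weights)) if s == 1 and w == 0]
unused_weights = [i for i, (s, w) in enumerate(zip(spikes, weights)) if s == 0 and w == 1]

# Learning step: swap an unused weight to where an unused spike occurred,
# so that spike position contributes on the next presentation.
if unused_spikes and unused_weights:
    src, dst = unused_weights[0], unused_spikes[0]
    weights[dst], weights[src] = 1, 0
```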

By the way, I just revisited Kristofor Karlson's TinyML talk from March last year:

Watch It’s an SNN future: Are you ready for it? Converting CNN’s to SNN’s - Talk Video by Kristofor Karlson | ConferenceCast.tv

BrainChip provides a number of proprietary models:

 
Last edited:
  • Like
  • Fire
Reactions: 16 users

Diogenese

Top 20

While I was browsing the above Akida patent, my memory was jogged when I re-read some advantages of the Akida digital SNN compared with conventional SNNs:

US2020143229A1 SPIKING NEURAL NETWORK

[0038] But conventional SNNs can suffer from several technological problems. First, conventional SNNs are unable to switch between convolution and fully connected operation. For example, a conventional SNN may be configured at design time to use a fully-connected feedforward architecture to learn features and classify data. Embodiments herein (e.g., the neuromorphic integrated circuit) solve this technological problem by combining the features of a CNN and a SNN into a spiking convolutional neural network (SCNN) that can be configured to switch between a convolution operation or a fully-connected neural network function. The SCNN may also reduce the number of synapse weights for each neuron. This can also allow the SCNN to be deeper (e.g., have more layers) than a conventional SNN with fewer synapse weights for each neuron. Embodiments herein further improve the convolution operation by using a winner-take-all (WTA) approach for each neuron acting as a filter at particular position of the input space. This can improve the selectivity and invariance of the network. In other words, this can improve the accuracy of an inference operation.

[0039] Second, conventional SNNs are not reconfigurable. Embodiments herein solve this technological problem by allowing the connections between neurons and synapses of a SNN to be reprogrammed based on a user defined configuration. For example, the connections between layers and neural processors can be reprogrammed using a user defined configuration file.

[0040] Third, conventional SNNs do not provide buffering between different layers of the SNN. But buffering can allow for a time delay for passing output spikes to a next layer. Embodiments herein solve this technological problem by adding input spike buffers and output spike buffers between layers of a SCNN.

[0041] Fourth, conventional SNNs do not support synapse weight sharing. Embodiments herein solve this technological problem by allowing kernels of a SCNN to share synapse weights when performing convolution. This can reduce memory requirements of the SCNN.

[0042] Fifth, conventional SNNs often use 1-bit synapse weights. But the use of 1-bit synapse weights does not provide a way to inhibit connections. Embodiments herein solve this technological problem by using ternary synapse weights. For example, embodiments herein can use two-bit synapse weights. These ternary synapse weights can have positive, zero, or negative values. The use of negative weights can provide a way to inhibit connections which can improve selectivity. In other words, this can improve the accuracy of an inference operation.
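The point of [0042] is that a ternary weight can be negative, inhibiting a connection, which a 1-bit weight cannot express. A toy dot product showing the effect (values are illustrative only):

```python
spikes = [1, 1, 0, 1]

binary_w  = [1, 1, 0, 1]    # 1-bit: excite or ignore, never inhibit
ternary_w = [1, -1, 0, 1]   # 2-bit ternary: -1 actively inhibits a connection

pot_binary  = sum(s * w for s, w in zip(spikes, binary_w))
pot_ternary = sum(s * w for s, w in zip(spikes, ternary_w))
# The -1 weight pulls the membrane potential down, improving selectivity.
```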

[0043] Sixth, conventional SNNs do not perform pooling. This results in increased memory requirements for conventional SNNs. Embodiments herein solve this technological problem by performing pooling on previous layer outputs. For example, embodiments herein can perform pooling on a potential array outputted by a previous layer. This pooling operation reduces the dimensionality of the potential array while retaining the most important information.
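Note that the pooling in [0043] operates on the potential array a layer outputs, not on spike trains. A 2x2 max-pool, for example, keeps the strongest potential in each window and quarters the array size. A minimal sketch (the array values are made up):

```python
import numpy as np

potentials = np.array([[1, 5, 2, 0],
                       [3, 4, 1, 1],
                       [0, 2, 9, 6],
                       [1, 1, 7, 8]])

# 2x2 max-pooling: carve the array into windows, keep each window's maximum
pooled = potentials.reshape(2, 2, 2, 2).max(axis=(1, 3))
```

The 4x4 array shrinks to 2x2 while the dominant potential in each region survives.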

[0044] Seventh, conventional SNN often store spikes in a bit array. Embodiments herein provide an improved way to represent and process spikes. For example, embodiments herein can use a connection list instead of bit array. This connection list is optimized such that each input layer neuron has a set of offset indexes that it must update. This enables embodiments herein to only have to consider a single connection list to update all the membrane potential values of connected neurons in the current layer.
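The connection list in [0044] can be pictured as a per-input-neuron list of offsets into the next layer's potential array, so one walk of the list updates every connected neuron. A toy version (the structure is my guess at the idea, not the actual implementation):

```python
# Each input neuron maps to the (target offset, weight) pairs it must update
connections = {
    0: [(1, +1), (3, +1)],
    1: [(0, -1)],
    2: [(2, +1), (3, -1)],
}

potentials = [0, 0, 0, 0]
for neuron in [0, 2]:               # neurons that spiked this step
    for target, w in connections[neuron]:
        potentials[target] += w     # one list walk updates all targets
```

Only neurons that actually spiked are visited, unlike a bit array that must be scanned in full.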

[0045] Eighth, conventional SNNs often process spike by spike. In contrast, embodiments herein can process packets of spikes. This can cause the potential array to be updated as soon as a spike is processed. This can allow for greater hardware parallelization.

[0046] Finally, conventional SNNs do not provide a way to import learning (e.g., synapse weights) from an external source. For example, SNNs do not provide a way to import learning performed offline using backpropagation. Embodiments herein solve this technological problem by allowing a user to import learning performed offline into the neuromorphic integrated circuit.
 
  • Like
  • Fire
  • Love
Reactions: 15 users

Diogenese

Top 20
This Technion patent application makes reference to you-know-whom:

WO2021214763A1 DEVICE AND METHOD FOR RAPID DETECTION OF VIRUSES
(3rd last para of description)
The data was analyzed by Brainchip with a Spiking Neural Network, the adjacent confusion matrix shows the results on the test set. The test set included 31 samples- 21 positives and 10 negatives from 21 tested subjects. Zero out of 21 positive samples were identified correctly which represents 100% sensitivity and 4 out of 10 negative samples were identified correctly which represents 40% specificity. The overall accuracy was 80.65%
(I think they got their "nots" in a twist)
...
The same data set (NB: not the same as above) was analyzed also by the SNN methodology. To make the SNN most efficient, 34 samples were discarded due to noise or improper vector dimensionality. Thus, the dataset included 131 samples taken from 126 subjects tested with Sniffphone device at Zayed Military Hospital- 62 samples from 62 COVID-19 positive subjects and 69 samples from 64 COVID-19 negative subjects (Several negative subjects were sampled two or three times). The adjacent confusion matrix shows the results on the test set that was completely blind to the training and validation of the model. The test set included 53 samples - 20 positive and 33 negative samples from 53 tested subjects. Nineteen out of 20 positive samples were identified correctly which represents 95% sensitivity and 29 out of 33 negative samples were identified correctly which represents 87.87 % specificity. The overall accuracy was therefore 90.5%.
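Unlike the first confusion-matrix paragraph, the figures in this second test set are internally consistent, which is easy to verify from the stated counts:

```python
tp, fn = 19, 1   # 19 of 20 positives identified correctly
tn, fp = 29, 4   # 29 of 33 negatives identified correctly

sensitivity = tp / (tp + fn)                 # 0.95  -> 95%
specificity = tn / (tn + fp)                 # 29/33 -> 87.87%
accuracy = (tp + tn) / (tp + fn + tn + fp)   # 48/53 -> 90.5%
```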

Footnote: @uiux has already posted this patent on the NaNose thread.
 
Last edited:
  • Like
  • Fire
  • Love
Reactions: 23 users
D

Deleted member 118

Guest
Just posted some cool videos by Professor Haick on the Nanose page and on a plus side @Fact Finder you don’t have to read anything.

 
Last edited by a moderator:
  • Like
  • Sad
Reactions: 7 users
🏁 Brainchip off to the races??!!!??! 🏁

Pretty amazing paragraph from Mercedes just hours ago on LinkedIn

I think Brainchip are getting into Formula 1!



 
Last edited:
  • Like
  • Fire
Reactions: 70 users

MrNick

Regular
Thanks all for the considered responses. F1 news is interesting. Lewis for the magic 8 ball GOAT title.
 
  • Like
Reactions: 6 users
D

Deleted member 118

Guest
Thanks all for the considered responses. F1 news is interesting. Lewis for the magic 8 ball GOAT title.

The car is having a few issues in practice and if they can sort that out there should be no reason why Lewis can’t challenge for the title again this year
 
  • Like
Reactions: 2 users
D

Deleted member 118

Guest
 
  • Like
  • Fire
  • Wow
Reactions: 17 users

Evermont

Stealth Mode
2022 AI Index Report has been released.


"The AI Index is an independent initiative at the Stanford Institute for Human-Centered Artificial Intelligence (HAI), led by the AI Index Steering Committee, an interdisciplinary group of experts from across academia and industry. The annual report tracks, collates, distills, and visualizes data relating to artificial intelligence, enabling decision-makers to take meaningful action to advance AI responsibly and ethically with humans in mind.

The latest edition includes data from a broad set of academic, private, and nonprofit organizations as well as more self-collected data and original analysis than any previous editions, including an expanded technical performance chapter, a new survey of robotics researchers around the world, data on global AI legislation records in 25 countries, and a new chapter with an in-depth analysis of technical AI ethics metrics."


Top takeaways are listed below.

There is no reference to BrainChip in the report which is a bit disappointing as some meaningful data on how Akida would perform under Section 2 would be very interesting. Enjoy.

 
  • Like
  • Fire
Reactions: 9 users

MrNick

Regular
The car is having a few issues in practice and if they can sort that out there should be no reason why Lewis can’t challenge for the title again this year
Netflix's Drive to Survive is a cracker. Didn't realise how despised Mazepin was - couldn't drive either. Schumacher was class despite the elephant in the roomski.
 
  • Like
Reactions: 5 users
Here are some observations of the tech shift from the Trump era to the Biden era (even though it sounds like they're bagging Trump a bit):


"plenty of important stories have transcended the shift in presidents: among them are tech labor issues; diversity, equity, and inclusion; cybersecurity; the long-term effect on COVID-19 on companies and the economy; VR and gaming; and the uses and misuses of artificial intelligence."

"there are many more shifts you could probably name that would support a full-time tech reporter at any publication: the heightened importance of chip manufacturing and innovation;..."
 
  • Like
Reactions: 4 users
Just posted some cool videos by Professor Haick on the Nanose page and on a plus side @Fact Finder you don’t have to read anything.

I hate videos. So painfully slow. I need to fast forward and jump around at my pace which I can do when reading . The good Professor spoke so slowly when he was younger I wanted to scream. 😂🤣
Had to give up even during his more recent ones about healing skin. Not sure if there is anything I missed as a result so it will be up to others to decide.

HE DID EXPOSE that deep learning fell over when they went from the laboratory to the clinic as it could not deal with the contaminants in the real world when sampling breath and their accuracy dropped to 50%. Clearly AKD1000 did not have this drop in performance and this is why AKIDA RULES.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Haha
Reactions: 21 users

Boab

I wish I could paint like Vincent
I know this is off topic, but if anyone has a digital subscription to the Weekend Australian there is a wonderful bit of advice from a couple of cardiologists on page 3, which a lot of people on this thread (who I'm guessing are of a certain age) would find very useful.
The article is titled "Surprise heart deaths could end".
 
  • Like
  • Love
Reactions: 7 users
D

Deleted member 118

Guest
I hate videos. So painfully slow. I need to fast forward and jump around at my pace which I can do when reading . The good Professor spoke so slowly when he was younger I wanted to scream. 😂🤣
Had to give up even during his more recent ones about healing skin. Not sure if there is anything I missed as a result so it will be up to others to decide.

HE DID EXPOSE that deep learning fell over when they went from the laboratory to the clinic as it could not deal with the contaminants in the real world when sampling breath and their accuracy dropped to 50%. Clearly AKD1000 did not have this drop in performance and this is why AKIDA RULES.

My opinion only DYOR
FF

AKIDA BALLISTA

Glad you lasted longer than me watching them lol
 
  • Haha
Reactions: 1 users