BRN Discussion Ongoing

As long as you don't post that video of the French girl dancing again...
I agree!

You mean this one, don't you?



Still can't understand what she's singing...
But who cares, right?
 
  • Haha
  • Like
  • Love
Reactions: 14 users
  • Like
  • Thinking
Reactions: 6 users
Hi ILL
This is an exciting find and once again generously shared. It is the stuff of secret meetings, the sort of thing that could be quietly passed around and traded on.

Seriously though, it is a very significant reveal, as evidenced by a read of the following link:


The other significant aspect of the author's suggested use of AKIDA for the treatment of neurological diseases such as epilepsy is that there are many scientific research papers available via Google Scholar suggesting the use of SNN technology for this very purpose.

In other words, Dr. Elon is not the only one who thinks brainchips have a place in mainstream medical science.

The concept of a portable hand held device for detecting concussive brain injury has also been proposed in the literature.

This is a very big opportunity, but like all medical opportunities it has a long lead time because of the need for regulatory approval.

One area that might not take such a long time is a device to check whether a dog is carrying a brain injury that could make it a potential risk to humans.

My opinion only DYOR
Fact Finder
Hi All

Because negative commentary has a stronger effect on brain chemistry than positive or neutral commentary, I thought a little reminder was in order, given it was posted a little while ago:

Neuromorphic Medical Image Analysis at the Edge: On-Edge training with the Akida Brainchip

E Bråtman, L Dow - 2023 - diva-portal.org
“… By first creating a convolutional neural network model capable of identifying brain haemorrhage and then moving it onto the neuromorphic processor Akida AKD1000, it allowed the…”
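For anyone wondering what "on-edge training" with one-shot input can look like in principle: on-device learning of a new class from a single example can amount to storing a prototype feature vector from a frozen feature extractor, rather than running backpropagation against a cloud server. The following is only a toy NumPy sketch of that idea; the class and label names are hypothetical, and this is not BrainChip's actual MetaTF API.

```python
import numpy as np

class OneShotHead:
    """Toy stand-in for on-device incremental learning: features from a
    frozen extractor are matched against per-class prototype vectors, and
    "learning" a new class is just averaging in one example's feature
    vector -- no gradient descent, no cloud round-trip."""

    def __init__(self):
        self.protos = {}  # label -> (running-mean feature vector, count)

    def learn(self, feats, label):
        v, n = self.protos.get(label, (np.zeros_like(feats), 0))
        self.protos[label] = ((v * n + feats) / (n + 1), n + 1)

    def predict(self, feats):
        # nearest prototype by Euclidean distance
        return min(self.protos,
                   key=lambda c: np.linalg.norm(feats - self.protos[c][0]))

# Synthetic "feature vectors" standing in for extractor outputs on scans
rng = np.random.default_rng(1)
head = OneShotHead()
head.learn(rng.normal(0.0, 1.0, 64), "no_haemorrhage")  # one example per class
head.learn(rng.normal(3.0, 1.0, 64), "haemorrhage")
```

The appeal for edge medical devices is that this kind of update is cheap and private: nothing leaves the device, and one labelled example is enough to add a class.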


My opinion only DYOR
Fact Finder
 
  • Like
  • Love
Reactions: 30 users

Esq.111

Fascinatingly Intuitive.
  • Like
Reactions: 7 users

7für7

Top 20
Looks like the buy side is building, so it might push towards that or 0.225
Hmm, looks more like 0.20 to 0.205 to me, but still OK if it holds
 
  • Like
Reactions: 2 users
Hi All
It has often been said that MegaChips is very quiet, which is true, but it is also MegaChips' modus operandi to be very, very discreet. Try as they might, however, if you look hard enough, in the end they slip up.

It does take a little work, though. I found the following paper some time ago:

Data Augmentation for Edge-AI on-chip Learning

N Yoshida, H Miura, T Matsutani… - 2022 IEEE 8th World …, 2022 - ieeexplore.ieee.org
“This study applied data augmentation to improve the inference accuracy of edge-artificial intelligence on-chip learning, which uses the fine-tuning technique with limited knowledge and …”

The first thing to note is that these researchers all work for MegaChips.

The second thing is the abstract states:

“This study applied data augmentation to improve the inference accuracy of edge-artificial intelligence on-chip learning, which uses the fine-tuning technique with limited knowledge and without a cloud server. Subsequently, keyword spotting was adopted as an example of the edge-AI application to evaluate inference accuracy. Investigations revealed that all four data augmentation types contributed to inference accuracy improvements, boosting data augmentation by 5.7 times rather than the one-shot boost without data augmentation recorded previously.”
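The abstract doesn't enumerate which four augmentation types were used, so the following is only an illustration of the kinds of waveform augmentations commonly applied in keyword spotting (time shift, additive noise, random gain, time stretch), as a self-contained NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def time_shift(x, max_shift=1600):
    """Shift the clip by a random offset, zero-padding the gap."""
    s = int(rng.integers(-max_shift, max_shift + 1))
    y = np.zeros_like(x)
    if s >= 0:
        y[s:] = x[:len(x) - s]
    else:
        y[:s] = x[-s:]
    return y

def add_noise(x, snr_db=20.0):
    """Mix in white noise at a target signal-to-noise ratio."""
    noise_pow = np.mean(x ** 2) / (10 ** (snr_db / 10))
    return x + rng.normal(0.0, np.sqrt(noise_pow), size=x.shape)

def scale_volume(x, low=0.8, high=1.2):
    """Apply a random gain."""
    return x * rng.uniform(low, high)

def time_stretch(x, low=0.9, high=1.1):
    """Crude interpolation-based stretch, padded/cropped to fixed length."""
    rate = rng.uniform(low, high)
    idx = np.linspace(0, len(x) - 1, int(len(x) / rate))
    y = np.interp(idx, np.arange(len(x)), x)
    out = np.zeros_like(x)
    out[:min(len(x), len(y))] = y[:len(x)]
    return out

clip = np.sin(2 * np.pi * 440 * np.arange(16000) / 16000)  # dummy 1 s clip
augmented = [f(clip) for f in (time_shift, add_noise, scale_volume, time_stretch)]
```

Each transform keeps the clip at a fixed length, which matters when the augmented data feeds a model expecting fixed-size inputs.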

The following link takes you to a sign in page to which I do not have access however whenever this occurs opening the references can sometimes add insight into whether the paper is of interest. In this case it did and I found:

References & Cited By

1. Akida enablement platforms, [online]. Available: https://brainchip.com/akida-enablement-platforms/
   Cited in text: “For example, the BrainChip was released as an evaluation chip AKD1000 [1] to execute finetuning with just one-shot input.”

2. M. Horowitz, "Computing's energy problem (and what we can do about it)", IEEE International Solid-State Circuits Conference Digest of Technical Papers (ISSCC), pp. 10-14, 2014.
   Cited in text: “Subsequent investigations revealed that it realized a neuromorphic neural network, thereby reducing computational costs [2], [3].”

3. S.I. Ikegawa, R. Saiin, Y. Sawada and N. Natori, "Rethinking the role of normalization and residual blocks for spiking neural networks", Sensors, vol. 22, 2022.
   Cited in the same sentence as [2].

4. Overview of MetaTF, [online]. Available: https://doc.brainchipinc.com/index.html
   Cited in text: “The Akida neuromorphic ML Framework (MetaTF) [4] was adopted to establish the edge-AI computing environment.”

5. P. Warden, "Speech Commands: A Dataset for Limited-Vocabulary Speech Recognition", arXiv, 2018.
   Cited in text: “Subsequently, the google speech command dataset was applied to train KWS [5]–[7], after which audio files were transformed to a Mel-frequency power spectrogram [8], thereby aiding supply to DS-CNN [9].”

6. Model zoo performances, [online]. Available: https://doc.brainchipinc.com/zoo_performances.html
   Cited in the same sentence as [5].

7. DS-CNN/KWS inference, [online]. Available: https://doc.brainchipinc.com/examples/general/plot_2_ds_cnn_kws.html
   Cited in the same sentence as [5].

8. S.B. Davis and P. Mermelstein, "Comparison of parametric representations for monosyllabic word recognition in continuously spoken sentences", IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 28, pp. 357-366, 1980.
   Cited in the same sentence as [5].

9. F. Chollet, "Xception: Deep learning with depthwise separable convolutions", 2016, [online]. Available: http://arxiv.org/abs/1610.02357
   Cited in the same sentence as [5].
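For anyone curious about the Mel-frequency power spectrogram step mentioned alongside [5]–[9]: the audio is framed, the power spectrum taken per frame, and the result pooled into perceptually spaced triangular mel bands before being fed to the DS-CNN. A self-contained NumPy sketch (the parameters are illustrative defaults, not necessarily the paper's):

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_power_spectrogram(signal, sr=16000, n_fft=512, hop=256, n_mels=40):
    """Frame the signal, take |FFT|^2, then pool into triangular mel bands."""
    n_frames = 1 + (len(signal) - n_fft) // hop
    window = np.hanning(n_fft)
    frames = np.stack([signal[i * hop:i * hop + n_fft] * window
                       for i in range(n_frames)])
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2  # one-sided spectrum
    # Triangular filterbank with centres evenly spaced on the mel scale
    mel_pts = np.linspace(hz_to_mel(0), hz_to_mel(sr / 2), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_mels, n_fft // 2 + 1))
    for m in range(1, n_mels + 1):
        l, c, r = bins[m - 1], bins[m], bins[m + 1]
        for k in range(l, c):
            fbank[m - 1, k] = (k - l) / max(c - l, 1)
        for k in range(c, r):
            fbank[m - 1, k] = (r - k) / max(r - c, 1)
    return power @ fbank.T  # shape: (n_frames, n_mels)

# 1 s of a 440 Hz tone as a stand-in for a speech command clip
t = np.arange(16000) / 16000.0
spec = mel_power_spectrogram(np.sin(2 * np.pi * 440.0 * t))
```

The mel pooling is why a 257-bin spectrum collapses to a compact 40-band image per frame, which is what makes a small depthwise-separable CNN practical at the edge.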

Based upon the above, it is clear that MegaChips was using AKIDA as the edge environment and was able to show a marked improvement in inference accuracy using four different types of data augmentation for training purposes.

My opinion only DYOR
Fact Finder
 
  • Like
  • Love
  • Fire
Reactions: 81 users

Esq.111

Fascinatingly Intuitive.
No Cigar today ..wheel in the next savant.

😁.

Esq
 
  • Haha
  • Like
Reactions: 10 users

Worker122

Regular
[Quoting Fact Finder's post above on MegaChips' "Data Augmentation for Edge-AI on-chip Learning" paper]
Solid work FF, thanks
 
  • Like
  • Fire
Reactions: 15 users

buena suerte :-)

BOB Bank of Brainchip
Afternoon Sb182 ,

Odd: the ASX site indicates that on 6 Feb, yesterday, 146,350 units were taken short.

Regards,
Esq
Yes, correct Esqy... confirmation:

 
  • Like
  • Fire
  • Love
Reactions: 9 users

Newk R

Regular
Plenty of closing manipulation, hahaaaa :cautious:
 
  • Like
Reactions: 2 users
  • Like
Reactions: 2 users

cosors

👀
What has actually happened to this guy?
Is he still waiting for his schnitzel at home and doesn't want to be seen with the delivery service?

Mickle, if this makes you uncomfortable and you think it may be against our TSE rules, please report it to dreddb0t so that we know about it. I would have no problem with that. ... Decency and dignity, you know; there are such things.
Good morning!
I let myself be lured out of my reserve yesterday. After sleeping on it for a night, I've come to the conclusion that it was unnecessary. The only correct way to deal with these types of people is to ignore them. Attention is their stage. If everyone turned away, they would have lost completely. They want attention to get our money. So I have deleted this post of my own accord.
 
  • Like
  • Love
  • Fire
Reactions: 23 users

cosors

👀
Sure enjoyed the endorphin rush from yesterday's ASX trades.

Follow-through on NASDAQ appears less impactful, however, when one considers that BRCHF was overvalued: applying the proper exchange rate (A$1 = US$0.65) today would leave the correct pricing of BRCHF at US$0.13... not quite there yet (1:33 EST).

However, volume of BCHPY, which is usually almost nil, is so far 9.7k, i.e. 388k shares of BRN (or BRCHF).

Clearly investor interest in Brainchip is on the rise.....based on recent volume.

Now if we can get the price to cooperate.....

Edit: Of course while typing the price creeps up to 13.1 cents. (I get it...keep typing!)

Yesterday's volume in Germany incl. US OTC.
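The fair-value arithmetic in the post is just the ASX price times the exchange rate. Assuming an ASX price of A$0.20 (which the quoted US$0.13 figure implies; the price itself is not stated in the post), the check is a one-liner:

```python
# Cross-listing fair value: ASX price in AUD times the AUD/USD rate.
asx_price_aud = 0.20   # assumed ASX price implied by the post's US$0.13
aud_usd = 0.65         # rate quoted in the post (A$1 = US$0.65)
brchf_fair_usd = asx_price_aud * aud_usd
print(f"Implied BRCHF fair value: ${brchf_fair_usd:.3f}")
# prints: Implied BRCHF fair value: $0.130
```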
 
  • Like
  • Love
  • Fire
Reactions: 13 users
  • Like
Reactions: 4 users

Labsy

Regular
Tried to add an order to sell 100k @ 5.10 today, but got rejected 😬
Will just have to add to the $5 order maybe tomorrow..😜
 
  • Haha
  • Like
  • Love
Reactions: 15 users
It would have been rude not to spend your last $$$ on a few more shares. Every little bit counts 🤪

 
  • Like
  • Love
  • Fire
Reactions: 13 users

Esq.111

Fascinatingly Intuitive.
Afternoon Chippers ,

After several beers & much thought....

Thinking the German bourse may have another cracker day, followed by us (largest floating penile colony on earth, governed by ASX, ASIC & general big Dicks domiciled offshore), also known as Australia.

Something feels afoot, as is evident in the last two days' fun and games.

May be in for a little of this...



Purely my thoughts .

Regards,
Esq
 
  • Like
  • Haha
  • Love
Reactions: 9 users
Hi all,

A long read but in my opinion worth it.

Here’s an excerpt;

Growth of AI chips over the next decade

Revenue generated from the sale of AI chips (including the sale of physical chips and the rental of chips via cloud services) is expected to rise to just shy of USD$300 billion by 2034, at a compound annual growth rate of 22% from 2024 to 2034. This revenue figure incorporates the use of chips for the acceleration of machine learning workloads at the edge of the network, for telecom edge, and within data centers in the cloud. As of 2024, chips for inference purposes (both at the edge and within the cloud) comprise 63% of revenue generated, with this share growing to more than two-thirds of the total revenue by 2034.

This is in large part due to significant growth at the edge and telecom edge, as AI capabilities are harnessed closer to the end-user. In terms of industry vertical, IT & Telecoms is expected to lead the way for AI chip usage over the next decade, with Banking, Financial Services & Insurance (BFSI) close behind, and Consumer Electronics behind that. Of these, the Consumer Electronics industry vertical is to generate the most revenue at the edge, given the further rollout of AI into consumer products for the home. More information regarding industry vertical breakout can be found in the relevant AI reports.
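As a sanity check on the excerpt's figures: a total of roughly US$300 billion by 2034 at a 22% compound annual growth rate implies a 2024 base of about US$41 billion (300 / 1.22^10):

```python
# Back out the implied 2024 base from the article's 2034 target and CAGR.
target_2034 = 300e9  # "just shy of USD$300 billion by 2034"
cagr = 0.22          # 22% compound annual growth, 2024-2034
base_2024 = target_2034 / (1 + cagr) ** 10
print(f"Implied 2024 AI-chip revenue: ${base_2024 / 1e9:.0f} B")
# prints: Implied 2024 AI-chip revenue: $41 B
```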

73.png

AI chips revenue segmented by purpose.

74.png

Revenue generated by AI chips is set to grow at a CAGR of 22% over the next ten years, up to 2034. Of the revenue generated at this time, AI chips for inference purposes will dominate over those used for AI training, as AI migrates more thoroughly to deployment at the edge of the network.

For more information regarding key trends and market segmentations with regards AI chips over the next ten years, please refer to the two reports: “AI Chips: 2023-2033” and “AI Chips for Edge Applications 2024-2034: Artificial Intelligence at the Edge“.



Enjoy!
 
  • Like
  • Love
  • Fire
Reactions: 24 users

MegaportX

Regular
1707290950711.png





Megaportx
 
  • Like
  • Love
Reactions: 9 users

JB49

Regular
Valeo Scala 3 is due for release in 2024. Does anyone know whether that will be in the early or latter part of the year?

Imagine seeing a nice slice of that $1 billion worth of pre-orders they already have pop up on our books 😍😍
 
  • Like
  • Love
Reactions: 16 users