BRN Discussion Ongoing

miaeffect

Oat latte lover
Interesting thought. I wonder if AKIDA can be taught different languages like Chinese, Japanese, Arabic, Spanish, etc.? As you don't really chat to a car, there will only be a few instructions, and it will not need a lot of memory to handle that.
Reconfigurable.
Akida is everywhere.
 
  • Like
  • Love
Reactions: 8 users

miaeffect

Oat latte lover
Not a far-fetched idea; or maybe Brainchip can apply for an exclusive EQXX dealership, with a few thousand units sold just to shareholders.
I will offer my dealer my 500 BRN shares for EQXX.
$500 per share X 500 shares = $250,000. Fair deal!
 
  • Like
  • Fire
  • Haha
Reactions: 20 users
Interesting thought. I wonder if AKIDA can be taught different languages like Chinese, Japanese, Arabic, Spanish, etc.? As you don't really chat to a car, there will only be a few instructions, and it will not need a lot of memory to handle that.
That is the monumental advantage of AKIDA personalisation through on-chip learning.

Everyone has an individual voiceprint which includes their accent, dialect, educational attainment, native language, even speech impediments, and the so-called natural-language deep learning systems need to have been pre-programmed with huge data sets to have any chance of responding when a user issues a command.

I posted a paper in the last several weeks, a Google-funded research effort, which discussed this problem in particular for people with disabilities and called for samples of speech from people with speech impediments.

Years ago now the NSW Police introduced a speech-recognition telephone call centre, and it took me a week to work out that the way to get it to connect me with Parramatta Police Station was to say Parramadda.

The personalisation created by on-chip learning means that for critical functions, like voice ID giving you access to your car, house, business or palace, you can train AKIDA with one or several shots to recognise your commands, even if you have a speech impediment or speak like Prince Charles. The language you speak is completely irrelevant.
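For anyone curious how one- or few-shot command enrolment can work in principle, here is a minimal nearest-centroid sketch in NumPy. It is purely illustrative: the feature extractor, class and names here are hypothetical stand-ins, not the actual Akida pipeline, API or on-chip learning rule.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical fixed feature extractor standing in for a pre-trained
# front end; NOT the actual Akida hardware or software stack.
W = rng.standard_normal((64, 16))

def embed(features):
    # Map a raw 64-dim feature vector to a compact embedding.
    return np.tanh(features @ W / 8.0)

class FewShotKeywordClassifier:
    """Each command is stored as the centroid of its enrolment examples,
    so a new command (in any language or accent) needs only a shot or two."""
    def __init__(self):
        self.centroids = {}

    def learn(self, label, examples):
        # One-shot or few-shot enrolment: average the embeddings.
        self.centroids[label] = np.mean([embed(e) for e in examples], axis=0)

    def predict(self, sample):
        # Classify by nearest centroid in embedding space.
        e = embed(sample)
        return min(self.centroids,
                   key=lambda k: float(np.linalg.norm(e - self.centroids[k])))

# One-shot enrolment of two spoken commands (random stand-in features)
unlock = rng.standard_normal(64)
start = rng.standard_normal(64)
clf = FewShotKeywordClassifier()
clf.learn("unlock", [unlock])
clf.learn("start", [start])
```

The point of the sketch is only that enrolment is a single averaging step rather than a retraining run, which is why no language-specific pre-training is needed for the command set itself.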

This is one of the multitude of unmatched reasons why this extraordinary technology will become ubiquitous throughout the world.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
Reactions: 23 users

VictorG

Member
I will offer my dealer my 500 BRN shares for EQXX.
$500 per share X 500 shares = $250,000. Fair deal!
hmmm, my BRN shares are worth $1000 in 2025, so I'll have 2 x EQXX thanks.
 
  • Like
  • Fire
  • Love
Reactions: 13 users

yogi

Regular
  • Like
  • Love
Reactions: 17 users
Just saw this post on Edge Impulse's LinkedIn not long ago.
Somebody might be able to shed some light :)

Great pickup. Do not want to be accused of doing a Bravo but this does look very, very promising. FF
 
  • Like
  • Haha
  • Wow
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I reckon it's more like this




Where did you get that video of me Rocket? Is there no privacy?

PS: Be very careful if you are going to attempt this maneuver, I've only just recovered from the burn marks that the bowl left around my neck.
 
  • Haha
  • Like
Reactions: 20 users

VictorG

Member
I was just approved for another $50,000 loan to buy more BRN shares. I had to put up 80 litres of petrol as collateral.
 
  • Haha
  • Like
Reactions: 28 users
Great pickup. Do not want to be accused of doing a Bravo but this does look very, very promising. FF
Not AKIDA but an algorithm. This research is at an early stage; moving forward they hope to be able to design a chip. They make out the same case for vibration monitoring at the edge on low battery power that Brainchip is promoting, and in doing so support Brainchip's business model:

ON-DEVICE LEARNING: A NEURAL NETWORK BASED FIELD-TRAINABLE EDGE AI (preprint)
Hiroki Matsutani, Mineto Tsukada and Masaaki Kondo, Keio University, 3-14-1 Hiyoshi, Kohoku-ku, Yokohama, Japan. March 3, 2022

"ABSTRACT: In real-world edge AI applications, their accuracy is often affected by various environmental factors, such as noises, location/calibration of sensors, and time-related changes. This article introduces a neural network based on-device learning approach to address this issue without going deep. Our approach is quite different from de facto backpropagation based training but tailored for low-end edge devices. This article introduces its algorithm and implementation on a wireless sensor node consisting of Raspberry Pi Pico and low-power wireless module. Experiments using vibration patterns of rotating machines demonstrate that retraining by the on-device learning significantly improves an anomaly detection accuracy at a noisy environment while saving computation and communication costs for low power.

Keywords Machine learning · Neural network · Edge AI · On-device learning · OS-ELM · Wireless sensor node"
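For the technically minded, the OS-ELM training the paper is built on can be sketched in a few lines of NumPy. This is a generic illustration of the sequential (rank-one) update and autoencoder-style anomaly scoring the abstract describes, with sizes shrunk for readability; it is not the authors' code, and every name in it is my own.

```python
import numpy as np

rng = np.random.default_rng(0)

n, N = 8, 16                                   # input size, hidden size
alpha = rng.standard_normal((n, N)) / np.sqrt(n)  # fixed random input weights
b = rng.standard_normal(N)

def hidden(X):
    # Sigmoid hidden-layer activations G(X @ alpha + b)
    return 1.0 / (1.0 + np.exp(-(X @ alpha + b)))

# Initial batch training (regularised least squares), targets = inputs
X0 = rng.standard_normal((64, n))
H0 = hidden(X0)
P = np.linalg.inv(H0.T @ H0 + 1e-3 * np.eye(N))
beta = P @ H0.T @ X0

def train_step(x):
    """OS-ELM sequential update for one sample: no matrix inversion,
    no backpropagation, so it fits a microcontroller budget."""
    global P, beta
    h = hidden(x[None, :])                     # shape (1, N)
    P -= (P @ h.T @ h @ P) / (1.0 + float(h @ P @ h.T))
    beta += P @ h.T @ (x[None, :] - h @ beta)

def anomaly_score(x):
    # Reconstruction error; high means the sample is unlike the training data.
    return float(np.sum((hidden(x[None, :]) @ beta - x[None, :]) ** 2))

# "On-device" retraining on a stream of normal samples
for _ in range(200):
    train_step(rng.standard_normal(n))
```

The design point the paper is making is visible here: each update is a handful of small matrix-vector products over an N x N state, which is why the whole model fits in a microcontroller's SRAM.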

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Love
Reactions: 13 users

Foxdog

Regular
Not a far-fetched idea; or maybe Brainchip can apply for an exclusive EQXX dealership, with a few thousand units sold just to shareholders.
Well if it's the dominant tech in the car perhaps BRN shareholders will be rewarded, based on the number of shares held ... might be worth starting a rumour 😄
 
  • Like
  • Haha
Reactions: 4 users

Foxdog

Regular
Where did you get that video of me Rocket? Is there no privacy?

PS: Be very careful if you are going to attempt this maneuver, I've only just recovered from the burn marks that the bowl left around my neck.
 
  • Like
Reactions: 2 users

Deleted member 118

Guest
Where did you get that video of me Rocket? Is there no privacy?

PS: Be very careful if you are going to attempt this maneuver, I've only just recovered from the burn marks that the bowl left around my neck.

 
  • Like
  • Haha
Reactions: 6 users
The following is a very interesting article.

Interestingly it was submitted for peer review in 2020 but was only finally published last month; in consequence it does not reference Brainchip or AKIDA. It nevertheless makes out the case that, without Spiking Neural Network technology capable of implementing all of the senses, robots beyond the pre-programmed kind, which do things like assemble motor vehicles by performing the same task repeatedly in factories, will never be possible. Co-operative robots that work with and beside humans cannot, in any practical sense, exist under deep learning technology.

Brainchip and long-term investors, and of course Dr. Arijit Mukherjee from Tata Consultancy Services, knew this at least by 14 December 2019. It is a very interesting article nonetheless and adds further independent corroboration of the journey Peter van der Made and Anil Mankar have taken us all on in their pursuit of Artificial General Intelligence:


My opinion only DYOR
FF

AKIDA BALLISTA
 
Last edited:
  • Like
  • Love
  • Fire
Reactions: 32 users

Foxdog

Regular
These articles continue to highlight one significant thing, amongst others, that AKIDA might well be the 'Holy Grail' of AI. I say 'might' because we still need to see wholesale adoption of the tech. Once Mercedes and others come out and release statements declaring their unfettered allegiance to AKIDA then BRN will be able to stamp its name in history. We may well see IBM, Intel and the like throw their hands up and concede that 'AKIDA is King' and way too far advanced to bother competing with...... wouldn't that be nice 🤔
 
  • Like
Reactions: 21 users

Diogenese

Top 20
  • Haha
  • Like
Reactions: 12 users

1Paddle

Emerged
  • Haha
  • Like
Reactions: 4 users
These articles continue to highlight one significant thing, amongst others, that AKIDA might well be the 'Holy Grail' of AI. I say 'might' because we still need to see wholesale adoption of the tech. Once Mercedes and others come out and release statements declaring their unfettered allegiance to AKIDA then BRN will be able to stamp its name in history. We may well see IBM, Intel and the like throw their hands up and concede that 'AKIDA is King' and way too far advanced to bother competing with...... wouldn't that be nice 🤔
I agree with a twist.

The former CEO Mr. Dinardo said on many occasions that Brainchip had well north of 100 NDAs.

The fact that we have not seen wholesale adoption of the tech does not mean it has not already occurred, in a corporate environment where our only reliable sources of information, those employed by Brainchip, keep saying we are bound by NDAs and cannot tell you what is happening.

We in fact have the almost Monty Python-like state of affairs where Mercedes Benz speaks about the product we are invested in, but the NDA prevents those who represent our interests from talking about Mercedes Benz.

Thus it becomes a question of what you personally believe, based on your research and the level of trust you accord to statements by such representatives of the company as Peter van der Made, Anil Mankar, Rob Telson, Sean Hehir and Ken Scarince: explosive sales; income ramping up; Renesas releasing the MCU this year; NASA doing vision and other things they cannot talk about; Mercedes, Valeo, NASA and Vorago being early adopters; most of the 15 EAPs likely to proceed to commercial agreements; Brainchip having over 5,000 individual users of MetaTF in 2021; one-shot learning being clearly your margin of victory right there; and so on.

I am clearly in the converted camp and believe that the word 'might' has no place in my vocabulary when speaking about Brainchip.

My opinion only DYOR
FF

AKIDA BALLISTA
 
  • Like
  • Fire
  • Love
Reactions: 60 users

Diogenese

Top 20
That is the monumental advantage of AKIDA personalisation through on-chip learning.

Everyone has an individual voiceprint which includes their accent, dialect, educational attainment, native language, even speech impediments, and the so-called natural-language deep learning systems need to have been pre-programmed with huge data sets to have any chance of responding when a user issues a command.

I posted a paper in the last several weeks, a Google-funded research effort, which discussed this problem in particular for people with disabilities and called for samples of speech from people with speech impediments.

Years ago now the NSW Police introduced a speech-recognition telephone call centre, and it took me a week to work out that the way to get it to connect me with Parramatta Police Station was to say Parramadda.

The personalisation created by on-chip learning means that for critical functions, like voice ID giving you access to your car, house, business or palace, you can train AKIDA with one or several shots to recognise your commands, even if you have a speech impediment or speak like Prince Charles. The language you speak is completely irrelevant.

This is one of the multitude of unmatched reasons why this extraordinary technology will become ubiquitous throughout the world.

My opinion only DYOR
FF

AKIDA BALLISTA
"Paramadda"
If you'd lived in Stralia long enuf, that wooduv been second nature. Probly used Kath and Kim to train the database.

Also note that the spellcheck thinks you've got your r's about face at the centre.

PS: I did put in a request to @Zeebot for an English spellcheck. Still, with the cultural imperialism by the leader of the free world's movies etc, I'm p!ssing into the wind (the surest way to get your own back).
 
Last edited:
  • Like
  • Haha
Reactions: 18 users
"Paramadda"
If you'd lived in Stralia long enuf, that wooduv been second nature. Probly used Kath and Kim to train the database.

Also note that the spellcheck thinks you've got your r's about face at the centre.

PS: I did put in a request to @Zeebot for an English spellcheck. Still, with the cultural imperialism by the leader of the free world's movies etc, I'm p!ssing into the wind (the surest way to get your own back).
As English is a living language I can see the day where in Stralia those whose names start with the letter "T" will become
Dom, Dammie, Daylor, Domas, Derry, Dim, Dimothy, etc; FF
 
  • Haha
  • Like
Reactions: 7 users

Diogenese

Top 20
Great pickup. Do not want to be accused of doing a Bravo but this does look very, very promising. FF
I've scoured the web for an ogre blowing raspberries without success.

"The four on-device learning instances each with n = 256, N = 32, and m = 256 are implemented in Raspberry Pi Pico. Sigmoid and mean squared error are used as an activation function G and a loss function L, respectively. α is shared by all the four instances, while P and β are individual for each instance. The parameter size is thus nN + 4NN + 4Nm in total. Assuming float32 is used as a number format, the memory usage is 176KiB which can be implemented in the 264KiB SRAM of Raspberry Pi Pico. Execution times of the prediction and sequential training of possible configurations are shown in Figure 3b. The execution time of a prediction is shorter than that of a sequential training but this prediction is executed four times when the number of on-device learning instances is four. A larger N enriches the expressive power while it consumes more memory and thus limiting the input size n. In any cases, their execution times do not overwhelm the others, as shown in Figure 3a."

[Figure 3 from the paper: execution times of the prediction and sequential training]
 
  • Like
  • Sad
  • Haha
Reactions: 7 users