BRN Discussion Ongoing

buena suerte :-)

BOB Bank of Brainchip
Been a big believer since 2014, with a nice-sized parcel that sees us set for a great early retirement! Keep believing, Akida IS THE FUTURE.
I have joined the Stampede here. Thank you to some amazing posters (from the other side), and I hope to continue the good energy and accurate, well-researched knowledge and views right here.
Nice Plate!!! :cool: (y)
 
  • Like
Reactions: 7 users

zeeb0t

Administrator
Staff member
While you're there zeebot, I have emailed the PDF. Also, how do I create or insert a profile photo?

Thank you...

As to your question, go here: https://thestockexchange.com.au/account/account-details

And click the avatar to change it, e.g. :

1643952496492.png
 
  • Like
Reactions: 8 users

Zedjack33

Regular
TMH still copping a pizzling.

Some of it quite funny.

Poor buggas.
 
  • Like
Reactions: 10 users

misslou

Founding Member
  • Like
  • Love
Reactions: 9 users

McHale

Regular
In exchanges with the MODS on HC it got heated and Sunny slipped up; a post after mine was quickly removed (totally). He was originally caught out over on Twitter by someone, so I challenged him and ended up in this battle with Owen Rask/sunny123.
I was warned that my account could be terminated.
It ties Owen Rask (Rask Investments, AFS Lic) in with Motley Fool (an ex-writer there) and TMH with hotcrapper (MODERATOR).
All of these, and I suspect the AFR too, as Rask/sunny threatened to get them involved with an article as well.
A big SHORTER scam using the HC forums to ramp and control the narrative and direction of investor sentiment. Many HC stocks have had activist shorter campaigns run by this group, it would seem! Rask was even bragging about how and when they crashed other stocks on HC.
I have spent the last two days screenshotting his posts.
Hi @Yak52, what makes this whole story all the more reprehensible on the part of sunny123 et al is that the BRN forum has, for the large majority of that time, been the most visited forum on HC, for at least the last 12 to 18 months.

So these people who run that business and TMH are well aware that there is a lot of retail interest in BRN,
and of course liquidity on BRN has been building for some time too.

They see it every day

It all adds up to a really bad look for them: manipulative and a breach of trust, to say the very least, I believe.
I have seen it written here that BRN have been notified, but have the ASX or ASIC been informed? Does anyone know?

Is it stretching the bow too far to say that it fits the description of Activist Shorting? I heard someone say if it looks like a duck, and quacks like a duck... then maybe it is a duck.

The ASX finished small green today (after a huge red night on Wall St, although futures have been looking higher).
Today's strong rebound by BRN just adds another layer of intrigue to the story.

At the end of the day I don't believe any amount of shorting will hold BRN down for very long, or to any great degree.
I'm quietly confident about BRN's short to medium term prospects. Wave 3 is in the offing; maybe it has already commenced.
 
  • Like
Reactions: 38 users

buena suerte :-)

BOB Bank of Brainchip
  • Like
Reactions: 1 users

sleepymonk

Regular
  • Like
Reactions: 3 users

buena suerte :-)

BOB Bank of Brainchip
oops I'm in trouble ...........:p:p:p:eek:
Date: 17 minutes ago
Message: Advertising is a breach of TOU's. If done again, your account may be suspended. Admin Team
 
  • Like
  • Haha
Reactions: 16 users

Tuliptrader

Regular
I can't disclose what it will do, as that won't be any fun. I would rather they just suddenly wonder what on earth has happened.

The paid downramper, bloodshot-eyed, wearily hunched over his bright white screen in his otherwise dimly lit hovel. He pushes send on his latest digital double-dealing on the new site, thestockexchange. Suddenly, there is an urgent thumping on the door; he looks up, terror in his eyes.......


TT
 
  • Like
  • Haha
Reactions: 13 users

BaconLover

Founding Member
I am cancelling my Netflix.
 
  • Like
  • Haha
Reactions: 17 users

TechGirl

Founding Member
This explains in simple terms just how effective Akida is
https://www.eetindia.co.in/keyword-spotting-making-an-on-device-assistant-a-reality/

Article By : Peter AJ van der Made, BrainChip



The natural-language-processing technique known as keyword spotting is gaining traction with the proliferation of smart appliances controlled by voice commands.

Voice assistants from Amazon, Google, Apple and others can respond to a phrase that follows a “hot word” such as “Hey, Google” or “Hey, Siri” and appear to respond almost immediately. In fact, the response has a delay of a fraction of a second, which is acceptable in a smart speaker device.

How can a small device be so clever?

The voice assistant uses a digital signal processor to digest the first “hot word.” The phrases that follow are sent via the Internet to the cloud.
The speech is then converted into streams of numbers, which are processed in a recurrent convolutional neural network that remembers previous internal states, so that it can be trained to recognize phrases or sequences of words.
These data streams are processed in a datacenter, and the answer or song requested is sent back to the voice assistant via the web. This works well in situations that are non-critical, where a delay does not matter and where Internet connections are reliable.
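
As a rough sketch of that flow (every function name below is a hypothetical stand-in, not a real assistant SDK call), the on-device part only has to spot the hot word; everything else is a round trip to the cloud:

```python
# Illustrative only: a simplified view of the cloud-based assistant flow
# described above. All functions here are hypothetical stand-ins.

def detect_hot_word(frame: bytes) -> bool:
    # On a real device this runs on a low-power DSP, not the main CPU.
    return frame.startswith(b"HEY")

def send_to_cloud(features: list[float]) -> str:
    # Stand-in for the network round trip to a datacenter model; this is
    # where the fraction-of-a-second delay comes from.
    return "answer"

def handle_audio(frame: bytes, phrase_features: list[float]) -> str | None:
    if not detect_hot_word(frame):          # 1. DSP digests the hot word locally
        return None
    return send_to_cloud(phrase_features)   # 2. the phrase goes to the cloud and back

print(handle_audio(b"HEY GOOGLE ...", [0.1, 0.2, 0.3]))  # -> "answer"
```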

The neural networks located in data centers are trained using millions of samples in a method that resembles successive approximation; errors are initially very large, but are reduced by feeding the error back into an algorithm that adjusts the network parameters.
The error is reduced in each training cycle.
Training cycles are then repeated until the output is correct.
This is done for every word and phrase in the dataset. Training such networks can take a very long time, on the order of weeks.
Once trained, the network can recognize words and phrases spoken by different individuals.
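
The error-feedback cycle described above is, in essence, gradient-descent training. As a hedged illustration (a toy one-layer example, not the actual keyword-spotting training code), here is a tiny NumPy sketch in which the error shrinks on each cycle as the parameters are adjusted:

```python
import numpy as np

# Toy illustration of the training loop described above: the output error is
# fed back to adjust the parameters, and the error shrinks on every cycle.

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))            # 100 "samples", 8 input features
true_w = rng.normal(size=8)
y = X @ true_w                           # target outputs

w = np.zeros(8)                          # parameters, initially wrong
lr = 0.05                                # learning rate

for cycle in range(20):                  # repeated training cycles
    pred = X @ w
    error = pred - y                     # initially very large
    w -= lr * (X.T @ error) / len(X)     # feed the error back to adjust w
    print(f"cycle {cycle:2d}  mean squared error = {np.mean(error**2):.4f}")
```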

The recognition process, called inference, requires millions of multiply-accumulate (MAC) operations, which is why the information cannot be processed in a timely manner on a microprocessor within the device.
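
To see why the MAC count grows so quickly, consider a single fully connected layer: every output neuron multiplies and accumulates every input. The layer sizes below are illustrative only:

```python
# Rough MAC count for one fully connected layer: every output value needs
# one multiply-accumulate per input value. Sizes are illustrative.
inputs, outputs = 1000, 1000
macs_per_layer = inputs * outputs
print(f"{macs_per_layer:,}")   # 1,000,000 MACs for a single modest layer
```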

In keyword spotting, multiple words need to be recognized.
The delay of sending it to the datacenter is not acceptable, and Internet connections are not always guaranteed. Hence, local processing of phrases on the device is preferable.

One solution is to shrink the multiply-accumulate functions into smaller chips.
The Google Edge Tensor Processing Unit (Edge TPU), for instance, incorporates many array multipliers and math functions.
This solution still requires a microprocessor to run the neural network, but the MAC functions are passed on to the chip and accelerated.

While this approach allows a small microprocessor to run larger neural networks, it comes with disadvantages:
The power consumption remains too high for small or battery-powered appliances.
With diminishing size comes diminishing performance.
Small dedicated arrays of multipliers are not as plentiful or as fast as those provided by large, power-hungry GPUs or TPUs in datacenters.

An alternative approach involves smaller, tighter neural networks for keyword processing.
Rather than performing complex processing in large recurrent networks, these networks process keywords by converting the stream of audio samples into a spectrogram-like image using a feature-extraction technique known as MFCC (mel-frequency cepstral coefficients).
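
For reference, this is how a one-second clip is commonly turned into such an MFCC image using the open-source librosa library; librosa, the file name and the parameter values are my own illustration, not something specified in the article:

```python
import librosa

# Illustrative MFCC front end for keyword spotting. "keyword.wav" is a
# placeholder file; all parameter choices here are my own.
signal, sr = librosa.load("keyword.wav", sr=16000, duration=1.0)

mfcc = librosa.feature.mfcc(
    y=signal,
    sr=sr,
    n_mfcc=13,          # 13 cepstral coefficients per frame
    n_fft=400,          # 25 ms analysis window at 16 kHz
    hop_length=160,     # 10 ms hop between frames
)
print(mfcc.shape)       # roughly (13, 101): a small "image" for the classifier
```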



This spectrogram-like image is input to a much simpler 7-layer feed-forward neural network that has been trained to recognize the features of a keyword set.
The Google keyword dataset, for instance, consists of 65,000 one-second samples of 30 individual words spoken by thousands of different people.
Examples of keywords are UP, DOWN, LEFT, RIGHT, STOP, GO, ON and OFF.
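
A small feed-forward classifier of the general kind described above might look like the sketch below, written with Keras purely as an illustration; the layer widths and the use of Keras are assumptions on my part, not the network that runs on Akida:

```python
import tensorflow as tf

# A 7-layer feed-forward keyword classifier in the spirit of the network
# described above. Layer widths and framework are illustrative choices.
NUM_KEYWORDS = 30                # the Google set covers 30 words
MFCC_FEATURES = 13 * 101         # flattened MFCC image from the previous sketch

model = tf.keras.Sequential([
    tf.keras.Input(shape=(MFCC_FEATURES,)),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(256, activation="relu"),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_KEYWORDS, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```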


An alternative approach
We have taken a completely different approach, processing sound, images, data and odors in event-based hardware. Brainchip was founded long before the current machine learning rage.

Advancing processing methods for neural networks and artificial intelligence is our main aim, and we are focused on neuromorphic hardware designs.

The human brain does not run instructions, but instead relies on neural cells.
These cells process information and communicate in spikes: short bursts of electrical energy that express the occurrence of an “event” such as a change in color, a line, a frequency, or touch.

By contrast, computers are designed to operate on data bits and execute instructions written by a programmer.

These are two very different processing techniques.

It takes many computer instructions to emulate the function of brain cells — in the form of a neural network — on a computer.

We realized we could do away with the instructions and build very efficient digital circuits that compute in the same way the brain does.

The brain is the ultimate example of a general intelligent system.

This is exactly what Brainchip has done to develop the Akida neural processor.

The chip evolved further when we combined deep learning capabilities with the event-based spiking neural network (SNN) hardware, thus significantly lowering power requirements and improving performance — with the added advantage of rapid on-chip learning.

The Akida chip can process the Google keyword dataset, utilizing the simple 7-layer neural network described above, within a power budget of less than 200 microwatts.

Akida was trained using the ImageNet dataset, enabling it to instantly learn to recognize a new object without expensive retraining.

The chip has built-in sparsity.
The all-digital design is event-based and therefore does not produce any output when the input stimulus does not cause the neuron to exceed the threshold.

This can be illustrated in a simplified, although extreme example.

Imagine an image with a single dot in the middle.

A conventional neural network needs to process every location of the image to determine if there is something there.
It takes a block of pixels from the image and performs a convolution.
The results are zero, and these zeros are propagated throughout the entire network, together with the zeros generated by all the other blocks, until it reaches the dot.
Detecting and eliminating the zeros would add latency and would cause processing to slow down rather than speed up.
Nearly 500 million operations are required to determine that there is a single dot in the image.

By contrast, the Akida event-based approach responds only to the one event, the single dot.

All other locations contain no information and zeros are not propagated through the network, because they do not generate an event.

In practical terms, with real images this sparsity results in roughly 40 to 60 percent fewer computations to produce the same classification results, while using less power.
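
The single-dot example can be reproduced numerically. The sketch below is my own illustration, not Akida's implementation: it counts the multiply-accumulates a dense 3x3 convolution layer would perform on a mostly empty image versus touching only the locations that actually generate an event (a full multi-layer network multiplies the dense count out much further).

```python
import numpy as np

# Event-based sparsity illustration: a dense convolution works on every
# pixel, while an event-based scheme only touches non-zero inputs.
H, W, K = 224, 224, 3                 # image size and 3x3 kernel
image = np.zeros((H, W))
image[112, 112] = 1.0                 # the single "dot"

dense_macs = H * W * K * K                       # one 3x3 MAC window per pixel
event_macs = np.count_nonzero(image) * K * K     # only events propagate

print(f"dense convolution : {dense_macs:,} MACs")   # 451,584
print(f"event-based       : {event_macs:,} MACs")   # 9
```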

Training Akida
A keyword spotting application using the Akida chip trained on the Google Speech Commands Dataset can run for years off a penlight battery.

The same circuit configured to use 30 layers and all 80 neural processing units on the chip can be used to process the entire ImageNet dataset in real-time at less than 200 milliwatts (about five days on a penlight battery).

The MobileNet network for image classification fits comfortably on the chip, including all the required memory.
The on-chip, real-time learning capability makes it possible to add to the library of learned words, a nice feature that can be used for personalized word recognition like names, places and customized commands.
Another option for keyword spotting is the Syntiant NDP101 chip.

While this device also operates at comparably low power (200 microwatts), it is a dedicated audio processor that integrates an audio front end, buffering and feature extraction together with the neural network. Syntiant expects to replace digital MACs with an in-memory analog circuit in the future to further reduce power.

The Akida chip has the added advantages of on-chip learning and versatility. It can also be reconfigured to perform sound or image classification, odor identification or to classify features extracted from data. Another advantage of local processing is that no images or data are exposed on the Internet, significantly reducing privacy risks.
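
On the on-chip learning point: one common way to add a new word or object without retraining the whole network is to keep the trained feature extractor fixed, store a prototype for the new class, and classify by nearest prototype. The sketch below shows that general idea only; it is NOT a description of how Akida's on-chip learning is actually implemented.

```python
import numpy as np

# Generic prototype-based incremental learning: store one averaged feature
# vector per word and classify by nearest prototype. Illustration only.
prototypes: dict[str, np.ndarray] = {}   # word -> stored feature vector

def learn_word(word: str, feature_vectors: list) -> None:
    prototypes[word] = np.mean(feature_vectors, axis=0)

def classify(features: np.ndarray) -> str:
    return min(prototypes, key=lambda w: np.linalg.norm(prototypes[w] - features))

rng = np.random.default_rng(1)
learn_word("alexa", [rng.normal(loc=1.0, size=16) for _ in range(5)])
learn_word("akida", [rng.normal(loc=-1.0, size=16) for _ in range(5)])
print(classify(rng.normal(loc=-1.0, size=16)))   # -> "akida"
```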

Applications for the technology range from voice-activated appliances to replacing worn-out components in manufacturing equipment.
The technology also could be used to determine tire wear based on the sound a tire makes on a road surface.

Other automotive applications include monitoring a driver’s alertness, listening to the engine to determine if maintenance is required and scanning for vehicles in the driver’s blind spot.

We expect Akida to evolve, incorporating the structures of the brain, particularly cortical neural networks aimed at artificial general intelligence (AGI).

This is a form of machine intelligence that can be trained to perform multiple tasks.

AGI technology can be used for controlling autonomous vehicles, with sufficient intelligence to control a vehicle and eventually learn to drive much like humans learn.
To be sure, there will be many intermediate steps along the way to that goal.


A future Akida device will include a more sophisticated neural network model that can learn increasingly complex tasks. Stay tuned.
— Peter AJ van der Made is the CTO of Brainchip.
That was a great explanation, Thanks Rayz
 
  • Like
Reactions: 10 users

Mugen74

Regular
  • Like
Reactions: 3 users

Deleted member 118

Guest
What a nice end to a long week; now time to catch up on all the shenanigans over at HC. Have a great, safe weekend everyone.


ABB178E9-20A7-4BB5-8061-05B413C64E64.jpeg
 
  • Like
Reactions: 33 users

Newk R

Regular
Call the dog sunny
I gave sunny a send off on HC. It was deleted of course. All I said was I hoped he'd shit his pants soon!!!!!;)
 
  • Like
  • Haha
Reactions: 15 users

Tothemoon24

Top 20
Hi chippers .
Great green day , I wasn’t expecting things to hold up so strongly for a Friday close .

Building a solid launching pad for the next price-sensitive Ann. Can't stop smiling these days.
 
  • Like
  • Fire
Reactions: 23 users

buena suerte :-)

BOB Bank of Brainchip
  • Like
Reactions: 15 users

JK200SX

Regular
I have mentioned it to Ken Scarince as a courtesy. I read TechLaden on HC this morning lamenting the number of threads and how hard it was for Tony Dawe to read them all. He then had a backhanded comment about the added reading that this new Brainchip Resort (my description) would add.

I am therefore wondering if we might be able to create an intelligent company-questions thread. Obviously there is that old saying that there are no stupid questions, but there are questions that new investors or explorers of Brainchip will have that we longer-term holders can answer. There also will be questions that Brainchip cannot and will not answer, such as who the NDA companies are.

But it must be possible (says the non-techie) to collate the questions and then have the forum transfer the intelligent ones into a thread that Tony Dawe could easily read and possibly even post answers to. It would need to be a place where Tony Dawe is completely safe from being trolled or ridiculed about the answers he provides. So it would be, say, zeebot as administrator posting the questions, with Tony Dawe being the only one who could post a reply.

(We could even have one for ManChildreborn gifs, as I am sure Tony Dawe would see them as a must not miss. LOL.)

Any thoughts?

My opinion only DYOR
FF

AKIDA BALLISTA


I don't know if anyone here frequents the Whirlpool forums but perhaps something like a FAQ/Knowledge Base may be useful?

 
Last edited:
  • Like
Reactions: 4 users

Newk R

Regular
I've been suspended on HC..permanently!!!!!😂
 
  • Like
  • Haha
  • Wow
Reactions: 32 users

Bloodsy

Regular
Cheers to Zeebot for making this forum on the fly! I'm SammyJB from HC, glad to be over here after the recent dumpster fire that HC has become….. Nice to see most people here (shout out Rocket, nice to see you still kicking around mate 😂)

I sure hope it’s a little less Sunny over here 🤣
 
  • Like
  • Haha
Reactions: 23 users

Touchstone

Emerged
Hi all. Many thanks to Zeebot for this move. Bev from HC here, occasional poster and long-term holder. Much better atmosphere; it's a joy to read. It will make it easier to contribute, I suspect.
 
  • Like
  • Love
Reactions: 17 users