BRN Discussion Ongoing

At HC, anyone who suggested there were such things as fake posters and paid posters from the finance industry was ridiculed at length by a group of individuals who either were part of the cabal or genuinely believe in the innate goodness of people. The following linked article, originally from the Washington Post, might give us all cause to think that on every occasion, here and elsewhere, the idea of Doing Your Own Research, like AKIDA, is ESSENTIAL:


My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 14 users

Mt09

Regular
Reactions: 1 users
Fact Finder

I have mentioned my memory before and how it plagues my life, but it was a valuable asset in my past careers.

Well, that pesky memory thing is at play again, and so in the interest of fairness and balance I thought I should give the final word to Milo:

“2. Akida is an AI chip and I think here he means that Akida can be used to process all the sensor data of the car. That doesn't mean Akida can outperform a GPU in areas the GPU's are good at. Do you think Akida can replace a GPU and able to handle the graphics you see in modern cars? Can Akida PCIe board replace a GPU and run modern games? If that were the case our Market cap won't be this by now.” - Milo

So, fellow shareholders, as Milo has stated, with AKIDA being able to replace GPUs, the current market cap is nowhere near what it should be, and Milo agrees with me that Brainchip is ridiculously oversold and underpriced.

My opinion only DYOR
FF

AKIDA BALLISTA
It is quite strange how this orange highlighted line, "If that were the case our Market cap won't be this by now", picks up echoes of the past, when one of the WANCAs said "The way I look at it if it really worked then it would not be listed on the ASX" (not intended as an exact quote, as it is simply from memory).

Just a coincidence of course, because this fellow was a financial advisor, not an electrical engineer.

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 14 users

alwaysgreen

Top 20
Yes Quadric but I remember someone mentioning a while back that they thought Quadric may be using Akida in their products. I think it was @Stable Genius

Not sure whatever came of that.
 
Reactions: 3 users

wilzy123

Founding Member
this fellow was a financial advisor
 
Reactions: 5 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
SAY WHAT?!!!!!




Just wanted to revisit this Samsung announcement again. In another article I noticed it says Samsung will "collaborate with global partners to implement industry-leading central processing unit (CPU) and graphics processing unit (GPU) at the same time. They are also planning to develop an ultra-high-pixel image sensor close to the human eye and a sensor that detects and implements the five senses."

I highlighted "a sensor" (singular) because it indicates they're intending to develop a sensor that is capable of processing all five sensor modalities, which is obviously not the same as developing different sensors to process different senses.

Then I remembered there was a Rob Telson podcast (link below) in which he describes why Akida is so unique in its ability in this regard.

Does anyone know of any other processor that is commercially available which is capable of mimicking the 5 senses like AKIDA?









 
Reactions: 45 users

Sirod69

bavarian girl ;-)
we will see what our own Rob Telson will say
 
Reactions: 20 users

Slade

Top 20
we will see what our own Rob Telson will say
Exciting times. Tip of the iceberg. If you could be any superhero who would you be and why?
 
Reactions: 15 users

BaconLover

Founding Member


I'd be this superhero because I love football.
 
Reactions: 23 users

Sam

Nothing changes if nothing changes
Reactions: 11 users

Sirod69

bavarian girl ;-)

because I love Cats
 
Reactions: 9 users

mrgds

Regular
Exciting times. Tip of the iceberg. If you could be any superhero who would you be and why?
And don't forget, ..................... " oh, great question " ....................:rolleyes:
 
Reactions: 3 users

VictorG

Member
I used to be a superhero, then I retired. 🦸‍♂️🦸‍♂️🦸‍♂️

"The older I get, the more clearly I remember things that never happened." - Mark Twain
 
Reactions: 18 users

jtardif999

Regular
It’s funny, nothing related to brn, but there were companies in the dot-com crash that did make it out and are worth billions at the moment
I tend to think that inflationary pressures suppressing industry, and the demise of Argo, may well play into BRN's hands in the longer run. I see an opportunity for the right technology in the right place at the right time, and Akida is the right technology imo: it's cheap, it scales well as IP, and it's the only real edge choice atm.

I think that with a suppressed market and the need to keep moving ADAS and level 3 automation ahead, the industry will gravitate towards a modular autonomous solution of the necessary components making up what is required. This has happened with software many times in the past, and now it will be forced upon software-defined car manufacturers to reduce the amount of reinventing they do in development in favour of purchasing more generic modular solutions, to speed up adoption.

Why do I think Akida will be part of this? Not because I'm a biased shareholder, but because Akida scales and will be cheaper and faster to implement. I think that eventually there will be maybe 2 or 3 major autonomous vehicle platforms adopted by the whole of industry, and each will utilise Akida technology, fulfilling the company's ambition for Akida to be the de facto standard for car edge AI. AIMO.
 
Reactions: 26 users

SERA2g

Founding Member
Exciting times. Tip of the iceberg. If you could be any superhero who would you be and why?
Yeh, he needs to lose the superhero thing.

If he wants an ice breaker it should be at the start of the podcast, not at the end, and shouldn’t be as lame. Lol.

This is not financial advice.
 
Reactions: 13 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
I'd be Julia Gillard so I could go around reciting the misogyny speech to ding-bats like Izzzzy all day.




 
Reactions: 32 users
Fact Finder

Just wanted to revisit this Samsung announcement again. In another article I noticed it says Samsung will "collaborate with global partners to implement industry-leading central processing unit (CPU) and graphics processing unit (GPU) at the same time. They are also planning to develop an ultra-high-pixel image sensor close to the human eye and a sensor that detects and implements the five senses."

I highlighted "a sensor" (singular) because it indicates they're intending to develop a sensor that is capable of processing all five sensor modalities, which is obviously not the same as developing different sensors to process different senses.

Then I remembered there was a Rob Telson podcast (link below) in which he describes why Akida is so unique in its ability in this regard.

Does anyone know of any other sensor that is commercially available which is capable of mimicking the 5 senses like AKIDA?








Hi Bravo
Love your work, but remember, as Brainchip states, "We don't make sensors, we make them smart." AKIDA is a processor. A GPU is a processor. A CPU is a processor. None of them are sensors, but sensors need something to process what they sense and make it intelligible to humans.

Now you can use multiple GPUs or multiple CPUs to process the data coming from five sensors or more and send it somewhere else to be fused into a meaningful action, or you can use something that takes in multiple streams of different sensory data, fuses that data on chip and gives you the meaningful action close to the sensors.

By coincidence, this ability to take multiple sensor inputs, fuse them on chip close to the sensor and produce meaningful action is something AKIDA technology IP provides. AKIDA IP, however, will not be the sensor itself.

Remember the interview with Luca Verre, CEO of Prophesee, where he described building their event-based sensor but knowing that it was only half the story unless they could find someone with an event-based processor.

Intel did not have it.

SynSense did not have it.

But then,

'One enchanted evening,
Then Luca found AKIDA,
And somehow he knew,
With AKIDA he'd be sensing,
And he made it his own'.

(sung to the tune Some Enchanted Evening from South Pacific)

My opinion only DYOR
FF

AKIDA BALLISTA
 
Reactions: 40 users

Sirod69

bavarian girl ;-)
Get ready for an exciting discussion of the latest trends in #AI: this time, you can learn from us and Arm about the capabilities of the Ethos-U65 microNPU combined with Cortex-A55 CPUs on NXP's new i.MX 93 applications processor! 🤓 Register today and join us on November 8th at 5 pm (CET):

 
Reactions: 20 users

Diogenese

Top 20

What do electrical engineers do?​



Harness the Power of Electricity​

Electrical engineers create, design and manage electricity to help power the world. They are problem-solvers who study and apply the physics and mathematics of electricity, electromagnetism and electronics to both large- and small-scale systems to process information and transmit energy. Electrical engineers work with all kinds of electronic devices which transform society, from the smallest pocket devices to large power stations and supercomputers.
At UNSW School of Electrical Engineering and Telecommunications, we help our students learn through a combination of design and lab work. This mix of theory and practical application helps students visualise concepts and apply their ideas in real-life situations. Students learn to do what an electrical engineer does day-to-day: analyse and diagnose a problem and develop an innovative solution.

Electrical Engineering Industries​

Electrical engineers mostly work with large-scale electrical systems such as motor control and power generation and transmission. They use a diverse range of technologies, from the lighting and wiring of buildings, to design of household appliances, telecommunication systems, electrical power stations and satellite communications. In the emerging field of microelectronics, electrical engineers design or develop electrical systems and circuits in computers and mobile devices.
Graduates however aren’t just limited to these industries. Our degrees are structured in ways that encourage analytical thinking, help master time management and ensure students are technically proficient. Because of this, electrical engineers from UNSW are in high demand even in areas such as:
  • Renewable energy
  • Global Positioning System (GPS) technologies
  • Mobile networking
  • Banking
  • Finance
  • Arts
  • Management
  • Consulting”
Electrical Engineers are pretty impressive people don’t you think?

They are clearly highly intelligent and capable of in-depth research in the above-stated areas, and would be tuned in to a range of online professional services where they could access all sorts of technical and scientific information that the lay person would not have access to.

Fact Finder, a recognised technophobe and retired lawyer with no background in the sciences beyond high school, only has access to what he finds online and what Brainchip releases, and of course to a brilliant retired consulting engineer, @Diogenese.

You might think, therefore, that a brilliant electrical engineer of the type described above by the University of NSW could do a whole lot better by this group than simply stating “I don’t believe you” as a rebuttal to Fact Finder, who is doing no more than parroting back what has been publicly stated by others similarly skilled in the art as the electrical engineer.

Now of course, as Fact Finder has never been privy to an intellectual debate between electrical engineers skilled in the art of neuromorphic computing, perhaps this is how it is done.

One states a theory, the other responds “I don’t believe you”, the first says “OK, I must be wrong”, and then they all go to the pub.

My error in this has been to cause Fact Finder to believe such arguments would be more like legal debates, where one lawyer states a proposition put forward by a superior court and the other lawyer responds with a contrary position supported by another superior court. But of course this is the law and not science.

The thing that we all know is that on the Brainchip website there is free access to MetaTF, which allows you to explore the AKIDA technology revolution in the privacy of your own electrical engineering workshop, and also to direct questions to Brainchip engineers.

The other thing we all know is that, though the AKD1000 chip was not available in 2019, the AKIDA IP was released to select early access customers from around July 2019 and was being implemented in an FPGA for internal purposes.

We also know that lying around in Peter van der Made’s lab were BrainChip Studio and the BrainChip Accelerator, both being earlier iterations of the AKIDA technology.

Though not skilled in the art, I think it is reasonable to accept that Peter van der Made, the inventor of the AKIDA technology, had access to sufficient material to make statements about what his AKIDA technology could do.

So it might be thought that the blistering response from the electrical engineer that ‘na na’ he did not have AKD1000 in 2019 fails to stand up to even cursory examination.

My opinion only DYOR
FF

AKIDA BALLISTA
PvdM has been working on SNNs since at least 2008, so I suspect that by 2019 he would have begun to get a glimmer of understanding of its capabilities.

https://brainchip.com/brainchip-releases-client-server-interface-tool-for-snap-technology/

BrainChip releases client server interface tool for SNAP technology 15.03.2016

...
The SNAP neural network learns features that exist in the uploaded data, even when they are not distinguishable by human means. Autonomous machine learning has long been an elusive target in computer science. Recursive programs are cumbersome and take a long time to process. BrainChip has accomplished rapid autonomous machine learning in its patented hardware-only solution by replicating the learning ability of the brain, by re-engineering the way neural networks function, and by creating a new way of computing culminating in the SNAP technology.

It is possible to trace the development of Akida through the BrainChip patents, listed here:

https://worldwide.espacenet.com/patent/search/family/070458523/publication/US11468299B2?q=pa = "brainchip"

This is a US patent derived from PvdM's first NN patent application:

US10410117B2 Method and a system for creating dynamic neural function libraries: Priority 20080921



A method of creating a reusable dynamic neural function library for use in artificial intelligence, the method comprising the steps of:
sending a plurality of input pulses in form of stimuli to a first artificial intelligent device, where the first artificial intelligent device includes a hardware network of reconfigurable artificial neurons and synapses;
learning at least one task or a function autonomously from the plurality of input pulses, by the first artificial intelligent device;
generating and storing a set of control values, representing one learned function, in synaptic registers of the first artificial intelligent device;
altering and updating the control values in synaptic registers, based on a time interval and an intensity of the plurality of input pulses for autonomous learning of the functions, thereby creating the function that stores sets of control values, at the first artificial intelligent device; and
transferring and storing the function in the reusable dynamic neural function library, together with other functions derived from a plurality of artificial intelligent devices, allowing a second artificial intelligent device to reuse one or more of the functions learned by the first artificial intelligent device.
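
Purely to illustrate what that claim is describing, here is a minimal Python sketch (my own toy names, not BrainChip code) of a "reusable dynamic neural function library": one device learns, snapshots its synaptic-register control values as a named function, and a second device loads those values and reuses the learned function without retraining.

```python
# Toy illustration of the "reusable dynamic neural function library" concept
# in the claim above. All names are hypothetical; this is not BrainChip code.
from dataclasses import dataclass, field


@dataclass
class LearnedFunction:
    """Snapshot of synaptic-register control values for one learned task."""
    name: str
    control_values: list


@dataclass
class FunctionLibrary:
    functions: dict = field(default_factory=dict)

    def store(self, fn: LearnedFunction) -> None:
        self.functions[fn.name] = fn

    def load(self, name: str) -> LearnedFunction:
        return self.functions[name]


@dataclass
class Device:
    """Stand-in for an AI device with reconfigurable synaptic registers."""
    synaptic_registers: list = field(default_factory=lambda: [0] * 8)

    def learn_from_pulses(self, pulses: list) -> None:
        # Crude stand-in for autonomous learning: each register accumulates
        # pulse intensity, clipped to a 4-bit register range.
        for i, p in enumerate(pulses[: len(self.synaptic_registers)]):
            self.synaptic_registers[i] = min(15, self.synaptic_registers[i] + p)

    def export_function(self, name: str) -> LearnedFunction:
        return LearnedFunction(name, list(self.synaptic_registers))

    def import_function(self, fn: LearnedFunction) -> None:
        self.synaptic_registers = list(fn.control_values)


# Device A learns a task and exports it; device B reuses it without retraining.
library = FunctionLibrary()
device_a, device_b = Device(), Device()
device_a.learn_from_pulses([3, 0, 5, 1, 0, 2, 4, 0])
library.store(device_a.export_function("keyword_detect"))
device_b.import_function(library.load("keyword_detect"))
print(device_b.synaptic_registers)  # same control values as device A
```

The real claim is of course about hardware synaptic registers and autonomous on-chip learning; the sketch only shows the store-and-reuse idea.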

... and this is the key patent which was granted recently:

US11468299B2 Spiking neural network: Priority 20181101



A system, method, and computer program product embodiments for an improved spiking neural network (SNN) configured to learn and perform unsupervised extraction of features from an input stream. An embodiment operates by receiving a set of spike bits corresponding to a set synapses associated with a spiking neuron circuit. The embodiment applies a first logical AND function to a first spike bit in the set of spike bits and a first synaptic weight of a first synapse in the set of synapses. The embodiment increments a membrane potential value associated with the spiking neuron circuit based on the applying. The embodiment determines that the membrane potential value associated with the spiking neuron circuit reached a learning threshold value. The embodiment then performs a Spike Time Dependent Plasticity (STDP) learning function based on the determination that the membrane potential value of the spiking neuron circuit reached the learning threshold value.
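
To make the mechanics in that abstract a little more concrete for the non-engineers, here is a deliberately simplified Python sketch of the sequence it describes: AND each incoming spike bit with its binary synaptic weight, accumulate a membrane potential, and when the learning threshold is reached apply an STDP-style update. This is my own toy reading of the claim language, not the patented circuit, and the update rule is only a crude stand-in for real STDP.

```python
# Toy model of the sequence in the US11468299B2 abstract: AND spike bits with
# binary synaptic weights, accumulate a membrane potential, and run a
# simplified STDP-style update at the learning threshold. Illustrative only.
import random


class SpikingNeuronCircuit:
    def __init__(self, n_synapses: int, learning_threshold: int):
        # Binary synaptic weights: 1 = connected, 0 = unused.
        self.weights = [random.randint(0, 1) for _ in range(n_synapses)]
        self.learning_threshold = learning_threshold
        self.membrane_potential = 0

    def integrate(self, spike_bits: list) -> None:
        # Logical AND of each spike bit with its synaptic weight;
        # every coincidence increments the membrane potential.
        for bit, weight in zip(spike_bits, self.weights):
            self.membrane_potential += bit & weight

    def maybe_learn(self, spike_bits: list) -> bool:
        if self.membrane_potential < self.learning_threshold:
            return False
        # Crude STDP stand-in: connect synapses that carried a spike in this
        # learning event, and free up one connected synapse that did not.
        for i, bit in enumerate(spike_bits):
            if bit:
                self.weights[i] = 1
        idle = [i for i, bit in enumerate(spike_bits) if not bit and self.weights[i]]
        if idle:
            self.weights[random.choice(idle)] = 0
        self.membrane_potential = 0  # reset after the learning event
        return True


neuron = SpikingNeuronCircuit(n_synapses=8, learning_threshold=3)
spikes = [1, 0, 1, 1, 0, 0, 1, 0]
neuron.integrate(spikes)
print(neuron.maybe_learn(spikes), neuron.weights)
```

The point is only the order of operations the abstract spells out; in the patent these steps belong to the spiking neuron circuit itself rather than to software.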

This one is for detecting partially obscured objects, quite handy in the real world:

US11151441B2 System and method for spontaneous machine learning and feature extraction: Priority 20170208



an artificial neural network system for improved machine learning, feature pattern extraction and output labeling. The system comprises a first spiking neural network and a second spiking neural network. The first spiking neural network is configured to spontaneously learn complex, temporally overlapping features arising in an input pattern stream. Competitive learning is implemented as Spike Timing Dependent Plasticity with lateral inhibition in the first spiking neural network. The second spiking neural network is connected with the first spiking neural network through dynamic synapses, and is trained to interpret and label the output data of the first spiking neural network. Additionally, the output of the second spiking neural network is transmitted to a computing device, such as a CPU for post processing.
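
For readers unfamiliar with the jargon, "competitive learning ... implemented as Spike Timing Dependent Plasticity with lateral inhibition" roughly means that the neurons in the first network compete for each input pattern, and only the winner fires and adapts, so different neurons end up specialising on different features. Below is a generic, non-spiking winner-take-all sketch of that competition in Python; it is textbook material with names of my own choosing, not the patent's circuit.

```python
# Generic winner-take-all competition with a simple competitive weight update.
# Textbook-style illustration of "lateral inhibition" only; not the patented design.


def winner_take_all(potentials: list) -> int:
    """Index of the single neuron allowed to fire (the lateral-inhibition winner)."""
    return max(range(len(potentials)), key=lambda i: potentials[i])


def competitive_step(weights: list, inputs: list, lr: float = 0.1) -> int:
    # Each neuron's potential is its weighted sum of the inputs.
    potentials = [sum(w * x for w, x in zip(neuron_w, inputs)) for neuron_w in weights]
    winner = winner_take_all(potentials)
    # Only the winner learns: its weights move toward the input pattern,
    # so repeated patterns get "claimed" by particular neurons.
    weights[winner] = [w + lr * (x - w) for w, x in zip(weights[winner], inputs)]
    return winner


# Two neurons competing over a 4-element input pattern.
w = [[0.2, 0.1, 0.0, 0.3], [0.0, 0.4, 0.1, 0.1]]
print(competitive_step(w, [1.0, 0.0, 0.0, 1.0]))  # index of the winning neuron
```

In the patented system the competition happens between spiking neurons via STDP, and the second network is then trained to interpret and label whatever features the first network has settled on.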


Accurate detection of objects is a challenging task due to lighting changes, shadows, occlusions, noise and convoluted backgrounds. Principal computational approaches use either template matching with hand-designed features, reinforcement learning, or trained deep convolutional networks of artificial neurons and combinations thereof. Vector processing systems generate values, indicating color distribution, intensity and orientation from the image. These values are known as vectors and indicate the presence or absence of a defined object. Reinforcement learning networks learn by means of a reward or cost function. The reinforcement learning system is configured to either maximize the reward value or minimize the cost value and the performance of the system is highly dependent on the quality and conditions of these hand-crafted features.

Deep convolutional neural networks learn by means of a technique called back-propagation, in which errors between expected output values for a known and defined input, and actual output values, are propagated back to the network by means of an algorithm that updates synaptic weights with the intent to minimize the error.

The Deep Learning method requires millions of labelled input training models, resulting in long training times, and clear definition of known output values.

However, these methods are not useful when dealing with previously unknown features or in the case whereby templates are rapidly changing or where the features are flexible. The field of neural networks is aimed at developing intelligent learning machines that are based on mechanisms which are assumed to be related to brain function. U.S. Pat. No. 8,250,011 [BrainChip] describes a system based on artificial neural network learning. The system comprises a dynamic artificial neural computing device that is capable of approximation, autonomous learning and strengthening of formerly learned input patterns. The device can be trained and can learn autonomously owing to the artificial spiking neural network that is intended to simulate or extend the functions of a biological nervous system. Since, the artificial spiking neural network simulates the functioning of the human brain; it becomes easier for the artificial neural network to solve computational problems.

[0003] US20100081958 (Lapsed) [Florida Uni] describes a more advanced version of machine learning and automated feature extraction using neural network. US20100081958 is related to pulse-based feature extraction for neural recordings using a neural acquisition system. The neural acquisition system includes the neural encoder for temporal-based pulse coding of a neural signal, and a spike sorter for sorting spikes encoded in the temporal-based pulse coding. The neural encoder generates a temporal-based pulse coded representation of spikes in the neural signal based on integrate-and-fire coding of the received neural signal and can include spike detection and encode features of the spikes as timing between pulses such that the timing between pulses represents features of the spikes.

[0004] However, the prior art do not disclose any system or method which can implement machine learning or training algorithm and can autonomously extract features and label them as an output without implementing lengthy training cycles. In view of the aforementioned reasons, there is therefore a need for improved techniques in spontaneous machine learning, eliminating the need for hand-crafted features or lengthy training cycles. Spontaneous Dynamic Learning differs from supervised learning in that known input and output sets are not presented, but instead the system learns from repeating patterns (features) in the input stream.

US2010081958A1 PULSE-BASED FEATURE EXTRACTION FOR NEURAL RECORDINGS relates to neurophysiology; I guess it's to do with those skull-cap neuron detectors, which shows the depth of PvdM's research.



It is interesting that US20100081958 (Lapsed) [Florida Uni] is cited for its discussion of spike sorting because BrainChip's US11468299B2 Spiking neural network also has a spike sorting system:




It doesn't look like the Florida Uni document discloses anything like PvdM's spike sorter.




Sorry, I seem to have wandered down a different rabbit hole from the one I started in ...


... oh yes, does PvdM know the capabilities of Akida?

I think even he will be astonished when he gets the cortex sorted out and produces AGI, or maybe just be hugely proud of his achievements.
 
Reactions: 46 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Hi Bravo
Love your work, but remember, as Brainchip states, "We don't make sensors, we make them smart." AKIDA is a processor. A GPU is a processor. A CPU is a processor. None of them are sensors ...
Whooopsies! 🥴 Thanks for pointing this out FF! I have edited my post to make it more accurate.
 
Reactions: 7 users