AI_Inquirer
Regular
Apologies if this has already been shared or is not a recent update, but is the image on the Brainchip CES page showcasing the Onsemi demo a new addition?
Exactly what I tell the people in the German forum. But I guess this is the power of the brand. It's like buying the branded chocolate in the expensive supermarket and saying it's delicious, while you could buy the same chocolate in a cheaper supermarket with different branding and say it tastes a bit strange…
Hi All
I am loath to bother stating the obvious, because it will be attacked and lies will be spread far and wide for reasons that others need to judge, but what the heck:
Fact 1: Mercedes-Benz, with a $72 billion market cap, was reported in a magazine article as saying at CES 2022 that it was working with Brainchip, trialling AKIDA technology in a concept vehicle not intended for production, and made some positive comments. Before these magazine reports could be verified, the share price commenced to rise, peaking at $2.34 before commencing to fall.
Fact 2: OnSemi, Microchip and Infineon, with a combined $122.10 billion market cap, undertook joint demonstrations of their respective technologies working with Brainchip AKIDA technology at CES 2023, for a range of mass-consumption use cases, not concepts. Company representatives of each came on publicly released Brainchip podcasts, going permanently on record to confirm they were actually partnered for these demonstrations and speaking highly of the individual outcomes. Yet the Brainchip share price dropped to 16 cents.
My opinion only, so DYOR, but the logic of the above is difficult to understand.
Fact Finder
As stated before, I think, at the AGM.
This information needs to be absorbed by all who are complaining.
This is a link that should be opened and the full article read: "Dynamic Vision Sensor (DVS) and the Dynamic Audio Sensor (DAS)" (www.linkedin.com).
For those not on LinkedIn: View attachment 54375
My opinion only DYOR
Fact Finder
It's certainly a big step up, but the expectations were very different back then.
Dynamic Vision Sensor (DVS) and the Dynamic Audio Sensor (DAS)
Kailash Prasad, Design Engineer @ Arm | PMRF | IIRF | nanoDC Lab…
Published Jan 16, 2024
Have you ever wondered how the human eye and ear can process complex and dynamic scenes with such high speed and accuracy? Imagine if we could design artificial sensors that mimic the biological mechanisms of vision and hearing, and produce data that is more efficient and meaningful than conventional sensors.
In this post, I will introduce you to two types of neuromorphic sensors: the Dynamic Vision Sensor (DVS) and the Dynamic Audio Sensor (DAS).
These sensors are inspired by the structure and function of the retina and the cochlea, respectively, and use a novel paradigm of event-based sensing. Unlike conventional sensors that capture frames or samples at a fixed rate, event-based sensors only output data when there is a change in the input signal, such as brightness or sound intensity. This results in a stream of asynchronous events that encode the temporal and spatial information of the scene, with high temporal resolution, low latency, and high dynamic range.
- "In simpler terms, these special sensors work like our eyes and ears. They're designed based on the way our eyes' retinas and ears' cochleae function. But what sets them apart is their unique approach called event-based sensing. Unlike regular sensors that take pictures or recordings at a set speed, these event-based sensors only provide information when there's a change. Whether it's a shift in light or a change in sound, they only capture those moments. Instead of a constant flow of data, you get quick updates that show when and where things change. This gives you highly detailed and fast information about what's happening, with minimal delay and a wide range of details. It's like having sensors that focus on the important stuff, making them efficient and responsive."
The DVS is an imaging sensor that responds to local changes in brightness and outputs events that indicate the pixel address, the polarity (increase or decrease) of the brightness change, and the timestamp. The DVS can achieve a temporal resolution of microseconds, a dynamic range of 120 dB, and a low power consumption of 30 mW. The DVS can also avoid the motion blur and under/overexposure that plague conventional cameras. The DVS can be used for applications such as optical flow estimation, object tracking, gesture recognition, and robotics.
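As a hypothetical illustration of that event format (the field names here are my own, not a DVS vendor's), each event can be modelled as a small record and processed without ever reconstructing frames:

```python
from collections import namedtuple

# Hypothetical minimal DVS event: pixel address, polarity of the
# brightness change, and a microsecond timestamp.
Event = namedtuple("Event", ["x", "y", "polarity", "t_us"])

def split_polarity(events):
    """Separate ON (brightness increase) from OFF (decrease) events."""
    on = [e for e in events if e.polarity > 0]
    off = [e for e in events if e.polarity <= 0]
    return on, off

def event_rate(events):
    """Mean events per second over the stream's time span."""
    if len(events) < 2:
        return 0.0
    span_us = events[-1].t_us - events[0].t_us
    return len(events) / (span_us * 1e-6)

stream = [Event(10, 5, +1, 0), Event(10, 6, -1, 50), Event(11, 5, +1, 100)]
on, off = split_polarity(stream)
```

The point of the sketch is that downstream code works directly on the sparse, timestamped stream rather than on dense images.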
The DAS is an auditory sensor that mimics the cochlea, the auditory organ of the inner ear. The DAS takes stereo audio inputs and outputs events that represent the activity in different frequency ranges. The DAS can capture sound signals with a frequency range of 20 Hz to 20 kHz, a dynamic range of 60 dB, and a temporal resolution of microseconds. The DAS can also extract auditory features such as interaural time difference, harmonicity, and speaker identification.
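A very rough sketch of the per-frequency-band idea (a single-sinusoid correlator standing in for a real cochlear filter bank; all function names and numbers are assumptions for illustration):

```python
import math

def das_events(samples, rate_hz, bands, threshold=0.25):
    """Toy DAS model: each frequency band emits an event whenever its
    correlation with the input, over a short window, exceeds `threshold`."""
    audio_events = []  # (t_us, band_index)
    window = 64
    for start in range(0, len(samples) - window, window):
        chunk = samples[start:start + window]
        t_us = int(start / rate_hz * 1e6)
        for b, freq in enumerate(bands):
            # correlate the window with a sinusoid at the band frequency
            c = sum(s * math.sin(2 * math.pi * freq * (start + i) / rate_hz)
                    for i, s in enumerate(chunk)) / window
            if abs(c) > threshold:
                audio_events.append((t_us, b))
    return audio_events

# A pure 1 kHz tone should excite only the 1 kHz band.
rate = 16000
tone = [math.sin(2 * math.pi * 1000 * n / rate) for n in range(1024)]
audio_events = das_events(tone, rate, bands=[250, 1000, 4000])
```

Only the band matching the tone fires, so silence and unmatched bands produce no data at all.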
Both the DVS and the DAS are compatible with neuromorphic computing architectures, such as spiking neural networks, that can process the event data in a parallel and distributed manner. This enables low-power and real-time computation of complex tasks such as scene understanding, speech recognition, and sound localization.
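A minimal sketch of that event-driven style of computation, assuming a single leaky integrate-and-fire neuron rather than a full spiking network:

```python
import math

def lif_run(spike_times_us, weight=0.4, tau_us=10_000, v_thresh=1.0):
    """Toy leaky integrate-and-fire neuron driven by input events.
    The membrane potential decays exponentially between events, jumps by
    `weight` on each one, and crossing `v_thresh` emits an output spike."""
    v, last_t = 0.0, None
    out_spikes = []
    for t in spike_times_us:
        if last_t is not None:
            v *= math.exp(-(t - last_t) / tau_us)  # leak since last event
        v += weight
        if v >= v_thresh:
            out_spikes.append(t)
            v = 0.0  # reset after firing
        last_t = t
    return out_spikes

# A dense burst drives the neuron over threshold; sparse events decay away.
burst = lif_run([0, 100, 200])          # three events within 200 µs
sparse = lif_run([0, 50_000, 100_000])  # events 50 ms apart
```

Computation happens only when events arrive, which is what makes the processing activity-dependent and low-power.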
Some examples of recent products that use the DVS and the DAS are:
- The Prophesee Metavision Camera, which is a high-resolution DVS camera that can capture fast and complex motions with minimal data and power consumption.
- The Brainchip Akida Neuromorphic System-on-Chip, which is a low-power and scalable chip that can integrate multiple DVS and DAS sensors and perform event-based learning and inference.
For those not on LinkedIn
View attachment 54376 View attachment 54377 View attachment 54378 View attachment 54379
I only hope they weren't put off by all the childish spamming and the idiotic, relentless questions about Brainchip on social media etc... I really do hope that's not the case. I'm worried we have in fact become a meme stock... Just sit back and let management do their thing. I've said it from the start...
Edit: This is not directed at you, Bravo. Love your work. Mainly a certain "verification engineer"... all over the network.
Yes, I accept all of your reasoning, but what logic saw investors find a reason to sell off and drop the price to 16 cents?
It's certainly a big step up, but the expectations were very different back then.
Mercedes was the first prestigious company that actually used the chip for something and told the masses about it on the big stage when announcing their concept car. Back then revenue was still far away, and the announcement caught every one of us off guard. Today, given the statements made by management, our expectations are lots of IP licences along with revenue growth. So these demos and confirmations are certainly nice to have, but they really don't matter much to the vast majority of investors. The share price reflects what we expect, and that's not what we're getting right now.
I'm positive nevertheless, and these demos show that it's just a matter of time.
OK, so I said, "I'm VERY disappointed that the share price has gone nowhere after the CES," to which Wiltzy replied, "Where was it supposed to go? LOL. Perhaps I am the naive one, and you will enlighten me as to where it should have gone and why."
Fullmoonfever has clarified, and I believe he's correct: CES is a trade show, and that's where I was wrong, Wiltzy.
NOW, trying to be constructive, I still believe an opportunity was missed at CES.
I suggest that at future shows like CES, we could have the best of both worlds.
Have a person or persons demoing, and another discussing investing in us. Set the booth up clearly between the two, and I reckon some visitors would see the demo, go off thinking, then, realising the potential and having subliminally noticed the investing idea, come back for info. We could also be a bit naughty and employ a crowd enthusiastically jostling and calling out "How can I invest?"
I guess most investors expect big announcements during these trade shows, but honestly I cannot explain it.
What was the bad news that caused the loss of confidence?
Your thinking and mine align, except that I am not asking why the price did not explode and go to $2.34 and beyond, but why the engagement with these three companies is considered a negative.
****************
By the way, Kailash Prasad, Design Engineer @ Arm, in the following confirms the statements by the representative of Infineon at CES 2024 that what sets AKIDA apart is its capacity to scale and fuse multiple inputs while providing low-powered inference:
“The Brainchip Akida Neuromorphic System-on-Chip, which is a low-power and scalable chip that can integrate multiple DVS and DAS sensors and perform event-based learning and inference”
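Purely as a schematic illustration of what integrating multiple event-based sensors can mean (my own sketch, not BrainChip's actual mechanism), fusing two streams can be as simple as a timestamp-ordered merge that a downstream event processor then consumes:

```python
import heapq

def fuse_streams(dvs_events, das_events):
    """Time-order two event streams and tag each event with its modality.
    Each input is a list of (timestamp_us, data) tuples already sorted by
    time; the output interleaves them for a downstream event processor."""
    dvs = ((t, "dvs", d) for t, d in dvs_events)
    das = ((t, "das", d) for t, d in das_events)
    return list(heapq.merge(dvs, das, key=lambda e: e[0]))

dvs = [(10, (3, 7, +1)), (400, (3, 8, -1))]   # (t_us, (x, y, polarity))
das = [(120, 2), (300, 5)]                    # (t_us, band_index)
fused = fuse_streams(dvs, das)
```

Because both sensors already speak the same sparse, timestamped language, fusion needs no resampling or frame alignment.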
My opinion only DYOR
Fact Finder
So while CES 24 was saying "Look over here!", the real action was going on at the VLSI Design Conference.