BRN Discussion Ongoing

Hey GUYS

I'M BACK. long story and i have so much catching up to do. But all is well now and I'm back :) If someone can summarise in a nutshell what i missed :cool:
 

Deleted member 118

Guest
Just thought I’d pop in and say

 
Post in thread 'BRN Discussion Ongoing' https://thestockexchange.com.au/threads/brn-discussion-ongoing.1/post-225240
Link to PDFs regarding the LinkedIn post as you won't be able to download them from linkedin
And optimal time to top up is shortly after @HopalongPetrovski pulls the trigger.😉
 

buena suerte :-)

BOB Bank of Brainchip
Hey GUYS

I'M BACK. long story and i have so much catching up to do. But all is well now and I'm back :) If someone can summarise in a nutshell what i missed :cool:
What Mugen74 said !! :love: And ........Intel......... love us :cool:
 
Hi Jesse,

It's next year!

https://www.prophesee.ai/2022/06/20/brainchip-partners-with-prophesee/
Laguna Hills, Calif. – June 14, 2022 – BrainChip Holdings Ltd (ASX: BRN, OTCQX: BRCHF, ADR: BCHPY), the world’s first commercial producer of neuromorphic AI IP, and Prophesee, the inventor of the world’s most advanced neuromorphic vision systems, today announced a technology partnership that delivers next-generation platforms for OEMs looking to integrate event-based vision systems with high levels of AI performance coupled with ultra-low power technologies.
...
“We’ve successfully ported the data from Prophesee’s neuromorphic-based camera sensor to process inference on Akida with impressive performance,” said Anil Mankar, Co-Founder and CDO of BrainChip. “This combination of intelligent vision sensors with Akida’s ability to process data with unparalleled efficiency, precision and economy of energy at the point of acquisition truly advances state-of-the-art AI enablement and offers manufacturers a ready-to-implement solution.”

“By combining our Metavision solution with Akida-based IP, we are better able to deliver a complete high-performance and ultra-low power solution to OEMs looking to leverage edge-based visual technologies as part of their product offerings,” said Luca Verre, CEO and co-founder of Prophesee.



Back in 2021, Qualcomm thought their new Snapdragon would include Nuvia/ARM refinements:

https://www.pcworld.com/article/552...3-as-the-rebirth-of-its-snapdragon-chips.html

Qualcomm prophesizes 2023 as the rebirth of PC Snapdragon chips​

Nuvia CPUs and desktop gaming graphics? Qualcomm thinks its future is bright.
Mark Hachman

By Mark Hachman
Senior Editor, PCWorld NOV 16, 2021 10:30 AM PST

Qualcomm processors for PCs enhanced by the company’s Nuvia design team will sample in 2022 for devices shipping in 2023, Qualcomm executives said Tuesday. The company also boldly pledged to offer Adreno graphics that could compete with desktop PCs.

At the company’s 2021 investor day in New York, Dr. James Thompson, chief technology officer at Qualcomm, offered an overview of the company’s technology roadmap in several areas. A key focus, naturally, will be how and when Qualcomm’s Snapdragon processors will integrate the Nuvia design team, an Arm CPU developer that Qualcomm acquired in January.

Processor development takes time, however, and that integration won’t happen immediately. “They’re pretty far along at this point,” Thompson said, presumably talking about the first Snapdragon processors featuring Nuvia technology. “We’ll be sampling a product nine months from now, or something like that.”
...
Thompson also claimed that the company’s graphics technology was on pace to improve, too. In terms of the Adreno integrated graphics core onboard the Snapdragon chips, Qualcomm performs somewhat better against the competition than its CPUs at present—somewhere between an 8th-gen and a 10th-gen Intel Core processor, when measured by the 3DMark “Night Raid” benchmark.
Thompson, though, said that Qualcomm could do better. “I just want to make it clear that our graphics will scale up to desktop-style gaming capabilities,” he told investors. He didn’t elaborate further.


https://en.wikipedia.org/wiki/Adreno
Adreno (an anagram of AMD's graphic card brand Radeon), was originally developed by ATI Technologies and sold to Qualcomm in 2009 for $65M,[1][2] and was used in their mobile chipset products. Early Adreno models included the Adreno 100 and 110, which had 2D graphics acceleration and limited multimedia capabilities. At the time, 3D graphics on mobile platforms were commonly handled using software-based rendering engines, which limited their performance. With growing demand for more advanced multimedia and 3D graphics capabilities, Qualcomm licensed the Imageon IP from AMD, in order to add hardware-accelerated 3D capabilities to their mobile products.[3] Further collaboration with AMD resulted in the development of the Adreno 200, originally named the AMD Z430, based on the R400[4] architecture used in the Xenos GPU of the Xbox 360 video game console[5] and released in 2008, which was integrated into the first Snapdragon SoC. In January 2009, AMD sold their entire Imageon handheld device graphics division to Qualcomm.[6]

So Qualcomm use an in-house graphics core, which was initially designed by Imhotep.

Given Qualcomm's legal bust-up with ARM, I would think they would be scrambling to cover their embarrassment, and it's possible that Prophesee may be the link to Akida.
Why, when I look at Mark's profile pic, do I get a Zoolander vibe...


Bravo

If ARM was an arm, BRN would be its biceps💪!
I'm back too - from my toilet break.


PS: Great to have you back @MDhere!
 

JB49

Regular
@chapman89 and @Diogenese

I’m off work with an injured back at the moment so I have a bit of time up my sleeve.

I'm watching the rest of the Qualcomm video. It gets interesting at the 50:12 mark, when he starts talking about "always on" reading just ones and zeros, and a neural network. He talks about support for third-party neural network experiences.

:)




OK well this one definitely fills me with confidence!! 😂😂
 

HUSS

Regular
Hi JB

It's great that Mercedes fills you with confidence!

For what it's worth, I used Markus Schafer's comments about there being 'a long way to go' as context, because they frame Heinrich Gotzig's response: "Thanks a lot for this very interesting article. I can confirm that it is a long way to go but very promising."

These comments don't diminish Valeo's progress in commercialising a product containing Akida. They simply confirm that current neuromorphic technologies are still a long way from a 'brain on a chip' comparable to the human brain, which contains roughly 100 billion neurons and runs on about 20 watts of power.

Cheers!
Yes, 100% correct. A full brain on a chip is still a few years away, of course, but BrainChip has already started paving the way and has already been successful. This has been confirmed by many industry leaders, and we are a minimum of three years ahead of them! But that doesn't mean Valeo and MB aren't already using our Akida technology, which contains a fraction of a brain, across many industries: automotive, IoT, medical, robotics, industrial.

You don't need to wait for a full brain on a chip to start using the technology! We already have many examples and use cases proven and shown by our partners. I can't imagine what the use cases will be when we do have a full brain on a chip! Plus, BRN is already in that technology race, and one of its leaders too, because it takes a lot of work, effort and collaboration with our ecosystem partners.

Cheers
 

Just in case you missed it!




I’m hoping that’s my early retirement right there!

The use case example he gave walked and quacked like a duck.

@Diogenese

:)

Edit: of course it is just speculation. Until there is word from the company to confirm it, it's just a wild theory from an anonymous BrainChip investor with a bias. Given there was an opportunity to promote the new technology during this presentation and BrainChip wasn't mentioned, either they're using their own technology or they want to keep the magic secret to themselves!

So long as the money shows up in the quarterly, I'm not fazed how it gets there!
 

Of course, a simple Google search revealed Qualcomm has its own neural software:



So I could be guilty of jumping the gun on this one and providing poorly researched information!

Sorry. :(


Edit: I might blame my pain meds!
 

But I just quit my job… what the….

🤣😂

Also tomorrow is Friday. Get some lube ready for the usual reaming 😱😎🤣
 

toasty

Regular
I think what they have been using previously is software-based CNNs. I'm in the camp that says this looks like AKIDA. If it's not, then there may be patent infringements here!!! Always on, reacts only to change events, super-low power consumption, an AI processor shown in the block diagram, and the camera module is from our partner Prophesee.........this definitely walks like a duck.........

My mental ramblings only.....DYOR and make up your own mind.
 
Jeez. I put a massive order in just after close, and now my internet is down, mobile phone and data are not working due to a statewide brownout, and the landline is munted. 🤯
If you're wondering how I sent this post.... keep wondering 😂
 

Bravo

If ARM was an arm, BRN would be its biceps💪!
BrainChip is a partner of VVDN Technologies, and Qualcomm's qSmartAI80_CUQ610 AI vision kit was developed in partnership with VVDN Technologies.

Just saying...





It’s a tangled web. At least we know they are all aware of us through our partners. VVDN is a great company to be involved with!



Bravo

If ARM was an arm, BRN would be its biceps💪!
Hang on everyone! Put the cork back in the Bollingers.

I found this article from April 2022, which discusses Prophesee's collaboration with iCatch on a new event-based imaging sensor. It repeats the same info as described in Qualcomm's presentation, as well as the slide I posted earlier. It states: "From a hardware perspective, the new sensor appears to be very impressive, offering specs such as a >10k fps time-resolution equivalent, >120 dB dynamic range, and a 3nW/event power consumption. From a compute perspective, Metavision promises anywhere from 10 to 1000x less data than frame-rate-based solutions, with a throughput of >1000 obj/s, and a motion period irregularity detection of 1%."


Event-based Vision Sensor—“Metavision”—Promises Impressive Specs

April 20, 2022 by Jake Hertz

PROPHESEE, along with iCatch, have teamed up to provide the industry with the "world's first event-based vision sensor" in a 13 x 15 mm mini PBGA package.


As computer vision applications gain more and more momentum, companies are continually investing in new ways to improve the technology. At the same time, as pure imaging improves, power efficiency and data management become significant challenges on the hardware level.

An example computer vision block diagram. Image used courtesy of Cojbasic et al


One proposed solution to this challenge is ditching conventional imaging techniques in favor of event-based vision. Aiming to capitalize on this type of technology, this week, PROPHESEE, in collaboration with iCatch, released a new event-based vision sensor that boasts some impressive specs.
This article will discuss the concept of event-based vision, the benefits it offers, and dissect PROPHESEE’s newest offering.

Challenges in Conventional Vision​

One of the significant challenges in imaging systems is that, as they improve by conventional measures, they tend to put more stress on the hardware. Notably, as resolution and field of view improve, the amount of raw data produced by the camera also increases.
While this may be a positive thing in terms of imaging quality, it creates a plethora of challenges for the supporting hardware.

An example block diagram of a general image sensor. Image used courtesy of Microsoft and LiKamWa et al


This increase in data traffic can have the harmful effect of placing an increased burden on computing resources, which now need to be able to process more data at faster speeds to maintain real-time operation. On top of this, conventional imaging systems work by applying the same frame rate to all objects in the scene. The result is that moving objects may end up being undersampled, and the important data in a scene can end up being lost.
When applied to machine learning, this increase in data traffic equals higher latency and more power consumption needed to complete a task. At the same time, much of the data being processed may not even be the essential information within a scene—further adding to the wasted energy and latency of the system.
These problems become even more concerning when coupled with the increasing demand for low power, low latency systems.

Solutions With Event-Based Vision?​

In an attempt to alleviate these issues, one promising solution is event-based vision.

Event-based vision (right) aims to remove redundant information from conventional vision (left) systems. Image used courtesy of PROPHESEE


The concept of event-based vision rejects traditional frame-based imaging approaches, where every pixel reports back everything it sees at all times.
Instead, event-based sensing relies on each pixel to report what it sees only if it senses a significant change in its field of view. By only producing data when an event occurs, event-based sensing significantly reduces the raw amount of data created by imaging systems while also ensuring that the data produced is full of useful information.
Overall, the direct result of this type of sensing technology is that machine learning algorithms have to process less data, meaning less power consumption and lower latency overall.
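The per-pixel change reporting described above can be sketched in a few lines of Python. This is a toy frame-differencing illustration, not Prophesee's actual pixel-circuit implementation; the scene, the threshold and the event format are made-up assumptions:

```python
import numpy as np

def frame_to_events(prev_frame, frame, threshold=10):
    """Return (row, col, polarity) events for pixels whose brightness
    changed by more than `threshold` between two consecutive frames."""
    diff = frame.astype(np.int16) - prev_frame.astype(np.int16)
    rows, cols = np.nonzero(np.abs(diff) > threshold)
    polarity = np.sign(diff[rows, cols])  # +1 brighter, -1 darker
    return [(int(r), int(c), int(p)) for r, c, p in zip(rows, cols, polarity)]

# A static 64x64 scene in which a single pixel brightens:
prev_frame = np.full((64, 64), 100, dtype=np.uint8)
frame = prev_frame.copy()
frame[10, 20] = 180

events = frame_to_events(prev_frame, frame)
print(len(events))   # 1 event, versus 4096 pixel values for a full frame
print(events[0])     # (10, 20, 1)
```

The point of the sketch is the data ratio: the unchanged pixels produce nothing at all, so downstream processing only ever sees the one pixel that moved.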

The Metavision Sensor​

This week, PROPHESEE, in collaboration with iCatch, announced the release of its brand new event-based imaging sensor.
Dubbed the "Metavision sensor," the new IC leverages specialized pixels which respond only to changes in their field of view, activating themselves independently when triggered by events. While not an entirely novel technology, PROPHESEE claims that Metavision is the world's first event-based vision sensor available in an industry-standard package, coming in a 13 x 15 mm mini PBGA package.

The new Metavision sensor. Image used courtesy of PROPHESEE


From a hardware perspective, the new sensor appears to be very impressive, offering specs such as a >10k fps time-resolution equivalent, >120 dB dynamic range, and a 3nW/event power consumption.
From a compute perspective, Metavision promises anywhere from 10 to 1000x less data than frame-rate-based solutions, with a throughput of >1000 obj/s, and a motion period irregularity detection of 1%.
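As a rough sanity check of that 10 to 1000x claim, here is some back-of-envelope arithmetic. The sensor resolution, event rate and bytes-per-event below are illustrative assumptions, not figures from the article:

```python
# Frame-based: a 1280x720 8-bit mono sensor streaming at 30 fps.
frame_bytes_per_s = 1280 * 720 * 30            # ~27.6 MB/s of raw pixels

# Event-based: assume a moderately busy scene producing 100,000 events/s,
# with each event encoded in 8 bytes (x, y, polarity, timestamp).
event_bytes_per_s = 100_000 * 8                # 0.8 MB/s

reduction = frame_bytes_per_s / event_bytes_per_s
print(round(reduction, 2))   # 34.56x, inside the quoted 10-1000x band
```

A mostly static scene (few events) pushes the ratio toward the top of the quoted range, while a scene where everything moves pushes it toward the bottom.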

Push for More Event-based Vision​

With Metavision, PROPHESEE and iCatch appear to have brought an exciting and promising new technology to an industry-standard format, making it more accessible for engineers everywhere.
Thanks to this, the companies are hopeful that event-based vision could start to permeate into the industry and bring its benefits along with it.
 