BRN Discussion Ongoing

IloveLamp

Top 20
1000013711.jpg
1000013715.jpg
 
Reactions: 27 users

mrgds

Regular
Screenshot (85).png
 
Reactions: 7 users

FJ-215

Regular
Maybe this is the deal Sean was referring to that didn’t go through, is my thinking.
Maybe, but I don't think we will ever know who it was. Although Prophesee already had a working solution with their current tech partner, so it could be one of our early adopters.
 
Reactions: 5 users

IloveLamp

Top 20
Reactions: 3 users

IloveLamp

Top 20
Ready for open

1000013718.gif
 
Reactions: 8 users
Maybe, but I don't think we will ever know who it was. Although Prophesee already had a working solution with their current tech partner, so it could be one of our early adopters.
Fingers crossed that is true. I am sure the discussion was based around blurred images that BRN helped resolve.
 
Reactions: 3 users
Without a signed IP contract with us, how can Prophesee be using our IP in their products? Is it correct to say that we cannot expect any royalties from any partner if they haven't signed an IP contract? @Fact Finder, could you please shed some light on this? Sean also said in the webinar that it was a high priority for the sales team to get more IP contracts. I hope it is possible that our IP can be used by partners without an IP agreement; then hopefully revenue will appear suddenly. I just don't understand how anyone can pay us without any formally signed contract.
Hi Cand2it

I agree with the other comments. Where I think your confusion may have begun is the fact that Brainchip has an interesting interpretation of the ASX Continuous Disclosure Rules.

Their interpretation is that unless they can calculate a precise dollar amount on an agreement, they cannot announce an engagement on the ASX.

So starting from this point, and taking Tata Elxsi as an example, it is most certainly the case that there is in place a written document that contains the terms under which Brainchip and Tata Elxsi engage.

I don’t know what these terms are, but logically it would cover the release of the IP and any joint intellectual property development.

I would also suggest it would cover the financial terms; otherwise, how would Tata Elxsi be able to approach a medical or industrial customer to sell an AKIDA-based product without knowing what it would cost the customer?

This agreement/contract/NDA, or whatever name one wants to apply, not being for a defined sum of money (i.e. one million units at $25.00 each), will not be an agreement that, on Brainchip’s interpretation of the ASX Rules, can be announced on the ASX.

I personally believe there are many such agreements of this type in place between Brainchip and its many partners.

My opinion only DYOR
Fact Finder
 
Reactions: 44 users
Maybe this is the deal Sean was referring to that didn’t go through, is my thinking.
No.

The deal did not proceed because (Sean’s words) the company laid off thousands of employees.

My best bet is Ericsson, as they laid off thousands of employees at about the right time.


At least some of the researchers on the Ericsson paper I posted worked at the North American branch and still do, and have published since these lay-offs. Their paper relates to the area of commercial operations targeted in the lay-offs, and according to LinkedIn they were still employed as of a couple of weeks ago:


Towards 6G Zero-Energy Internet of Things: Standards, Trends, and Recent Results

T Khan, SNK Veedu, A Rácz, M Afshang… - Authorea …, 2023 - techrxiv.org
… , a neuromorphic AI chip (the Akida neural chip from BrainChip), … @ericsson.com) is a senior researcher at Ericsson Research in … She joined Ericsson in 2018 following her postdoctoral


I believe Sean Hehir has subsequently said that the project is not dead. I note also that Nandan Nayampally has mentioned energy harvesting as a use case for AKIDA.

My opinion only DYOR
Fact Finder
 
Reactions: 29 users

Bloodsy

Regular
Shared on ASX Bets. Not sure of its validity, but interestingly this was allegedly posted and quickly deleted.

Wonder if they could be toying with AKIDA? Who was it playing with neuromorphic SDRs? It always pricks the ears (or eyes) when we see and hear "without referencing the cloud".


1709163060718.png
 
Reactions: 15 users
[Fact Finder’s Ericsson post quoted above]
My wild theory for the day.

What if Ericsson, Brainchip and Prophesee were working together to substitute Prophesee’s event-based neuromorphic vision sensor for the camera used in the above paper?

My technophobe understanding is this would further reduce power consumption, making it a world-beating best fit for being powered by energy harvesting from normal indoor light sources.

My opinion only DYOR
Fact Finder
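FF’s intuition here is really just a power-budget comparison: harvested power has to exceed the sensor-plus-processor draw. A back-of-envelope sketch, where every number is an assumed illustrative placeholder (none are measured figures for AKIDA, any Prophesee sensor, or any real photovoltaic panel):

```python
# Back-of-envelope energy-harvesting budget. ALL numbers below are
# assumed, illustrative placeholders -- not measured figures for any
# real sensor, processor or panel.

HARVEST_UW_PER_CM2 = 20.0  # assumed indoor-light PV yield, microwatts per cm^2
PANEL_CM2 = 5.0            # assumed harvesting panel area, cm^2
SENSOR_UW = 30.0           # assumed event-based vision sensor draw, microwatts
PROCESSOR_UW = 50.0        # assumed neuromorphic processor draw, microwatts

harvested = HARVEST_UW_PER_CM2 * PANEL_CM2  # total power available
consumed = SENSOR_UW + PROCESSOR_UW         # total power required
surplus = harvested - consumed              # positive => can run self-powered

print(f"harvested {harvested} uW, consumed {consumed} uW, surplus {surplus} uW")
```

If the surplus comes out positive under realistic numbers, the device can in principle run indefinitely off ambient light, which is the "zero-energy IoT" pitch in the Ericsson paper.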
 
Reactions: 24 users

charles2

Regular
Hi @charles2

The only things I know about Prophesee and its neuromorphic sensor technology are:

1. It can work with other technology, including von Neumann architectures.

2. It partnered with Brainchip because AKIDA technology natively runs SNNs and as such is directly compatible with Prophesee’s sensor, with no loss of functionality as there is when it is teamed with other technology such as that from SynSense, Intel, Qualcomm and Sony.

3. Brainchip and Prophesee were demonstrating their joint technology at CES 2024.

4. In my opinion Brainchip is not part of the current Prophesee product offerings from SynSense, Qualcomm and/or Sony.

My opinion only DYOR
Fact Finder
Likely then that Akida is the "killer app" in Prophesee's next iteration. That is an easy concept to keep us grounded... as prophecy relates to future events.

Thanks FF for clarifying.
 
Reactions: 6 users

AARONASX

Holding onto what I've got
[Bloodsy’s post quoted above]

Nice find! I put these guys (DRO) on my watch list the other day. CommSec data says they are overvalued at the moment, but maybe one to watch, as Akida would benefit their tech... might pick up a few in a month or two.
 
Reactions: 8 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
A MUST WATCH INTERVIEW IMO!!!


Ziad Asghar, Qualcomm | MWC Barcelona 2024 - YouTube






Sustainability and AI: Qualcomm aims to reshape the mobile landscape



The future of artificial intelligence is not confined to boardrooms and laboratories — it’s unfolding in real-time, shaping the way we live, work and interact with the world around us. The journey is one marked by ingenuity, collaboration and a steadfast commitment to harnessing the transformative power of AI for improved sustainability and the betterment of humanity.
Diving deep into the implications of AI’s migration to the edge, Ziad Asghar (pictured), senior vice president of product management and head of AI Technology at Qualcomm Technologies Inc., gave a glimpse into the transformative power of on-device processing, the burgeoning realm of sustainability and the democratization of AI.
“Generative AI may have started on the cloud, but it’s transitioning to the edge, it’s coming to the device and that brings in with it [an] amazing amount of benefits,” Asghar said.
Asghar spoke with theCUBE Research analysts John Furrier, Dave Vellante and Shelly Kramer at MWC Barcelona, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the evolving landscape of AI and its integration into everyday devices.

The future of AI: A move toward efficiency and sustainability

AI’s trajectory is swiftly shifting toward on-device processing, marking a paradigm shift in how we interact with technology. Gone are the days of heavy reliance on cloud infrastructure; the future lies in the empowerment of devices to perform complex AI tasks locally. The cornerstone of this shift is the evolution of AI models — becoming not only more capable but also more compact, according to Asghar.
“That allows us to be able to do more AI processing than anybody else out there from a performance per watt perspective,” he said.
Qualcomm’s unveiling of Stable Diffusion, a text-to-image model running on a device with a staggering seven billion parameters, underscores the strides made in on-device AI processing. This progress not only enhances efficiency but also aligns with sustainability goals by significantly reducing energy consumption. The age-old debate of proprietary versus open-source models is also addressed, with Qualcomm embracing the democratization of AI through initiatives such as the AI Hub, empowering developers to leverage curated models for diverse applications.
“People can take these generative AI model like Stable Diffusion, we can pick it up from there and readily create an application,” Asghar said. “And you know what I’m excited about is there are many applications that we might not even thought about when we were designing these chips. But I hope that developers are able to come up with those amazing ideas.”

The shift toward edge AI

Amid the excitement, challenges loom on the horizon. The optimization of power consumption, interoperability of AI models and maintaining a delicate balance between innovation and ethical considerations are just a few hurdles to overcome.
These challenges are opportunities in disguise — opportunities to push boundaries, solve meaningful problems and shape a future where human-machine interaction transcends boundaries, according to Asghar.
“What if you have a virtual assistant sitting on your device that you say, ‘Reserve a restaurant for me tomorrow,’” he said. “It basically figures out your calendar, it figures out Yelp, it finds a place close to you, goes to open table, makes the reservation for you and you’re done. This is the kind of human-machine interaction sign up that I think that we can enable in the future.”
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE Research’s coverage of MWC Barcelona:

 
Reactions: 18 users

ndefries

Regular
[charles2’s reply quoted above]
The Prophesee, Sony and Akida combination is going to take the smartphone to a new level. I just wish it would hurry up already.
 
Reactions: 11 users

rgupta

Regular
[Fact Finder’s Prophesee post quoted above]
Just one question on point 4: why did Qualcomm have to wait until Prophesee partnered with BRN before declaring their partnership with Prophesee?
They must have been working on the product beforehand.
 
Reactions: 4 users
[rgupta’s question quoted above]
Hi rgupta

I don’t know the answer. Just like the share price Qualcomm will do what Qualcomm will do.

If I was running Qualcomm, building an anti-blur mobile phone camera using a product from Prophesee without an exclusive license, I would announce as late as possible to make it hard for Apple to catch up. I would probably wait until at least the first new product launch.

My opinion only DYOR
Fact Finder
 
Reactions: 21 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
[Bravo’s MWC Barcelona interview post quoted above]
At 20.30 Ziad Asghar talks about future use cases in the medical arena, where AI at the edge will be able to predict things before they happen. I didn't think that Qualcomm's NPU had predictive capabilities?

Also, it sounds like Ziad Asghar took notes from Peter van der Made's "4 Bit are Enough" paper. At 22.50 mins, Ziad discusses quantization and how they pre-empted LLMs to run on 4-bit integer.
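For anyone wondering what "running on 4-bit integer" actually means: quantization maps floating-point weights onto a small set of integer levels (16 levels for signed 4-bit). A minimal illustrative sketch of symmetric per-tensor quantization, NOT Qualcomm's or BrainChip's actual scheme:

```python
# Minimal symmetric 4-bit weight quantization sketch (illustrative only;
# not the actual scheme used by Qualcomm or BrainChip).

def quantize_4bit(weights):
    """Map float weights to signed 4-bit integers in [-8, 7] plus a scale."""
    scale = max(abs(w) for w in weights) / 7.0  # one scale for the whole tensor
    q = [max(-8, min(7, round(w / scale))) for w in weights]
    return q, scale

def dequantize_4bit(q, scale):
    """Recover approximate float weights from the 4-bit integers."""
    return [v * scale for v in q]

weights = [0.42, -0.91, 0.03, 0.77, -0.15]
q, scale = quantize_4bit(weights)
approx = dequantize_4bit(q, scale)  # close to the originals, 4 bits each
```

The point being made in the interview is that each weight then needs 4 bits instead of 32, so a large model fits in on-device memory at the cost of a small rounding error per weight.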

Ziad also says they are preparing for their next gen, which will have a lot more techniques, models, capabilities and use cases ("hello" AKIDA😝). He says they're just getting started.
 
Reactions: 25 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Does anyone else reckon that Competitor 1 on the slide is Qualcomm?

Surely they're not going to bypass the opportunity to add "on-chip learning" to their suite of capabilities, particularly if they want to rule the edge.


Screenshot 2024-02-29 at 11.53.45 am.png
 
Reactions: 17 users