BRN Discussion Ongoing

Tezza

Regular
  • Like
Reactions: 1 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Carnegie Mellon University is developing a $20 million AI institute. Akida surely will be involved in numerous research projects.



Larry
 
  • Like
  • Fire
  • Love
Reactions: 52 users
Carnegie Mellon University is developing a $20 million AI institute. Akida surely will be involved in numerous research projects.



Larry

That's awesome. It's what we need in Oz to drive beneficial AI innovation here!
 
  • Like
  • Fire
Reactions: 20 users
Antonio's comments about ASX disclosure were interesting. We can all appreciate the lengthy engagement-to-adoption cycles.

But I feel he didn't really address why the partnerships can be mentioned publicly via social media and the website, but not also summarised in the Quarterly.

As it is already public knowledge, there should be no issue with re-affirming the same material in a quarterly report. They would not be overstating what they have achieved, only summarising what they've already made public.
 
  • Like
  • Fire
  • Love
Reactions: 9 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
Is Qualcomm really doing this without us????? What's their magic sauce????




Qualcomm exec: AI adoption becoming 'seminal moment' in tech

Yahoo Finance
Wed, 24 May 2023 at 7:14 am GMT+10


Qualcomm SVP of Product Management Ziad Asghar joins Yahoo Finance Live to discuss Qualcomm's partnership with Microsoft to develop a new AI chip, competition in the AI technology sector, privacy, and data security.

Video transcript

- Well, Qualcomm is building on its partnership with Microsoft today, unveiling new features to scale on-device AI. That announcement at Microsoft's Build Developers Conference means further integration of Qualcomm's Snapdragon in Windows devices. Qualcomm saying in a statement, quote, "On-device generative AI solutions allow developers and cloud service providers to make generative AI more affordable, reliable, and private by moving queries and inferences to edge devices, including PCs and phones."
Joining us to discuss is Ziad Asghar, Qualcomm's senior vice president of product management. We've also got our very own Ali Garfinkel joining in on the conversation. Ziad good to talk to you. The announcement today really does build on an existing partnership you've already had with Microsoft. Talk to me about the scale of the opportunity you see in this partnership centered around AI with Microsoft.
ZIAD ASGHAR: I think this generative AI opportunity is just amazing for Qualcomm. You have to understand that this is how we're able to bring all of these experiences onto the devices that are in people's hands. Right now, many of these generative experiences are sitting in the cloud, so people, at large, are not able to access these experiences.
But now whether you think about productivity applications on PC-like devices or other entertainment-like applications on smartphones, with this change that's coming right now, with the technology that we have been building, we can actually bring all these experiences to the edge devices. To your PCs, to XR products, to smartphones, to IoT products, and that's really the amazing opportunity for Qualcomm. And we believe we have some unique differentiated technology that sets us apart from everybody else in this space.

ALEXANDRA GARFINKLE: Ziad, Ali here. Investors aren't necessarily thinking of Qualcomm as an AI stock, but should they? How big of an opportunity is Qualcomm looking at in AI?
ZIAD ASGHAR: Oh, absolutely, right. So think about it. If you think from the perspective of privacy, if you think from the perspective of the contextual data that's available on a device, to be able to run these generative AI use cases, large language models on the device, it completely changes the story. It allows people to be able to run these devices while keeping those queries private.
To be able to generate that experience even when you are not connected. To be able to use the contextual information in a way that, basically, you can give a far better experience to the user than is possible today. So let's take an example, right. So if you are talking about, you're creating a virtual world around you. If you already know the kind of textures, the colors, the themes that you like, we can actually optimize that experience for you.
So the experience is better, the privacy is basically preserved, you have the experience, whether you're connected or not, and essentially, the experience is far better. So we really think this brings the generative AI opportunity to the edge and really move the center of gravity towards edge devices. So it's a massive opportunity for us at Qualcomm.
- Yeah, I've heard you say over and over about the experience here. It feels like we're just really at the tip of experiencing generative AI. What does that future look like? When you talk about user experience on your smartphone, what are we talking about?
ZIAD ASGHAR: Let's take an example, right. So on the smartphone right now, you take a picture, already, we are using AI to be able to enhance the quality of that image. But what we can do now is, you can literally talk into your smartphone and you can say, well, put me in front of the Eiffel Tower at sunset with the beautiful clouds in the distance. And we can go and run a text-to-image model that actually generates that background and puts you right in front of it, right. That sort of an experience. That sort of an interactive experience on devices is really truly only possible if you can do this on the device.
And what makes this possible is our special hardware. Our pedigree is about doing more processing at the lowest power possible, and that's precisely what we're able to do with AI too. And we can essentially do that on a device that's not plugged in, whereas some of our other competitors run it on graphics engines that take hundreds of watts of power, we literally do the experiences with milliwatts of power.
And that's what takes it to the masses. That's what takes it to every user that uses a smartphone at this point in time.
ALEXANDRA GARFINKLE: Ziad, we've heard a lot about AI risks in the context, for instance, of ChatGPT, which kicked all this off. But I'd love to hear a little bit about how AI risks look from your vantage point.
ZIAD ASGHAR: Yeah, so I think one of the main concerns that people have with some of the generative AI examples is privacy. You have seen some of the news come out recently about certain information getting out and staying and residing on these cloud servers.
Well, with AI-- generative AI running on the device, that query, that question that you ask from a text or text-like example, stays precisely on the device. It never leaves it. So actually, we are able to address some of the concerns that people have. Of course, additionally, the models have continued to get better. That will actually help with doing a lot more, in terms of improving the experience and taking away some of the concerns that people have with these generative AI use cases.
And of course, what is also going to happen is that people will actually get to use these experiences. I think this is a seminal moment in our industry because it really changes the way we interact with our products, with our devices, and this is not one of those things that kind of follow the hype curve. This is actually an example where we have customers coming up to us today and saying, I want to run these use cases on the device. So this is very true, and this is where we believe that we'll be able to actually address a lot of the concerns that people have.
Another big, very quick, concern that people have is the cost, right. If you can imagine that these models are getting larger, the applications are growing, at the same time, a lot more people are using them, you will get to a point where you will not have enough resources in the cloud. So you actually need the edge to be able to do generative AI. And that's the unique opportunity for us.
ALEXANDRA GARFINKLE: So I'd love to take this a step further then and say, is it possible that the technology you're talking about developing at Qualcomm will make ChatGPT and apps like it more accessible and less costly?
ZIAD ASGHAR: Absolutely. Right, so what we can do today, what we showed at some of our recent conferences, was a 1-billion-parameter model, what we call Stable Diffusion, running entirely on the device. We showed it today at Build, where that same model is now running on a PC.
Now what we have a roadmap for is to be able to do up to 10-billion-parameter models on the device. And you can envision some of the models that are out there today, text-to-text-like models. And we see on a daily basis, new modalities coming into it.
Text to music, text to audio, image to video. All of those-- many of those, actually, fit very well in this range that I'm talking about, which we believe can run comfortably on the device in a sustained fashion. And that's why it comes to everybody. It comes to anybody and everybody that has a smartphone, or a PC, or an XR device not just connected to the cloud at this point in time.
- Qualcomm's Ziad Asghar, good to talk to you today. Fascinating to think about all the applications for AI. Our thanks to Ali Garfinkel as well for joining in on the conversation.
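
For anyone curious what the on-device text-to-image workflow Asghar describes looks like in practice, here is a minimal sketch using the open-source Hugging Face diffusers library. To be clear, this is purely illustrative: the model id, prompt and half-precision setting are my own assumptions and say nothing about Qualcomm's actual Snapdragon toolchain.

```python
# Minimal sketch of on-device text-to-image generation (illustrative only).
# Assumes the open-source Hugging Face diffusers library and a local GPU;
# on a phone or Snapdragon PC the same idea would target the local NPU
# through the vendor's own toolchain instead.
import torch
from diffusers import StableDiffusionPipeline

# Load a ~1B-parameter Stable Diffusion model in half precision to cut
# memory and compute, the same general trick behind running these models
# at the edge rather than in the cloud.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")

# The "Eiffel Tower at sunset" example from the interview. Because the
# whole pipeline runs locally, the prompt never leaves the device, which
# is the privacy argument Asghar makes.
image = pipe(
    "a person standing in front of the Eiffel Tower at sunset, "
    "beautiful clouds in the distance"
).images[0]
image.save("eiffel_sunset.png")
```

Scale that same loop down to run on a milliwatt-class accelerator and you get the experience he is describing.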

 
  • Like
  • Thinking
  • Fire
Reactions: 44 users

SERA2g

Founding Member

I wouldn't say MVP.

Whilst Roger's speech was spirited, it was not accurate, which is a bit of a shame given it was long-winded, soaked up valuable Q&A time and possibly had an impact on the way some shareholders voted with respect to the remuneration report.

Out of curiosity, given the statements Roger made, I have reviewed CBA's remuneration report and have then compared it to BRN's.

Below is the outcome of that comparison.

In summary, CBA KMPs are paid SUBSTANTIALLY more in both cash and equity than Brainchip's KMPs.

It's not even close.

On average across the board, CBA KMPs get 4x the cash, 2x the equity and 2.7x the total remuneration.
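
If anyone wants to sanity-check the arithmetic, it's just averaging each cohort and dividing. A rough sketch is below; note the dollar figures are placeholders I've picked purely so the output matches the ratios above, the real inputs being the averaged per-KMP totals from the two remuneration reports attached further down.

```python
# Sketch of the KMP remuneration ratio calculation. The figures are
# PLACEHOLDERS chosen only to reproduce the ratios quoted above; swap in
# the actual averaged per-KMP totals from the CBA and BRN remuneration reports.
cba_avg_kmp = {"cash": 2_000_000, "equity": 1_600_000}  # placeholder averages per KMP
brn_avg_kmp = {"cash": 500_000, "equity": 800_000}      # placeholder averages per KMP

cash_ratio = cba_avg_kmp["cash"] / brn_avg_kmp["cash"]
equity_ratio = cba_avg_kmp["equity"] / brn_avg_kmp["equity"]
total_ratio = sum(cba_avg_kmp.values()) / sum(brn_avg_kmp.values())

print(f"Cash: {cash_ratio:.1f}x  Equity: {equity_ratio:.1f}x  Total: {total_ratio:.1f}x")
# Cash: 4.0x  Equity: 2.0x  Total: 2.8x (roughly the 4x / 2x / 2.7x above)
```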

Roger also said the BRN directors were paid more than BHP's, but I haven't bothered to check that given how wrong he was with CBA.

Over to you for some clarification @Realinfo

Perhaps I'm missing something?
[Attached images: KMP remuneration comparison tables for BRN and CBA]

https://www.commbank.com.au/content...out-us/2022-08/2022-annual-report_spreads.pdf
 

  • Like
  • Fire
  • Love
Reactions: 55 users

HopalongPetrovski

I'm Spartacus!
I wouldn't say MVP.

Whilst Roger's speech was spirited, it was not accurate, which is a bit of a shame given it was long-winded, soaked up valuable Q&A time and possibly had an impact on the way some shareholders voted with respect to the remuneration report.

Nice fact checking there Compadre. 🤣
Good to have the truth of these matters revealed.
 
  • Like
  • Love
  • Fire
Reactions: 19 users

ndefries

Regular
I wouldn't say MVP.

Whilst Roger's speech was spirited, it was not accurate, which is a bit of a shame given it was long-winded, soaked up valuable Q&A time and possibly had an impact on the way some shareholders voted with respect to the remuneration report.

I think he was just referring to the directors and not the key management
 

manny100

Regular
Arrived at the party late, but either Sean or the 'numb nut' from the MF who wrote that article stating that BRN cannot compete with NVIDIA is a little confused.
PS: I am tipping it's the 'numb nut'.
In the Vimeo video of the April investor presentation, Sean covered Competition Analysis in two slides (12 and 13) with a lot of explanation.
I have attached a screenshot of slide 13, which is a comparison table. BRN ticks every box.
NVIDIA is mentioned in the bottom row under Deep Learning Accelerator (NVIDIA, others). This has only one box ticked: Standard ML Workflow.
NVIDIA only gets a mention alongside the others.
Clearly the 'numb nut' has not done any research, or has just been spoon-fed by others, maybe shorters.
NVIDIA is a great business and it excels at the Data Center level.
BRN is Edge only.
Sean said in a recent interview that Data Centers and the Edge will basically work side by side. The Edge will be suitable for certain data and the Data Centers for other data, i.e. plenty of room for both.
Is it just me imagining things, or does MF seem to publish a lot of shorter-friendly articles?
 

  • Like
  • Fire
  • Love
Reactions: 29 users

Cardpro

Regular
I wouldn't say MVP.

Whilst Roger's speech was spirited, it was not accurate, which is a bit of a shame given it was long-winded, soaked up valuable Q&A time and possibly had an impact on the way some shareholders voted with respect to the remuneration report.

I think you are missing the point... CBA is one of the largest financial institutions in the world... the banking industry is heavily regulated and the responsibilities of the executives are on a different level... not to mention they are making record-high profits... vs nothing for our company... lol

The point is, we have been paying very competitively, but the work done by our executives is yet to be reflected/proven in our financial statements... if anything, we can only rely on the current share price, which is a huge disappointment...

Many companies and their execs make lots of promises but fail hard. I am not saying ours is one of them, but the market thinks otherwise... hopefully the market is wrong and shorters will get fked like the shorters of Nvidia lol
 
  • Like
  • Haha
  • Fire
Reactions: 7 users

SERA2g

Founding Member
I think he was just referring to the directors and not the key management
Ah, ok. I won't bother putting it into the same table format, but you can see straight off the bat that the NEDs of CBA get substantially more cash than those on the BRN board.

The majority of the NED remuneration at BRN is share-based payments, which then puts them into a similar rem bracket to the NEDs at CBA. Perhaps that's Roger's key concern? If that's the case then fair enough, and I respect Roger's position on that.

My position is that the difference between CBA and BRN is that BRN is in a completely different industry and at a completely different stage of company development. BRN doesn't have the cash to pay NEDs competitive salaries, so it must supplement its NED payments with equity to attract appropriate talent.

On talent, BRN is competing with Tesla, Meta, Amazon, Nvidia, ARM, Intel... some of the biggest companies in the world.

If you want cheap NEDs, fuck it, I'll take the role on.

Give me 50K shares a year and I'll attend the meetings without a single complaint. No promises I'll contribute any value... But that's what you'll get when you go cheap.

[Attached image: NED remuneration comparison]
 
  • Like
  • Fire
  • Love
Reactions: 35 users

Cardpro

Regular
Arrived at the party late, but either Sean or the 'numb nut' from the MF who wrote that article stating that BRN cannot compete with NVIDIA is a little confused.
We should do an analysis of how their recommended stocks perform and write an article whenever one goes down, and also include a sentence on why we recommend BRN before they invest in their recommendation lol
 
  • Like
  • Fire
  • Haha
Reactions: 4 users

GDJR69

Regular
Is Qualcomm really doing this without us????? What's their magic sauce????




Qualcomm exec: AI adoption becoming 'seminal moment' in tech


Well, either suddenly everyone has invented their own Akida without breaching our patents, or they are using Akida just as it has become commercially available... ;) I mean, if Renesas, Megachips and ARM are all on board, why not Qualcomm too? Strange, though, that we haven't even had a partnership announcement, but then again we'd heard nothing about Mercedes-Benz either until Mercedes themselves declared they were going to use Akida, so who knows?
 
  • Like
  • Thinking
  • Fire
Reactions: 33 users

SERA2g

Founding Member
We should do an analysis of how their recommended stocks perform and write an article whenever one goes down, and also include a sentence on why we recommend BRN before they invest in their recommendation lol
I'm pretty confident I've read an MF review where someone subscribed to MF for a year and put cash into each recommendation they made throughout that year.

The result was a loss across the 12-month period haha.
 
  • Like
  • Haha
  • Fire
Reactions: 22 users
Is Qualcomm really doing this without us????? What's their magic sauce????




Qualcomm exec: AI adoption becoming 'seminal moment' in tech



Very interesting @Bravo

They're going to be either a big competitor or a great partner. They are definitely aiming for the same edge space… I liked the Stable Diffusion comment too, because I'm sure I heard Rob T mention that at one time. I hope Prophesee helped push our technology benefits.

Given PVDM etc. are still saying we have a 3-5 year lead on our competitors, my sales pitch would be:

“Whoever integrates us first becomes an edge AI market leader, is it going to be you or your competitor?”

:)
 
  • Like
  • Fire
  • Love
Reactions: 29 users

AusEire

Founding Member. It's ok to say No to Dot Joining
I think you are missing the point... CBA is one of the largest financial institutions in the world... the banking industry is heavily regulated and the responsibilities of the executives are on a different level... not to mention they are making record-high profits... vs nothing for our company... lol

Missed the point?

The point was that Brainchip's Management/Directors were earning more than their counterparts at CBA and BHP.

That point has been proven false by @SERA2g above.

You're now trying to spin it to sound like something else 😂 I've got 2 words for ya mate but I'll probably get a warning from Dreadd for it. So I'll leave it up to your imagination to figure it out.
 
  • Like
  • Fire
  • Love
Reactions: 28 users

Bravo

If ARM was an arm, BRN would be its biceps💪!
For some reason I thought that Qualcomm had shelved Zeroth, but I was just flipping through this Tata white paper from Feb 2023 and it says "other processors that are likely to come out within the next couple of years, such as Zeroth, Akida etc., cater to edge applications".

So, obviously, according to Tata at least, they didn't stop working on Zeroth, but I can't find any recent information on it to see how it compares with our tech.

[Attached images: extracts from the Tata white paper]




 
  • Like
  • Love
  • Fire
Reactions: 17 users

SERA2g

Founding Member
Missed the point?

The point was that Brainchip's Management/Directors were earning more than their counterparts at CBA and BHP.

He wasn't entirely wrong if he was specifically talking about non-executive directors; the CBA and BRN payments to NEDs are very similar in total. The difference between the two is that CBA NEDs are paid primarily in cash, whereas the BRN NEDs are paid primarily in equity. Very common for start-ups/pre-revenue companies.

The argument then becomes how much they should be paid, and again, comparing CBA to BRN is a little difficult given they're in entirely separate industries, so it's a case of apples to oranges.

In spite of the above, Roger's speech did say "directors".

It was therefore a little misleading, as I, and I can only assume most shareholders, assumed he meant all of our directors, not just our NEDs.
 
  • Like
  • Fire
Reactions: 17 users

Cardpro

Regular
Missed the point?

The point was that Brainchip's Management/Directors were earning more than their counterparts at CBA and BHP.

The point was that the management/directors get paid a shitload, and to prove that point he compared them with executives from large corps, which was incorrect. But I think the point is still valid IMO.
 
  • Like
Reactions: 5 users