Hard to read the sp atm
Carnegie Mellon University is developing a $20 million AI institute. Surely Akida will be involved in numerous research projects.
Carnegie Mellon to receive $20 million for artificial intelligence institute
The future of artificial intelligence in public safety could unfold at Carnegie Mellon University. The Pittsburgh school announced this month it will lead... (www.post-gazette.com)
Larry
MVP Roger after 36 min
Nice fact checking there Compadre.
I wouldn't say MVP.
Whilst Roger's speech was spirited, it was not accurate, which is a bit of a shame given it was long-winded, soaked up valuable Q&A time and possibly had an impact on the way some shareholders voted with respect to the remuneration report.
Out of curiosity, given the statements Roger made, I have reviewed CBA's remuneration report and have then compared it to BRN's.
Below is the outcome of that comparison.
In summary, CBA KMPs are paid SUBSTANTIALLY more in both cash and equity than BrainChip's KMPs.
It's not even close.
On average across the board, CBA KMPs get 4x the cash, 2x the equity and 2.7x the total remuneration.
Roger also said the BRN directors were paid more than BHP's, but I haven't bothered to check that given how wrong he was with CBA.
Over to you for some clarification @Realinfo
Perhaps I'm missing something?
View attachment 37458
View attachment 37453
https://www.commbank.com.au/content...out-us/2022-08/2022-annual-report_spreads.pdf
View attachment 37454
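For anyone who wants to re-run that comparison against the two remuneration reports, a minimal sketch of the method is below. The figures in it are placeholders for illustration only, not the numbers reported by CBA or BrainChip; the actual 4x / 2x / 2.7x multiples quoted above come from the attached tables.

```python
# Sketch of the comparison method only -- the figures are PLACEHOLDERS,
# not the values reported by CBA or BrainChip. Substitute the cash and
# equity components for each KMP from the two remuneration reports.

cba_kmp = {   # per-person remuneration, $'000 (hypothetical)
    "CEO": {"cash": 2500, "equity": 3500},
    "CFO": {"cash": 1800, "equity": 2200},
}
brn_kmp = {   # per-person remuneration, $'000 (hypothetical)
    "CEO": {"cash": 700, "equity": 1800},
    "CFO": {"cash": 500, "equity": 1000},
}

def averages(kmp):
    """Average cash, equity and total remuneration across a KMP group."""
    n = len(kmp)
    cash = sum(p["cash"] for p in kmp.values()) / n
    equity = sum(p["equity"] for p in kmp.values()) / n
    return cash, equity, cash + equity

cba_cash, cba_equity, cba_total = averages(cba_kmp)
brn_cash, brn_equity, brn_total = averages(brn_kmp)

print(f"Cash multiple:   {cba_cash / brn_cash:.1f}x")
print(f"Equity multiple: {cba_equity / brn_equity:.1f}x")
print(f"Total multiple:  {cba_total / brn_total:.1f}x")
```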
I think he was just referring to the directors and not the key management.
I think you are missing the point... CBA is one of the largest financial institutions in the world... the banking industry is heavily regulated and the responsibilities of the executives are at a different level... not to mention they are making record-high profits... vs nothing for our company... lol
Ah, ok. I won't bother putting it into the same table format, but you can see straight off the bat that the NEDs of CBA get substantially more cash than those on the BRN board.
We should do an analysis of how their recommended stocks perform and write an article whenever one goes down, and also include a sentence on why we recommend BRN, before they invest in their recommendation lol
Arrived at the party late, but either Sean or the 'numb nut' from the MF who wrote that article stating that BRN cannot compete with NVIDIA is a little confused.
PS: I am tipping it's the 'numb nut'.
In the Vimeo video of the April investor presentation, Sean covered Competition Analysis in two slides (12 and 13), with a lot of explanation.
I have attached a screenshot of slide 13, which is a comparison table. BRN ticks every box.
NVIDIA is mentioned in the bottom row under Deep Learning Accelerator (NVIDIA, others). This row has only one box ticked: Standard ML Workflow.
NVIDIA only gets a mention with the others.
Clearly the 'numb nut' has not done any research, or has just been spoon-fed by others, maybe shorters.
NVIDIA is a great business and it excels at the Data Center level.
BRN is Edge only.
Sean said in a recent interview that data centers and the edge will basically work side by side. The edge will be suitable for certain data and the data centers for other data, i.e. plenty of room for both.
Is it just me, or does MF seem to publish a lot of shorter-friendly articles??
Well, either suddenly everyone has invented their own Akida without breaching our patents, or they are using Akida just as it has become commercially available... I mean, if Renesas, Megachips and ARM are all on board, why not Qualcomm too? Strange, though, that we haven't even had a partnership announcement, but then again we'd heard nothing about Mercedes-Benz either until Mercedes themselves declared they were going to use Akida, so who knows?
Is Qualcomm really doing this without us????? What's their magic sauce????
Yahoo Finance
Wed, 24 May 2023 at 7:14 am GMT+10
Qualcomm SVP of Product Management Ziad Asghar joins Yahoo Finance Live to discuss Qualcomm's partnership with Microsoft to develop a new AI chip, competition in the AI technology sector, privacy, and data security.
Video transcript
- Well, Qualcomm is building on its partnership with Microsoft today, unveiling new features to scale on-device AI. That announcement at Microsoft's Build Developers Conference means further integration of Qualcomm's Snapdragon in Windows devices. Qualcomm saying in a statement, quote, "On-device generative AI solutions allow developers and cloud service providers to make generative AI more affordable, reliable, and private by moving queries and inferences to edge devices, including PCs and phones."
Joining us to discuss is Ziad Asghar, Qualcomm's senior vice president of product management. We've also got our very own Ali Garfinkel joining in on the conversation. Ziad good to talk to you. The announcement today really does build on an existing partnership you've already had with Microsoft. Talk to me about the scale of the opportunity you see in this partnership centered around AI with Microsoft.
ZIAD ASGHAR: I think this generative AI opportunity is just amazing for Qualcomm. You have to understand that this is how we're able to bring all of these experiences onto the devices that are in people's hands. Right now, many of these generative experiences are sitting in the cloud, so people, at large, are not able to access these experiences.
But now whether you think about productivity applications on PC-like devices or other entertainment-like applications on smartphones, with this change that's coming right now, with the technology that we have been building, we can actually bring all these experiences to the edge devices. To your PCs, to XR products, to smartphones, to IoT products, and that's really the amazing opportunity for Qualcomm. And we believe we have some unique differentiated technology that sets us apart from everybody else in this space.
ALEXANDRA GARFINKLE: Ziad, Ali here. Investors aren't necessarily thinking of Qualcomm as an AI stock, but should they? How big of an opportunity is Qualcomm looking at in AI?
ZIAD ASGHAR: Oh, absolutely, right. So think about it. If you think from the perspective of privacy, if you think from the perspective of the contextual data that's available on a device, to be able to run these generative AI use cases, large language models on the device, it completely changes the story. It allows people to be able to run these devices while keeping those queries private.
To be able to generate that experience even when you are not connected. To be able to use the contextual information in a way that, basically, you can give a far better experience to the user than is possible today. So let's take an example, right. So if you are talking about, you're creating a virtual world around you. If you already know the kind of textures, the colors, the themes that you like, we can actually optimize that experience for you.
So the experience is better, the privacy is basically preserved, you have the experience, whether you're connected or not, and essentially, the experience is far better. So we really think this brings the generative AI opportunity to the edge and really move the center of gravity towards edge devices. So it's a massive opportunity for us at Qualcomm.
- Yeah, I've heard you say over and over about the experience here. It feels like we're just really at the tip of experiencing generative AI. What does that future look like? When you talk about user experience on your smartphone, what are we talking about?
ZIAD ASGHAR: Let's take an example, right. So on the smartphone right now, you take a picture, already, we are using AI to be able to enhance the quality of that image. But what we can do now is, you can literally talk into your smartphone and you can say, well, put me in front of the Eiffel Tower at sunset with the beautiful clouds in the distance. And we can go and run a text-to-image model that actually generates that background and puts you right in front of it, right. That sort of an experience. That sort of an interactive experience on devices is really truly only possible if you can do this on the device.
And what makes this possible is our special hardware. Our pedigree is about doing more processing at the lowest power possible, and that's precisely what we're able to do with AI too. And we can essentially do that on a device that's not plugged in; whereas some of our other competitors run it on graphics engines that take hundreds of watts of power, we literally do the experiences with milliwatts of power.
And that's what takes it to the masses. That's what takes it to every user that uses a smartphone at this point in time.
ALEXANDRA GARFINKLE: Ziad, we've heard a lot about AI risks in the context, for instance, of ChatGPT, which kicked all this off. But I'd love to hear a little bit about how AI risks look from your vantage point.
ZIAD ASGHAR: Yeah, so I think one of the main concerns that people have with some of the generative AI examples is privacy. You have seen some of the news come out recently about certain information getting out and staying and residing on these cloud servers.
Well, with AI-- generative AI running on the device, that query, that question that you ask from a text or text-like example, stays precisely on the device. It never leaves it. So actually, we are able to address some of the concerns that people have. Of course, additionally, the models have continued to get better. That will actually help with doing a lot more, in terms of improving the experience and taking away some of the concerns that people have with these generative AI use cases.
And of course, what is also going to happen is that people will actually get to use these experiences. I think this is a seminal moment in our industry because it really changes the way we interact with our products, with our devices, and this is not one of those things that kind of follow the hype curve. This is actually an example where we have customers coming up to us today and saying, I want to run these use cases on the device. So this is very true, and this is where we believe that we'll be able to actually address a lot of the concerns that people have.
Another big, very quick, concern that people have is the cost, right. If you can imagine that these models are getting larger, the applications are growing, at the same time, a lot more people are using them, you will get to a point where you will not have enough resources in the cloud. So you actually need the edge to be able to do generative AI. And that's the unique opportunity for us.
ALEXANDRA GARFINKLE: So I'd love to take this a step further then and say, is it possible that the technology you're talking about developing at Qualcomm will make ChatGPT and apps like it more accessible and less costly?
ZIAD ASGHAR: Absolutely. Right, so what we can do today, what we showed at some of our recent conferences, was a 1-billion-parameter model, what we call Stable Diffusion, running entirely on the device. We showed it today at Build, where that same model is now running on a PC.
Now what we have a roadmap for is to be able to do up to 10-billion-parameter models on the device. And you can envision some of the models that are out there today, text-to-text-like models. And we see, on a daily basis, new modalities coming into it.
Text to music, text to audio, image to video. All of those-- many of those, actually, fit very well in this range that I'm talking about, which we believe can run comfortably on the device in a sustained fashion. And that's why it comes to everybody. It comes to anybody and everybody that has a smartphone, or a PC, or an XR device, not just connected to the cloud at this point in time.
- Qualcomm's Ziad Asghar, good to talk to you today. Fascinating to think about all the applications for AI. Our thanks to Ali Garfinkel as well for joining in on the conversation.
Qualcomm exec: AI adoption becoming 'seminal moment' in tech
Qualcomm SVP of Product Management Ziad Asghar joins Yahoo Finance Live to discuss Qualcomm's partnership with Microsoft to develop a new AI chip, competition in the AI technology sector, privacy, and data security. (uk.news.yahoo.com)
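As a rough sanity check on the 1-billion and 10-billion parameter figures Asghar mentions, here is a back-of-the-envelope estimate of the weight memory such models need on a device. It assumes the weights dominate the footprint and ignores activations, KV cache and runtime overhead, and the quantisation levels are my own assumption, not anything Qualcomm stated.

```python
# Back-of-the-envelope weight-memory estimate for on-device models.
# Assumes weights dominate memory use; activations, KV cache and runtime
# overhead are ignored, so real requirements will be somewhat higher.

def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate weight storage in GB for a model of the given size."""
    total_bytes = params_billions * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

for size in (1, 10):            # the 1B and 10B figures from the interview
    for bits in (16, 8, 4):     # fp16 vs int8 vs int4 quantisation (assumed)
        print(f"{size}B params @ {bits}-bit ~ {weight_memory_gb(size, bits):.1f} GB")
```

A 10-billion-parameter model quantised to 4 bits works out to roughly 5 GB of weights, which is plausible on a flagship phone or PC, whereas the same model at fp16 (about 20 GB) is not, which is consistent with a roadmap that tops out around that size.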
I'm pretty confident I've read an MF review where someone subscribed to MF for a year and put cash into each recommendation they made throughout that year.
Missed the point?
The point is, we have been paying very competitively, but the work done by our executives is yet to be reflected/proven in our financial statements... if anything, we can only rely on the current share price, which is a huge disappointment...
Many companies and their execs make lots of promises but fail hard. I'm not saying ours is one of them, but the market thinks otherwise... hopefully the market is wrong and the shorters will get fked like the shorters of Nvidia lol
He wasn't entirely wrong if he's specifically talking about non-executive directors; the CBA and BRN payments to NEDs are very similar in total. The difference between the two is that CBA NEDs are paid primarily in cash, whereas the BRN NEDs are paid primarily in equity. Very common for start-ups/pre-revenue companies.
Missed the point?
The point was that BrainChip's management/directors were earning more than their counterparts at CBA and BHP.
That point has been proven false by @SERA2g above.
You're now trying to spin it to sound like something else. I've got 2 words for ya mate, but I'll probably get a warning from Dreadd for it. So I'll leave it up to your imagination to figure it out.
The point was that the management/directors get paid a shitload, and to prove that point he compared them with executives from large corps, which was incorrect. But I think the point is still valid IMO.