Rise from the ashes
Regular
Who else has been throwing around the word "UBIQUITOUS" lately?
Haha, must have come out of my gob at least 50 times this week.
Hi Q - don't think there was any specific conversation of this nature around the AGM. I got the sense that the company has moved on from this - that we won't be producing chips for sale of that nature. Anything at all produced will be for marketing and sales purposes. So IP, or we can't do it. So your summary about the bed seems entirely apt! My take only.....

Hey V,
Point four, IMO, should be taken in context with the recent statement made by team BRN (PVDM, I think) that customers requiring volumes over 1M are encouraged to work with IP rather than BRN silicon.
I take this to mean that BRN will no longer get out of bed for less than this volume...
Q
Look at these class names and we are still only just looking at the tip of our berg.
So many arenas that we can improve have already been mentioned or hinted at on this site, just with our first iteration, and to my mind, we are still just getting started. Ubiquitous is such a lovely word, and it seems to me that probably the vast percentage of our use cases haven't even been thought of, or considered properly yet, except perhaps by Peter and his fellow visionaries.
The market will eventually catch on to what we already know or suspect, and will be chasing our shares even harder than it already is.
When this juggernaut takes off again, and the adrenalin and excitement has us in thrall, it would be wise to be prepared with a plan that takes into account our individual circumstances and objectives, so we are each set up as we would prefer, rather than as buck fever dictates.
Unicorns don't come around too often, or may go unrecognised till too late in the day, so don't trade 'em away too cheap, guys 'n' gals.
One day, she may just take off and re rate, leaving some weeping.
Not financial advice yaddah yaddah.....
AKIDA BALLISTA
AKIDA EVERYWHERE.................isn't that just another way to say ubiquitous???
GLTAH
And particularly us long-termers and old timers.......we deserve it.
No matter what our individual decisions - let's all hang on to a parcel so we can attend the AGM and crawl the pubs for strategic and thoughtful investment decisions (not just the deductions lol). Great to meet you and see you next year!
This article has a bit more info.

An Apple senior executive who was working on their electric vehicle project, and who was previously at Tesla working on their semi-autonomous drive system, has reportedly jumped ship to work for Luminar.
Luminar......Mercedes......Akida... dot dot dot
Apple car project executive jumps ship after nine months
Christopher 'CJ' Moore has reportedly left the Apple car project, moving to Luminar Technologies to work on autonomous driving tech.
www.drive.com.au
Things that make me go mmmmmm. NPU, or is it the co-processor that is mentioned? Any thoughts about the potential?
Thanks TT,

To
An article along the same lines about Microsoft, Arm, NPUs and co-processors.
www.pcworld.com/article/704538/microsoft-thinks-cloud-powered-arm-devices-are-the-pcs-future.html/amp
Microsoft’s vision for the PC’s future: AI-infused ‘NPUs’ and the cloud
Will the PC transition from powerful processors to AI-powered, cloud-connected devices? Microsoft seems to hope so.
By Mark Hachman
Senior Editor, PCWorld MAY 24, 2022 3:10 PM PDT
We’ve become used to sharing files between the cloud and our local PCs, so much so that we often don’t think about where a file physically resides. It sounds like Microsoft wants us to start thinking about our computing resources in the same way — and start considering AI when buying a new PC or processor, too.
In the future, an app you use will make a decision between whether to use the power of your local CPU; a local “neural processing unit,” or NPU; or the cloud to get your task done fast.
“We’re entering a world where every Windows computer will draw on the combined power of CPUs, GPUs, NPUs, and even a new coprocessor, Azure Compute, in this hybrid cloud-to-edge world,” said Microsoft chief executive Satya Nadella during his keynote address at the Microsoft Build conference, held virtually. “You will be able to do large scale training in the cloud and do inference at the edge and have the fabric work as one.”
What does this mean? Today, we buy beefy X86 processors to run games, Microsoft Excel, and other intensive applications on our PCs. What Microsoft is trying to accomplish is to bring new classes of applications to the PC, “magical experiences” that increasingly depend on artificial intelligence. These apps will use the CPUs and GPUs that are already in your PC, but also tap into resources like the cloud and an AI processor like an NPU to help out, too.
What Nadella is proposing is that developers — Nadella mentioned Adobe by name — build more intelligent capabilities into their own apps. (Think Photoshop’s “Magic Select” tool, which can, well, magically select an object by guessing that’s what you’re highlighting.) Those apps would be trained and improved in the cloud, teaching the apps through machine learning how to get smarter. They would then be tested — “inferencing,” in AI-speak — on your PC.
In other words, Microsoft sees the evolution of apps as one where the cloud will interact with the local PC in two ways. You’ll save files on your PC (or on a backup drive or USB key) and on OneDrive; and you’ll run apps on your local CPU and GPU, as well as Azure. The wild card will be the NPU, an AI coprocessor that has been somewhat ignored by X86 chipmakers like AMD and Intel, but prioritized by Arm. Deciding between what to use — CPU, GPU, NPU, or the cloud — is what Microsoft calls “the hybrid loop,” and it will be an important decision your Windows PC will have to make. From your perspective, however, it should just work.
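The "hybrid loop" decision described above — pick the NPU, the local CPU/GPU, or the Azure cloud per task — can be sketched in a few lines. This is purely an illustrative assumption of how such a dispatcher might reason; `Workload` and `choose_target` are hypothetical names, not any real Microsoft API.

```python
# Hypothetical sketch of the "hybrid loop" dispatch decision.
# All names here are illustrative, not a real Windows/Azure API.
from dataclasses import dataclass

@dataclass
class Workload:
    is_ml_inference: bool   # e.g. background blur, live captions
    compute_cost: float     # arbitrary units of required compute
    network_ok: bool        # is a usable connection to Azure available?

def choose_target(w: Workload, npu_present: bool, local_budget: float = 10.0) -> str:
    """Pick where a task runs: the NPU, the local CPU/GPU, or the Azure cloud."""
    if w.is_ml_inference and npu_present:
        return "NPU"        # low-power local inference
    if w.compute_cost <= local_budget:
        return "CPU/GPU"    # cheap enough to run on local silicon
    if w.network_ok:
        return "Azure"      # offload heavy work to the cloud
    return "CPU/GPU"        # no network: fall back to local hardware

print(choose_target(Workload(True, 5.0, True), npu_present=True))    # NPU
print(choose_target(Workload(False, 50.0, True), npu_present=True))  # Azure
print(choose_target(Workload(False, 50.0, False), npu_present=True)) # CPU/GPU
```

The point the article makes is that this decision happens invisibly — "from your perspective, however, it should just work."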
More to come at Microsoft Build
At its Build developer conference, executive vice president and chief product officer Panos Panay will talk about a “vision for a world of intelligent hybrid compute, bringing together local compute on the CPU, GPU, and NPU [neural processing unit] and cloud compute with Azure,” the company’s cloud technology.
“In the future, moving compute workloads between client and cloud will be as dynamic and seamless as moving between Wi-Fi and cellular on your phone today,” Panay wrote in a blog post titled “Create Next Generation Experiences at Scale with Windows,” that accompanied the opening of the Build conference.
“We’re building on the GPU, the CPU, the NPU, and in essence, we’re introducing a fourth processor to Windows with Azure Compute — using Azure, one of the world’s most powerful computers, to enable rich local experiences on Windows,” Panay said at the Build conference.
We don’t know exactly what AI-powered apps these will be, but Microsoft itself has provided a few past examples, such as:
- Automatic authoring of slides and presentations within PowerPoint, based on the content of your slides, as well as a Presenter Coach to manage your pace;
- Microsoft Editor, which analyzes your writing to improve grammar and punctuation;
- Machine transcription and translation in Microsoft Teams;
- Automatic scheduling and analysis of meetings in Outlook, as well as Insights to tell how much time you spend on certain tasks;
- Automatic live captioning of video, both in Teams as well as on your own PC;
- …and PowerBI, which can work with Excel to identify trends.
So far, all of these applications depend on your local PC’s processor, such as automatic captioning of local video. Others, such as automatic scheduling of meetings in Outlook, can certainly use your own local processing power, but they could also use the Azure cloud that powers Outlook.com. The point is: you don’t know, and you don’t care. It just gets done.
As to what those applications might be? We don’t know, though Microsoft is clearly trying to rally developers to build those applications using Build as inspiration. We know Microsoft would really love for you to start incorporating its Azure cloud into your computing experiences, even if you don’t think of “Azure” as something you’d sign up for. Microsoft’s Xbox cloud gaming? The “Windows in a cloud,” or Windows 365? Outlook on the Web? All of these heavily depend on Microsoft Azure, and they simply can’t be replicated without a cloud subscription backing them up.
A strong endorsement for Arm
What’s somewhat surprising, however, is how strongly Microsoft seems to believe that Arm PCs will be necessary to enable this future. “Increasingly, magical experiences powered by AI will require enormous levels of processing power beyond the capabilities of traditional CPU and GPU alone. But new silicon like neural processing units (NPUs) will add expanded capacity for key AI workloads,” Panay wrote.
Put another way, “in the future, the speed of your computer will be measured by the power of its neural processor,” said Microsoft technical fellow Steven Bathiche, at Build.
Both AMD and Intel have made noises about AI capabilities within their processors, beginning with Intel’s 10th-gen “Ice Lake” chip. There, Intel showed off how AI could be used to filter out background noises in conference calls, accelerate photo-editing tools, and more. But AI hasn’t really been a focus in subsequent presentations. For their part, AMD executives mentioned that Ryzen 7000 would have specific AI instructions that they would talk about later, and that was that.
Arm and its licensee Qualcomm, however, have made AI an enormous priority, and its recent Snapdragon 8+ Gen 1 chip contains what Qualcomm calls its 7th-gen AI engine, with a 3rd-gen Sensing Hub that works on a low level to filter out audible noise along with other features. In smartphones, the impact of AI is more immediately felt, with “portrait” images and video using AI to set off the subject from the background, and apply filters. Qualcomm may refer to it as an “AI engine,” but it’s an NPU by any other name. And Microsoft seems to want them on the PC.
In a demonstration, Bathiche showed off how a face tracker algorithm would require more than 20 watts on a traditional X86 CPU, while on an NPU it took only 137 milliwatts.
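Those two figures quoted from the demo imply an efficiency gap of roughly two orders of magnitude — and since 20 W is described as a lower bound, the real gap may be larger. A quick sanity check of the arithmetic:

```python
# Power-efficiency ratio implied by the Build demo figures quoted above.
cpu_watts = 20.0    # "more than 20 watts" on a traditional X86 CPU (lower bound)
npu_watts = 0.137   # 137 milliwatts on an NPU

ratio = cpu_watts / npu_watts
print(f"NPU runs the face tracker on ~{ratio:.0f}x less power")  # ~146x
```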
And how is it going to do all this? It’s not exactly clear. Surprisingly, one name that we haven’t heard anything about is Windows ML, an AI API that we first heard about in 2018 to bring AI to Windows…then sort of disappeared.
On the one hand, Microsoft hasn’t called out Arm specifically as a preferred NPU provider. But we’ve seen a continued push to support Arm over the past few years, dating from the first Windows on Arm implementations to 64-bit app support. Unfortunately, in a world where Microsoft (and customers) didn’t care so much about AI, Qualcomm’s Snapdragon chips were forced to try and differentiate themselves on battery life, an advantage that X86 chips cut into. AMD claimed that the longest-lasting laptop, as measured by MobileMark, now runs on a Ryzen, for example.
Microsoft hasn’t given up. At Build, the company announced Project Volterra, a new device powered by Snapdragon chips. (Microsoft representatives declined to comment on the exact specifications.) Volterra, pictured in the image at the top of this story, will be used as a way for developers to create these “magical apps” that use NPUs — on Arm.
“With Project Volterra you will be able to explore many AI scenarios,” Panay wrote. “And because we expect to see NPUs being built into most, if not all future computing devices, we’re going to make it easy for developers to leverage these new capabilities, by baking support for NPUs into the end-to-end Windows platform.”
Right now, you’re probably working on a PC that uses an X86 processor of some type, either from AMD or Intel. And you probably rely on the power of that PC to accomplish whatever task you wish to complete. Microsoft certainly doesn’t dictate the future of the personal computer, as we’ve learned from the failures of Windows 10X, the Surface Neo, and so on. But it does have significant influence, and the power of the cloud has already touched your life in subtle but meaningful ways. Microsoft appears to be trying to tug the PC into a direction that puts cloud connectivity plus local AI at the head of the table, with Azure as the main dish.
Will the PC industry follow suit? PC makers have generally been willing to experiment with Cortana buttons and Surface-like tablets and Windows 10 in S Mode and the like. But they’re quick to steer back towards what makes them money, too, ruthlessly cutting experiments that don’t pan out. Still, there’s no denying that Microsoft has staked a claim with a fresh vision for the future of the PC, and that’s worth watching.
TT
"Put another way, “in the future, the speed of your computer will be measured by the power of its neural processor,” said Microsoft technical fellow Steven Bathiche, at Build."

On a scale of 1 to 10, how significant is the partnership with Arm in terms of revenue?
Great article, and hopefully Microsoft chooses BrainChip for their NPU.
Thanks so much for this explanation @Diogenese it is I think for me the clearest explanation I have read on how Akida operates and how it achieves power savings in different circumstances. I mean I am non technical and I could understand it haha. Thank you sir

Hi JK,
Akida's forte is identifying (classifying) input signals from sensors. In simple applications this may be sufficient to trigger a direct response or action.
However, in some cases the Akida output is used as an input to another CPU/GPU (von Neumann processor) to form part of that computer's program variables.
In the first case, the entire process gets the full power saving/speed improvement from Akida.
In the second case, the benefit is the reduction in power/time which Akida brings to the classification task while the CPU performs the remaining processes under the control of its software program. This is important because the classification task carried out on a software controlled CPU uses very large amounts of power and takes a relatively long time.
Classification of an image on a CPU uses CNN (convolutional neural network) processes which involve multiplying multi-bit (8, 16, 32, 64) values representing each pixel on the sensor. Multiplication requires a number of computer operations roughly equal to the square of the number of bits, so an 8-bit multiplication would involve 64 computer operations. For 32-bit values, 1024 operations are required to process the output from a single pixel, whether its value has changed or not.
On the other hand, Akida ignores pixels whose output value does not change, and only performs a computer operation for the pixels whose output changes (an event). This is "sparsity". In addition, in 1-bit mode there is only a single computer operation for each pixel event.
For example, the sparsity may reduce the number of events by, say, 40%.
Even in 4-bit mode, Akida only needs 16 computer operations, and that only for pixels whose output has changed.
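The arithmetic above can be put into a quick back-of-envelope sketch. This is only an illustration of the post's simplified model (a b-bit multiply costing roughly b² basic operations, and an event-based pass touching only changed pixels); the VGA frame size and the 60% changed-pixel figure (sparsity removing ~40% of events) are assumed for the example, not measured benchmarks:

```python
# Simplified cost model from the explanation above:
# - conventional CNN pass: every pixel costs ~bits**2 operations
# - event-based (Akida-style) pass: only changed pixels cost anything,
#   at ~bits**2 operations per event (1 op per event in 1-bit mode)

def cnn_ops(num_pixels: int, bits: int) -> int:
    """Operations for a conventional CNN multiply pass over every pixel."""
    return num_pixels * bits ** 2

def event_ops(num_pixels: int, changed_percent: int, bits: int = 1) -> int:
    """Operations for an event-based pass over only the changed pixels."""
    events = num_pixels * changed_percent // 100
    return events * bits ** 2

pixels = 640 * 480                      # a VGA frame: 307,200 pixels (assumed)
cpu = cnn_ops(pixels, 32)               # 32-bit multiplies on every pixel
akida_1bit = event_ops(pixels, 60, 1)   # 60% of pixels changed, 1-bit mode
akida_4bit = event_ops(pixels, 60, 4)   # 4-bit mode: 16 ops per event

print(f"CPU (32-bit CNN):   {cpu:,} ops")         # 314,572,800
print(f"Akida 1-bit events: {akida_1bit:,} ops")  # 184,320
print(f"Akida 4-bit events: {akida_4bit:,} ops")  # 2,949,120
```

Even under these rough assumptions the gap is three to four orders of magnitude, which is where the power and latency savings described above come from.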
Hence there are large savings in power and time in using Akida to do the classification task compared to using, e.g., a 32-bit ARM Cortex microprocessor.
While the rest of the program may be carried out on the microprocessor, this uses comparatively little power compared to the power the microprocessor would have used performing the CNN task. So there are still large power savings to be made by using Akida in "accelerator" mode as an input device for a von Neumann processor.
The other point is that Akida performs its classification independent of any processor with which it is associated. For example, Akida 1000 includes an ARM Cortex processor, but this is only used to configure the arrangement of the NPU nodes to optimize performance for the particular task; the ARM Cortex plays no part in the actual classification task. The ARM Cortex does not form part of the Akida IP. Akida is "processor-agnostic" and can operate with any CPU.
Institutional investors understand the significance.

ARM: over 29 billion semiconductors produced in the last 12 months.
Mercedes Benz: around 3 million passenger vehicles, and if each one had 10 x AKD1000 that is 30 million semiconductors.
30 million opportunities -v- 29 billion opportunities.
My opinion only DYOR
FF
AKIDA BALLISTA
2000th message FF, I'm looking forward to your next 2000 also... we love your work!!

Institutional investors understand the significance.
Retail like to stand at the Mercedes Showroom and drool over the amazing EQXX.
I personally drool over both.
If you have properly researched ARM you will too.
My opinion only DYOR
FF
AKIDA BALLISTA