AI could gobble up a quarter of all electricity in the U.S. by 2030 if it doesn’t break its energy addiction, says Arm Holdings exec
Before artificial intelligence can transform society, the technology will first have to learn how to live within its means.
Right now generative AI has an “insatiable demand” for electricity to power the tens of thousands of compute clusters needed to operate large language models like OpenAI’s GPT-4, warned chief marketing officer Ami Badani of chip design firm Arm Holdings.
If generative AI is ever going to run on every mobile device, from laptops and tablets to smartphones, it will have to scale without overwhelming the electricity grid.
“We won’t be able to continue the advancements of AI without addressing power,” Badani told the Fortune Brainstorm AI conference in London on Monday. “ChatGPT requires 15 times more energy than a traditional web search.”
Not only are more businesses using generative AI, but the tech industry is racing to develop new and more powerful tools, which means compute demand is only going to grow, and power consumption with it, unless something can be done.
The latest breakthrough from OpenAI, the company behind ChatGPT, is Sora. It can create super-realistic or stylized video clips up to 60 seconds long based purely on user text prompts.
The marvel of gen AI comes at a steep cost
“It takes 100,000 AI chips working at full compute capacity and full power consumption in order to train Sora,” Badani said. “That’s a huge amount.”
Data centers, where most AI models are trained, currently account for 2% of global electricity consumption, according to Badani. But with generative AI expected to go mainstream, she predicts it could end up devouring a quarter of all power in the United States by 2030.
The solution to this conundrum is to develop semiconductor chips that are optimized to run on a minimum of energy.
That’s where Arm comes in: Its RISC processor designs power 99% of all smartphones, in contrast to the rival x86 architecture developed by Intel, which has been the standard for desktop PCs but proved too inefficient for battery-powered handheld devices like smartphones and tablets.
Arm is adopting that same design philosophy for AI.
“If you think about AI, it comes with a cost,” Badani said, “and that cost is unfortunately power.”
An end-to-end system that enables autonomous vehicles to take control of all aspects of motorway driving with real-time transmission of dynamic maps of toll plazas
Market mainly just treading water today.
When’s this stock gonna move?!?
So is there a chance Apple is utilising Akida?
https://www.forbes.com/
iOS 18—Apple Issues New Blow To Google With Bold AI Privacy Decision
Kate O'Flaherty
Senior Contributor
Cybersecurity and privacy journalist
Apr 15, 2024, 11:05am EDT
Apple has just dealt a new blow to arch rival Google after making an AI decision that will appeal to all iPhone users. According to Bloomberg’s Mark Gurman, Apple’s iOS 18 AI capabilities will function entirely on the device—your iPhone—so there is no need for cloud processing.
The iOS 18 AI move is a huge win if you care about iPhone privacy, but it isn’t surprising, given that Apple is known for its strong focus in the area. It also sends a strong message to Apple’s biggest rival, Google, and its Android platform that the iPhone maker will do everything it can to win on the AI battlefield as competition ramps up.
Apple’s iOS 18 is due to launch at its Worldwide Developers Conference in June, along with powerful new iPhone AI features including enhanced Siri and auto summarising. The iPhone maker is also set to reveal its AI strategy in more detail.
“As the world awaits Apple’s big AI unveiling on June 10, it looks like the initial wave of features will work entirely on-device,” Gurman wrote. “That means there’s no cloud processing component to the company’s large language model (LLM), the software that powers the new capabilities.”
This is the model that will power the auto summarising features key to Apple’s AI strategy and its hopes of standing out when it unveils iOS 18.
Why On Device Is Better For iPhone Privacy
On-device processing is far superior to cloud, simply because data doesn’t leave the iPhone. “Processing AI commands on-device means users have more comfort in the knowledge that their input requests are less likely to be monitored and analysed by Apple and third parties,” says Jake Moore, global cybersecurity advisor at ESET.
However, Apple’s AI capabilities in iOS 18 and beyond will require a huge amount of data processing power. The iPhone maker has been investing in more hardware able to host AI, and the iPhone 16 will apparently come with an enhanced neural engine. Moore says the next generation iPhones are “more powerful than ever” and “clearly able to handle these large requests.”
And when AI requests are processed in the cloud, there is much more potential for data collection and misuse by those owning the large language models, Moore says. “Apple is clearly thinking that as all devices increase in power over time, AI on device is more likely to be the norm, especially for those who are privacy aware.”
Apple Vs Google—The AI Privacy Battlefield
The AI battlefield is ramping up and it’s clear Apple sees privacy as an area that can help it to win as iOS 18 is unveiled in June. And Apple’s iOS 18 AI move is a clear blow to Google and Android devices in general.
Yet running LLMs similar to ChatGPT without additional cloud support will be a challenge, says Android Authority, pointing out that “some of Samsung’s and Google’s most sophisticated AI features still require the power of cloud servers.”
Samsung’s Galaxy AI includes some offline capabilities in a “hybrid AI” approach, while Google’s Gemini Nano AI model is designed for on-device use, according to Android Authority.
The on-device-only approach will certainly make Apple stand out, if the iPhone maker can pull it off in iOS 18. It helps that Apple owns the hardware, software and platform, of course, but at least some of the features are likely to be limited to the iPhone 16.
The next 12 months are an important time for Apple, after the iPhone maker was overtaken by Samsung as the world's biggest smartphone maker. Apple's smartphone shipments dropped about 10% in the first quarter of 2024, as Samsung shot to a 20.8% market share, research from analyst firm IDC shows.
It is against this uncertain backdrop that Apple’s iOS 18 AI features will need to help the iPhone maker stand out—and the firm must be confident its devices can stand the pressure of powering AI LLMs. However, Apple can always change its mind, or give users the choice to give away a little of their privacy for extra AI functionality in the future, beyond iOS 18.
Say your prayers and send a new pair of Nikes to Bravo!
So is there a chance Apple is utilising Akida?
My take is I doubt it, but if so we are all millionaires and 1 or 2 billionaires...
So is there a chance Apple is utilising Akida?
It’s not new.. it’s a facelift! We had it at the beginning already!
Lovin this new Chippa Avatar ……
View attachment 61039
All expectations of dealing with them so far have resulted in the bitter taste of a cooking Apple.. All that glitters ...
Exclusive: Apple acquires Xnor.ai, edge AI spin-out from Paul Allen’s AI2, for price in $200M range
BY ALAN BOYLE, TAYLOR SOPER & TODD BISHOP on January 15, 2020
Apple buys Xnor.ai, an edge-centric AI2 spin-out, for price in $200M range (geekwire.com)
Apple has acquired Xnor.ai, a Seattle startup specializing in low-power, edge-based artificial intelligence tools, sources with knowledge of the deal told GeekWire.
The acquisition echoes Apple’s high-profile purchase of Seattle AI startup Turi in 2016. Speaking on condition of anonymity, sources said Apple paid an amount similar to what was paid for Turi, in the range of $200 million.
Xnor.ai didn’t immediately respond to our inquiries, while Apple emailed us its standard response on questions about acquisitions: “Apple buys smaller technology companies from time to time and we generally do not discuss our purpose or plans.” (The company sent the exact same response when we broke the Turi story.)
…
The arrangement suggests that Xnor’s AI-enabled image recognition tools could well become standard features in future iPhones and webcams.
Xnor.ai’s acquisition marks a big win for the Allen Institute for Artificial Intelligence, or AI2, created by the late Microsoft co-founder Paul Allen to boost AI research. It was the second spin-out from AI2’s startup incubator, following Kitt.ai, which was acquired by the Chinese search engine powerhouse Baidu in 2017 for an undisclosed sum.
The deal is a big win as well for the startup’s early investors, including Seattle’s Madrona Venture Group; and for the University of Washington, which serves as a major source of Xnor.ai’s talent pool.
The three-year-old startup’s secret sauce has to do with AI on the edge — machine learning and image recognition tools that can be executed on low-power devices rather than relying on the cloud. “We’ve been able to scale AI out of the cloud to every device out there,” co-founder Ali Farhadi, who is the venture’s CXO (chief Xnor officer) as well as a UW professor, told GeekWire in 2018.
This Apple patent is for compressing AI models. One of the inventors is ex-Xnor.
US11651192B2 Compressed convolutional neural network models 20190212 Rastegari nee Xnor
Systems and processes for training and compressing a convolutional neural network model include the use of quantization and layer fusion. Quantized training data is passed through a convolutional layer of a neural network model to generate convolutional results during a first iteration of training the neural network model. The convolutional results are passed through a batch normalization layer of the neural network model to update normalization parameters of the batch normalization layer. The convolutional layer is fused with the batch normalization layer to generate a first fused layer and the fused parameters of the fused layer are quantized. The quantized training data is passed through the fused layer using the quantized fused parameters to generate output data, which may be quantized for a subsequent layer in the training iteration.
[0018] A convolutional neural network (CNN) model may be designed as a deep learning tool capable of complex tasks such as image classification and natural language processing. CNN models typically receive input data in a floating point number format and perform floating point operations on the data as the data progresses through different layers of the CNN model. Floating point operations are relatively inefficient with respect to power consumed, memory usage and processor usage. These inefficiencies limit the computing platforms on which CNN models can be deployed. For example, field-programmable gate arrays (FPGA) may not include dedicated floating point modules for performing floating point operations and may have limited memory bandwidth that would be inefficient working with 32-bit floating point numbers.
[0019] As described in further detail below, the subject technology includes systems and processes for building a compressed CNN model suitable for deployment on different types of computing platforms having different processing, power and memory capabilities.
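To make the fused-layer idea in the patent excerpt concrete, here is a minimal numpy sketch of folding a batch-normalization layer into the preceding convolution and then quantizing the fused weights to 8-bit integers. This is only an illustration of generic conv/BN fusion and symmetric quantization maths, not Apple's or Xnor's actual implementation; the function names, the 1x1-convolution simplification and the per-tensor int8 scheme are my own assumptions.

```python
import numpy as np

def fuse_conv_bn(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold a batch-norm layer into the preceding convolution.

    For brevity W is a 1x1 convolution, i.e. a plain (out_channels, in_channels)
    matrix; gamma, beta, mean and var are the per-channel batch-norm parameters.

    BN(conv(x)) = gamma * (W @ x + b - mean) / sqrt(var + eps) + beta
                = W_fused @ x + b_fused
    """
    scale = gamma / np.sqrt(var + eps)
    W_fused = W * scale[:, None]            # rescale each output channel's weights
    b_fused = scale * (b - mean) + beta     # fold mean/beta into the bias
    return W_fused, b_fused

def quantize_int8(x):
    """Symmetric per-tensor int8 quantization: x ~= q * scale."""
    scale = max(np.abs(x).max(), 1e-12) / 127.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
out_ch, in_ch = 4, 3
W, b = rng.normal(size=(out_ch, in_ch)), rng.normal(size=out_ch)
gamma, beta = rng.uniform(0.5, 1.5, out_ch), rng.normal(size=out_ch)
mean, var = rng.normal(size=out_ch), rng.uniform(0.5, 1.5, out_ch)

W_fused, b_fused = fuse_conv_bn(W, b, gamma, beta, mean, var)
qW, w_scale = quantize_int8(W_fused)

x = rng.normal(size=in_ch)
reference = gamma * (W @ x + b - mean) / np.sqrt(var + 1e-5) + beta     # conv followed by BN
fused = W_fused @ x + b_fused                                           # single fused layer
fused_int8 = (qW.astype(np.float64) * w_scale) @ x + b_fused            # int8 weights, dequantized

print(np.allclose(reference, fused))           # True: the fusion itself is exact
print(np.max(np.abs(reference - fused_int8)))  # small error from 8-bit rounding
```

The fusion itself is exact (the first check prints True); only the int8 rounding of the fused weights introduces a small approximation error, which is the trade-off the patent's quantized training loop is designed to manage.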
Of course, that does not absolutely preclude the possibility that Apple are running the NN model on a more efficient SoC.
At least because AI has to be everywhere at the moment, 'no' company can avoid the topic; it has to be worth a try, there is subsidy money, and there have to be conditions for investment in future tech, and the old tech is out of the question. Just yesterday I saw a global analysis of AI and where it is taking place, in addition to research. I couldn't see Germany at a quick glance, but maybe I'll have another look to see if I can find it again. What interests me are products that can actually be bought from companies like those in Germany. Just because companies are running research labs here and dropping slice after slice of white papers on projects, because they have obviously or perhaps lost touch, doesn't convince me that Germany plays a really important role in this topic. So far, I'm taking MB's hub seriously.
Why would the German trinity of automotive suppliers each have established their own AI Labs if they only assembled components?
View attachment 61017
View attachment 61018 View attachment 61019
They spend a lot of moolah on R&D in general and also offer all sorts of engineering services:
Hardware Engineering
We develop customized hardware solutions for a variety of industries. Our portfolio ranges from head-up displays to industrial controls in various sectors.
conti-engineering.com
Research & Development - Looking at the Big Picture - ZF
With digital innovations and continuously high investments in research and development, we are among the leading technology companies.
www.zf.com
Impressive first look and demonstration of the new Atlas.
Our CTO thinks this is great
Marc Theermann on LinkedIn: Meet our next generation Atlas! A humanoid designed for commercial…
Meet our next generation Atlas! A humanoid designed for commercial applications. We are taking all we have learned over 15 years of research on our hydraulic…
www.linkedin.com
View attachment 61029 View attachment 61030
Giving me Buzz vibes