BRN Discussion Ongoing

Bravo

If ARM was an arm, BRN would be its biceps💪!
Could be some opportunities for us here IMO.


VVDN and SecureThings.ai Collaborate to Enhance Cybersecurity for Industry Solutions​

PR Newswire
Wed, November 13, 2024 at 2:00 AM GMT+11


FREMONT, Calif., Nov. 12, 2024 /PRNewswire/ -- VVDN Technologies, a global provider of software, electronics engineering, and product manufacturing services and solutions, today announced it has signed an MoU (Memorandum of Understanding) for Cybersecurity collaboration with SecureThings.ai Pvt Ltd, a leader in Vehicle Cybersecurity Solutions. This collaboration represents a major step forward in delivering robust cybersecurity assurance for Automotive, Networking & Wi-Fi, IoT, and Cloud solutions globally, with a special focus on the connected vehicle ecosystem.




The engagement enables the integration of SecureThings.ai's advanced cybersecurity solutions into automotive products and solutions designed and manufactured by VVDN for its customers, which include Vehicle Connectivity, in-vehicle Infotainment and Instrument Clusters, ADAS solutions, Software-Defined Vehicle (SDV) solutions, Network & Wi-Fi Devices, IoT, and Cloud Services & Solutions. The collaboration aims to ensure that VVDN meets global regulatory requirements, including the adoption of ISO 21434 standards for cybersecurity in the automotive industry.


Key initiatives under this collaboration include:

  • 'Intrusion Detection and Protection Solutions' – to monitor, detect, and prevent cyber-attacks in real time
  • 'Threat Intelligence Services' – for continuous monitoring, vulnerability detection, and correlation with products/solutions developed by VVDN for its customers
  • Setting up a 'Security Research Lab' – to focus on developing customized security concepts that address the cybersecurity requirements of VVDN's customers
  • 'Red Team as a Service' – to conduct independent cybersecurity assessments across multiple customer deployments, ensuring comprehensive protection
Both companies are exploring possibilities for extending the collaboration into other allied domains, further expanding its scope. This long-term engagement highlights their shared vision of driving innovation and excellence in cybersecurity for the automotive and IoT sectors.

Vivek Bansal - Co-founder and President of VVDN Technologies: "Cybersecurity is a prime focus for VVDN. This collaboration with SecureThings.ai underscores our commitment to delivering secure and reliable automotive solutions globally. By integrating SecureThings.ai's cutting-edge Cybersecurity expertise, we aim to set new benchmarks in safeguarding connected vehicles and IoT ecosystems. With the landscape of automotive connectivity rapidly evolving and stringent cybersecurity regulations like R 155/156 and AIS 189/190 in India coming into play, it's crucial to fortify every device on the vehicle with robust cyber assurance. SecureThings will bring invaluable cybersecurity expertise, independent insights, and unbiased security concepts to our solutions. This collaboration marks the beginning of a proactive approach in this direction."

 
  • Like
  • Love
  • Fire
Reactions: 26 users

Frangipani

Regular
Vegas, Baby!

CES 2025 is only 8 weeks away…



[attached screenshots]


Here’s hoping we will soon be greeted with a “Sorry, we’re fully booked!” when revisiting this scheduling platform… 😊
 
  • Like
  • Love
  • Fire
Reactions: 27 users

TheDrooben

Pretty Pretty Pretty Pretty Good
Ok, Larry will say it…
[GIF]
 
  • Haha
  • Like
  • Fire
Reactions: 10 users

Boab

I wish I could paint like Vincent
  • Like
  • Haha
Reactions: 8 users

AARONASX

Holding onto what I've got
I have mentioned Google (aka Alphabet Inc) a few times, and more and more I think they are one to watch, as they would benefit from at least testing Akida, and even more from using it. Any one of the businesses below is a possibility.

What they own:
Waymo – self-driving cars
Wing – drone delivery
Android/Pixel – mobile devices and watches
Fitbit – watches
Nest – home IoT devices, smart home





 
  • Like
  • Fire
  • Love
Reactions: 15 users

Bravo

If ARM was an arm, BRN would be its biceps💪!

You don't say! 🥳

"When developing Gen AI applications, Gartner suggests that operators should focus on using a minimum amount of computing power and look at the viability of other options such as edge computing and smaller language models."


Gartner: Power Shortages Could Limit 40% of AI Data Centres​


By Amber Jackson
November 13, 2024


Gartner: 40% of existing AI data centres will be constrained by power availability by 2027
According to Gartner, power shortages will restrict 40% of AI data centres by 2027, suggesting Gen AI energy consumption will exceed power utility capacity
AI and generative AI (Gen AI) continue to drive rapid increases in electricity consumption, inevitably putting larger pressures on data centres to deliver.
New predictions from Gartner suggest that such rapid growth in data centre energy consumption to accommodate Gen AI will exceed the capacity of power utilities. The organisation suggests that data centre forecasts over the next two years could reach as high as 160% growth.
Likewise, Gartner suggests that 40% of existing AI data centres will be operationally constrained by power availability by 2027.
AI has been cited as the critical cause of such a power surge in the data centre sector. Given that the technology consumes significant levels of power and energy in order to run, it is expected that power demands will continue to rise as a result, putting strain on the grid.

Confronting the rise of AI​

Gartner estimates the power required for data centres to run incremental AI-optimised servers will reach 500 terawatt-hours (TWh) per year in 2027, 2.6 times the 2023 level, as shown in the chart below.
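
For context, the implied 2023 baseline follows directly from those two figures. A quick back-of-envelope check (my own arithmetic, not from the article):

```python
# Implied 2023 baseline from Gartner's 2027 projection (reader's arithmetic).
ai_power_2027_twh = 500        # Gartner: incremental AI-server power in 2027
growth_factor = 2.6            # Gartner: 2.6x the 2023 level
ai_power_2023_twh = ai_power_2027_twh / growth_factor
print(f"Implied 2023 level: ~{ai_power_2023_twh:.0f} TWh/year")  # ~192 TWh
```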
“The explosive growth of new hyperscale data centres to implement Gen AI is creating an insatiable demand for power that will exceed the ability of utility providers to expand their capacity fast enough,” says Bob Johnson, VP Analyst at Gartner. “In turn, this threatens to disrupt energy availability and lead to shortages, which will limit the growth of new data centres for GenAI and other uses from 2026.”


Estimated Incremental Power Consumption of AI Data Centres, 2022-2027 (Source: Gartner)
He adds: “New larger data centres are being planned to handle the huge amounts of data needed to train and implement the rapidly expanding large language models (LLMs) that underpin GenAI applications.
“However, short-term power shortages are likely to continue for years as new power transmission, distribution and generation capacity could take years to come online and won’t alleviate current problems.”
Data centres already accounted for roughly 1-1.5% of total global electricity consumption in 2022, before interest in AI spiked. The International Energy Agency (IEA) indicated in 2024 that electricity demand from data centres could double by 2026, suggesting an urgent need for data centres to confront their energy usage.
Such an overwhelming demand for AI could also lead to carbon dioxide (CO2) emissions from data centres increasing over the next decade. As a result, data centres must ensure they are harnessing AI in a way that doesn’t threaten their sustainability targets.
In the near future, Gartner suggests that the number of new data centres and the growth of Gen AI will be governed by the availability of power to run them. In response, Gartner recommends organisations determine the risks potential power shortages will have on all products and services.
Inevitably, power shortages will lead to an increase in the price of power - which Gartner says will also increase the costs of operating LLMs.


Bob Johnson, VP Analyst at Gartner
“Significant power users are working with major producers to secure long-term guaranteed sources of power independent of other grid demands,” Bob explains. “In the meantime, the cost of power to operate data centres will increase significantly as operators use economic leverage to secure needed power. These costs will be passed on to AI/Gen AI product and service providers as well.”

Evaluating future plans to save sustainability goals​

Data centres require 24/7 power availability, and Gartner notes that renewable sources such as wind or solar cannot provide this on their own without some form of alternative supply.
The firm suggests that reliable 24/7 power can only be generated by either hydroelectric, fossil fuel or nuclear power plants. Therefore, in response to rising emissions, Gartner suggests that businesses must evaluate future plans and anticipate higher power costs.
This is in conjunction with negotiating long-term contracts for data centre services at reasonable rates for power, the organisation explains.
Likewise, Gartner notes that businesses should be factoring in significant cost increases when developing plans for new products and services, whilst also looking for alternative approaches to innovation that require less power.

Through its research, Gartner has indicated that zero-carbon sustainability goals will be negatively affected by short-term solutions to provide more power. As surging demand forces suppliers to increase production, some have kept fossil-fuel plants that had been slated for retirement in operation beyond their planned shutdown.
The company recommends organisations re-evaluate sustainability goals relating to CO2 emissions in light of future data centre requirements and power sources for the next few years.
When developing Gen AI applications, Gartner suggests that operators should focus on using a minimum amount of computing power and look at the viability of other options such as edge computing and smaller language models.
“The reality is that increased data centre use will lead to increased CO2 emissions to generate the needed power in the short-term,” Bob says. “This, in turn, will make it more difficult for data centre operators and their customers to meet aggressive sustainability goals relating to CO2 emissions.”


 
  • Like
  • Fire
  • Love
Reactions: 37 users

7für7

Top 20
Following the daily… no, the hourly share price of BRN is like… something between…

"it's about to skyrocket" vs. "no, it's just a glitch"
[GIF]
 
  • Haha
  • Like
Reactions: 13 users

Esq.111

Fascinatingly Intuitive.
[image]
 
  • Fire
  • Like
  • Love
Reactions: 12 users

Diogenese

Top 20
Hi MrR,

That's an amazingly quick pick-up. Only the abstract is available on Espacenet at the moment.



Methods, systems, and apparatus, including computer programs encoded on computer storage media, for recognizing speech using a spiking neural network acoustic model implemented on a neuromorphic processor are described. In one aspect, a method includes receiving, by a trained acoustic model implemented as a spiking neural network (SNN) on a neuromorphic processor of a client device, a set of feature coefficients that represent acoustic energy of input audio received from a microphone communicably coupled to the client device. The acoustic model is trained to predict speech sounds based on input feature coefficients. The acoustic model generates output data indicating predicted speech sounds corresponding to the set of feature coefficients that represent the input audio received from the microphone. The neuromorphic processor updates one or more parameters of the acoustic model using one or more learning rules and the predicted speech sounds of the output data.
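
Read literally, the abstract boils down to a sense-predict-adapt loop running on the device. A minimal toy sketch of that loop (all names and the learning rule below are hypothetical stand-ins, not the patent's claimed method or any vendor API):

```python
import numpy as np

def extract_feature_coefficients(frame: np.ndarray) -> np.ndarray:
    """Hypothetical front end: turn a raw audio frame into feature
    coefficients (log spectral energies), as the abstract assumes."""
    spectrum = np.abs(np.fft.rfft(frame))
    return np.log1p(spectrum[:40])

class SNNAcousticModel:
    """Toy stand-in for the on-device spiking acoustic model."""
    def __init__(self, n_features: int = 40, n_phones: int = 5):
        self.weights = np.random.default_rng(1).normal(0, 0.01, (n_phones, n_features))

    def predict(self, coeffs: np.ndarray) -> int:
        # Spiking dynamics abstracted away: pick the most active output unit.
        return int(np.argmax(self.weights @ coeffs))

    def update(self, coeffs: np.ndarray, predicted: int, lr: float = 1e-3) -> None:
        # Placeholder learning rule: reinforce the winning unit using the
        # model's own prediction, standing in for the on-chip update.
        self.weights[predicted] += lr * coeffs

model = SNNAcousticModel()
rng = np.random.default_rng(0)
for _ in range(10):                          # pretend microphone frames
    frame = rng.standard_normal(1024)
    coeffs = extract_feature_coefficients(frame)
    phone = model.predict(coeffs)            # on-device inference
    model.update(coeffs, phone)              # on-device adaptation
print("last predicted speech-sound id:", phone)
```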

Now that smells like a duck ...
 
  • Like
  • Love
  • Fire
Reactions: 77 users

Mazewolf

Regular
Neuromorphic drone hunter trialled by US company...
Is this news? Akida potential!
 
  • Like
  • Fire
  • Thinking
Reactions: 18 users

TheDrooben

Pretty Pretty Pretty Pretty Good
My name is Larry and I am a premature BOOMer.......might be the reason for my long balls


[GIF]
 
Last edited:
  • Haha
  • Like
Reactions: 13 users
Waiting for the rocket to launch or the train to leave the station is like waiting for paint to dry, or, in the case of Sydneysiders, waiting for a warm sunny November day.

We just need that signing, Sean.
Come on, get it done ✅
Lol
 
  • Like
  • Fire
Reactions: 16 users

TECH

Regular

[Quoting Bravo's post above: "Gartner: Power Shortages Could Limit 40% of AI Data Centres".]



Hey Bravo,

Nice post!... Exactly the point Peter made when he was acting interim CEO: he produced a report on the energy demand that would be needed to "keep the lights on" at ALL data centres, server farms and ice caves :ROFLMAO::ROFLMAO:

Peter promoted "Beneficial AI" and "Green AI", and while we still adhere to those titles, we added "Essential AI"... all part of the suggestive sales tool to promote what Brainchip truly stands for: making this planet a better place for ALL, a selfless company attempting to make disruptive change for the better.

I'm now back in New Zealand for 6-8 months, or maybe longer; I'll just see how things play out.

I have to share this. I've only been at my property for 2 days, and this afternoon I ran into a couple of guys wandering around the area that I co-own. I asked if I could help them with something, and as the conversation expanded I got around to my favourite subject, AI and Brainchip. These guys were IT specialists checking an issue with the Chinese owner's Wi-Fi, and it turned out one of them knew about Brainchip!

I asked, are you a shareholder? Answer: no... I asked, how did you hear about us then?... "I read about it on the internet"... THEN he tells me his friend works for Mercedes-Benz! My ears prick up and I say, what, in Germany? And he replies, no, in Wellington; he is in the Corporate Division.

I couldn't help myself: did he say anything about Brainchip? Answer: no, but he did say they (Mercedes) are definitely working with AI and lidar etc., which we already knew. But what are the odds? Been here 2 days, and some random person I've never met before knows of Brainchip, isn't a current shareholder and blurts out the words Mercedes-Benz... Time to visit the casino in Auckland (joke) :ROFLMAO::ROFLMAO:

Regards....Tech (Karikari Peninsula) NZ (y):geek:
 
  • Like
  • Love
  • Fire
Reactions: 49 users

Diogenese

Top 20
[Quoting TECH's "Hey Bravo" post above.]
Good place to be because Australia is about to be flooded by a tsunami of cash from BRN.
 
  • Haha
  • Like
  • Love
Reactions: 29 users

Tony Coles

Regular
https://www.embedded.com/top-5-reasons-why-cpu-is-the-best-processor-for-ai-inference/


Link above is from Arm's X (Twitter) account.



29 Oct 2024 / 7:49 am

Top 5 Reasons why CPU is the Best Processor for AI Inference​

Ronan Naughton

Advanced artificial intelligence (AI), like generative AI, is enhancing all our smart devices. However, a common misconception is that these AI workloads can only be processed in the cloud and data center. In fact, the majority of AI inference workloads, which are cheaper and faster to run than training, can be processed at the edge – on the actual devices.
The availability and growing AI capabilities of the CPU across today’s devices are helping to push more AI inference processing to the edge. While heterogeneous computing approaches provide the industry with the flexibility to use different computing components – including the CPU, GPU, and NPU – for different AI use cases and demands, AI inference in edge computing is where the CPU shines.
With this in mind, here are the top five reasons why the CPU is the best target for AI inference workloads.


The benefits of AI inference on the CPU​

Efficiency at the edge​

AI processing at the edge is important to the tech industry because the more AI processing at the edge, the more power is saved by not having to send data traveling to and from the cloud. This leads to significant energy and cost savings. The user also benefits from quicker, more responsive AI inference experiences, as well as greater privacy since data is processed locally. These are particularly important for power-constrained devices and edge applications, such as drones, smart wearables, and smart home devices, where power efficiency, latency, and security are paramount. In this context, the CPU plays a crucial role because it’s able to handle these AI inference tasks in the most efficient way possible.

Versatility for various AI inference tasks​

The CPU’s versatility allows it to handle a wide range of AI inference tasks, especially for applications and devices requiring quick responses and reliable performance. For example, real-time data processing tasks, like predictive maintenance, environmental monitoring, or autonomous navigation, are handled more efficiently and quickly on the CPU. In industrial IoT applications, this ensures that systems can respond to their environment, or any changes in its environment, in milliseconds. This is crucial for safety and functionality.

Great performance for smaller AI Models​

CPUs support a wide range of AI frameworks, like Meta's PyTorch and ExecuTorch and Google AI Edge's MediaPipe, making it easy to deploy large language models (LLMs) for AI inference. These LLMs are evolving at a rapid rate, with exceptional user experiences being unlocked by smaller, more compact models with ever fewer parameters. The smaller the model, the more efficiently and effectively it runs on the CPU.
The availability of smaller LLMs, like the new Llama 3.2 1B and 3B releases, is critical to enabling AI inference at scale. Recently, Arm demonstrated that running the Llama 3.2 3B LLM on Arm-powered mobile devices through the Arm CPU-optimized kernels leads to a 5x improvement in prompt processing and a 3x improvement in token generation.
We are already seeing developers write more compact models to run on low-power processors and even microcontrollers, saving time and costs. Plumerai, which provides software solutions for accelerating neural networks on Arm Cortex-A and Cortex-M systems-on-chip (SoCs), runs just over 1MB of AI code on an Arm-based microcontroller that performs facial detection and recognition. Keen to preserve user privacy, all inference is done on the chip, so no facial features or other personal data are sent to the cloud for analysis.
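
To make the deployment story concrete: CPU-only inference with one of these small models is only a few lines in a mainstream framework. A minimal sketch using Hugging Face Transformers (the model ID is illustrative, and Llama weights are gated behind an access request):

```python
# CPU-only text generation with a small LLM (model ID illustrative/gated).
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="meta-llama/Llama-3.2-1B",  # small model, per the article's point
    device=-1,                        # -1 pins inference to the CPU
)
out = generator("Edge AI matters because", max_new_tokens=40)
print(out[0]["generated_text"])
```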

Greater flexibility and programmability for developers​

The software community is actively choosing the CPU as the preferred path for targeting their AI workloads due to its flexibility and programmability. The greater flexibility of CPUs means developers can run a broader range of software in a greater variety of data formats without requiring developers to build multiple versions of their code. Meanwhile, every month there are new models with different architectures and quantization schemes emerging. As the CPU is highly programmable, these new models can be deployed on the CPU in a matter of hours.

The architecture foundation for AI innovation​

This developer innovation is built on the foundation of the CPU architecture, which continuously adds new features and instructions to process more advanced AI workloads. The ubiquity of the CPU means developers can then access these capabilities to accelerate and innovate AI-based experiences even further. In fact, the ongoing evolution of the CPU architecture has directly corresponded with the evolution of applications that are now faster and more intelligent.

Why CPUs for AI inference are indispensable​

CPUs are not just a component of system-on-chip (SoC) designs; they enable AI to be practical, efficient, and accessible across a wide variety of edge applications and devices. Offering a unique blend of efficiency, versatility, and accessibility, CPUs are indispensable for AI inference. They help reduce energy consumption and latency by processing AI tasks at the edge while delivering faster, more responsive AI experiences for the end user. As AI continues to evolve and permeate every aspect of technology, the role of CPUs in processing AI inference workloads will only grow, ensuring that AI can be deployed widely and sustainably across industries.
 
  • Like
  • Fire
  • Love
Reactions: 18 users

TECH

Regular
[Quoting Tony Coles' post above: "Top 5 Reasons why CPU is the Best Processor for AI Inference".]

Hi Tony,

The "real ace" that Brainchip has up its sleeve is native SNNs, which to this point has never been brought to the surface.

From my understanding, the performance level increases dramatically in all aspects, but to date we have only ever engaged with clients who have used the CNN2SNN tool... though maybe I'm behind the times, since TENNs raised their beautiful head?

We were always talking about battery-operated (handheld) devices for the healthcare sector, which is an area I personally think we will excel in, just as with the space industry, but native SNNs were always the end game.
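
For anyone who hasn't touched it, the CNN2SNN flow looks roughly like this (a sketch from memory of older MetaTF releases; the function names and arguments are assumptions and vary by version, so check the current docs):

```python
# Rough shape of the CNN-to-SNN conversion flow (from memory; the exact
# MetaTF API differs between releases, so treat this as a sketch only).
from tensorflow import keras
from cnn2snn import quantize, convert   # BrainChip MetaTF tools (assumed API)

keras_model = keras.applications.MobileNet(weights=None)   # any trained CNN
quantized = quantize(keras_model,                # low-bit weights/activations
                     weight_quantization=4,
                     activ_quantization=4)
akida_model = convert(quantized)                 # maps to an Akida-ready SNN
```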

But as I say, maybe I'm not up to speed with what Tony and Sean have planned?... Where's Peter when I need to have a chat :ROFLMAO:

Kind regards....Tech

P.S. Thanks for your support all these years Tony (y)
 
  • Like
  • Fire
  • Love
Reactions: 31 users
This is truly intriguing:

Our partner Neurobus is now apparently partnered with Mercedes-Benz!!!
Of course that doesn’t necessarily mean we are involved in this, too, but in my eyes this is definitely the best news in recent weeks with regards to Mercedes-Benz… Keep in mind, though, that Neurobus is also partnered with Intel.


View attachment 72756


There is no info yet on the Neurobus website about this partnership with MB…

View attachment 72757

… but I found a Neurobus job ad (no longer active) for a Project Manager position that apparently mentioned that partnership, too…

View attachment 72758

View attachment 72759

… as well as this on the Neurobus website:

View attachment 72760

Plus this on LinkedIn:

View attachment 72761


I suspect we will find out more details soon…
….. and boom 💥 the detective dot-joining continues, with 3 familiar monikers in the Partner column. Go Frangi…..
 
  • Like
  • Fire
  • Love
Reactions: 13 users

Getupthere

Regular
  • Like
Reactions: 4 users

IloveLamp

Top 20
  • Like
  • Fire
Reactions: 16 users