Why We Need to Re-Engineer AI to Work Like the Brain to Save on Energy
Dr. Tehseen Zia
Editor
Last updated: 17 August 2023
NEUROMORPHIC COMPUTING: REDEFINING AI FOR A SUSTAINABLE FUTURE
Recent AI progress poses energy challenges. Balancing advancements with sustainability is crucial. Traditional computing faces limitations, while the brain's efficiency inspires Neuromorphic Computing (NC) for energy-efficient AI. Leading companies drive innovative NC technologies, reflecting the fast-growing demand for energy-saving AI.
As artificial intelligence (AI) progresses, its accomplishments carry steep energy costs. One study predicts that if data growth persists, the cumulative energy consumed by binary operations could surpass 10^27 Joules by 2040, more than the world can generate.
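To give that figure scale, here is a rough back-of-envelope sketch. It assumes an average cost of about 10^-14 joules per binary operation, an illustrative benchmark figure that does not appear in this article:

```python
# Back-of-envelope check of the 2040 projection (illustrative numbers only).
# Assumption: ~1e-14 J per binary operation, a benchmark figure sometimes
# used in such forecasts; it is not stated in this article.
ENERGY_PER_OP_J = 1e-14        # assumed joules per binary operation
CUMULATIVE_ENERGY_J = 1e27     # the projection cited above

ops = CUMULATIVE_ENERGY_J / ENERGY_PER_OP_J
print(f"Cumulative binary operations implied: {ops:.1e}")  # ~1e41 operations

# For scale: world electricity generation is on the order of 1e20 J per year,
# so 1e27 J would dwarf decades of global output.
```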
So let’s explore AI’s environmental impact, the constraints of conventional computing models, and how Neuromorphic Computing (NC) draws inspiration from the energy-efficient human brain to enable sustainable AI advancements.
Artificial Intelligence: The Dilemma
In recent years, artificial intelligence (AI) has achieved remarkable landmarks, exemplified by the evolution of language models like ChatGPT and advancements in computer vision that empower autonomous technology and elevate medical imaging. Moreover, AI’s astonishing proficiency in reinforcement learning, as seen in its victories over human champions in games like Chess and Go, underscores its capabilities.
While these developments have enabled AI to transform industries, foster business innovation, uncover scientific breakthroughs, and make a strong impression on society, they are not without consequences.
Alongside that alarming forecast for 2040, even today the storage of extensive datasets and the training of AI models on them demand significant energy and computational resources.
Hence, as AI continues to develop, it becomes vital to balance advancement against energy requirements and their environmental effects.
Von Neumann architecture: The Bottleneck
AI models operate within the framework of the Von Neumann architecture, a computer design that separates processing and memory, requiring constant communication between the two.
As AI models grow more complex and datasets expand, this architecture faces significant hurdles.
Firstly, the processing and memory units share a communication bus, which slows AI computations and hampers training speed.
Secondly, the processing unit lacks parallel processing capabilities, which further limits training throughput.
While GPUs alleviate the issue by enabling parallel processing, they introduce data-transfer overhead of their own. Frequent data movement incurs additional cost as it traverses the memory hierarchy, and large datasets lead to long memory-access times and saturated memory bandwidth, resulting in performance bottlenecks.
Complex AI models strain Von Neumann systems, pushing against their memory and processing capacities. These limitations have given rise to high energy demands and carbon emissions in AI systems.
Addressing these challenges is crucial for optimizing AI performance and minimizing environmental impact.
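To make the data-movement problem concrete, here is a minimal sketch. The per-operation energy figures are assumptions, rough numbers of the kind often quoted in the computer-architecture literature, not measurements of any specific chip:

```python
# Minimal sketch of why off-chip data movement, not arithmetic, dominates
# energy on a Von Neumann machine. The per-operation energies below are
# assumptions (rough figures from the literature), not measurements.
FLOP_ENERGY_PJ = 4.0      # assumed cost of one 32-bit floating-point op
DRAM_ACCESS_PJ = 640.0    # assumed cost of fetching one 32-bit word from DRAM

def kernel_energy_pj(n_flops: int, n_dram_words: int) -> float:
    """Total energy (pJ) for a kernel with given compute and DRAM traffic."""
    return n_flops * FLOP_ENERGY_PJ + n_dram_words * DRAM_ACCESS_PJ

# A memory-bound kernel: one DRAM word fetched per arithmetic operation.
n = 1_000_000
total = kernel_energy_pj(n, n)
arithmetic_share = n * FLOP_ENERGY_PJ / total
print(f"arithmetic is only {arithmetic_share:.1%} of total energy")  # ~0.6%
```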
Biological Brain: The Inspiration
The human brain is more powerful than any AI machine when it comes to cognitive abilities.
Despite this capability, the brain is incredibly light and operates on just 10 W of power, in contrast to the energy-hungry machines we use today.
According to one estimate, even this modest power budget allows the brain to achieve an astonishing 1 exaflop, equivalent to 1,000 petaflops, a feat the world’s fastest supercomputer struggles to match, delivering about 200 petaflops on 30 megawatts of power.
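Taking the figures above at face value, a quick calculation shows the scale of the efficiency gap:

```python
# Worked comparison using the figures quoted above (treat them as rough
# estimates, not measurements).
brain_flops, brain_watts = 1e18, 10      # ~1 exaflop on ~10 W
super_flops, super_watts = 2e17, 30e6    # ~200 petaflops on ~30 MW

brain_eff = brain_flops / brain_watts    # operations per second per watt
super_eff = super_flops / super_watts
print(f"brain:         {brain_eff:.1e} flop/s per W")
print(f"supercomputer: {super_eff:.1e} flop/s per W")
print(f"brain is roughly {brain_eff / super_eff:.0e}x more energy-efficient")
```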
The brain’s secret lies in its neurons, which integrate processing and memory, unlike the Von Neumann architecture.
The brain processes information in a massively parallel manner, with billions of neurons and trillions of synapses working simultaneously. Despite its remarkable intricacy, the brain remains compact and economical in its energy usage.
What is Neuromorphic Computing?
Neuromorphic computing (NC) is a branch of computing technology inspired by the structure and functioning of the human brain’s neural networks.
It seeks to design and develop computer architectures and systems that mimic the parallel and distributed processing capabilities of the brain, enabling energy-efficient processing of complex tasks.
This approach aims to overcome the limitations the Von Neumann architecture imposes on AI tasks, particularly by co-locating memory and processing.
To comprehend NC, it is vital to understand how the brain works. Neurons, the building blocks of the brain, communicate via electrical signals to process information.
Upon receiving signals from interconnected neurons, they process them and emit impulses of their own.
These impulses travel along pathways formed by neurons, with synapses – gaps between neurons – facilitating the transmission.
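A common abstraction of this integrate-then-fire behavior is the leaky integrate-and-fire (LIF) neuron. The sketch below is a toy model with illustrative constants, not the neuron model of any particular neuromorphic chip:

```python
# Minimal leaky integrate-and-fire (LIF) neuron: the usual abstraction of
# the integrate-then-spike behavior described above. Constants are illustrative.
LEAK = 0.9          # fraction of membrane potential retained each step
THRESHOLD = 1.0     # potential at which the neuron fires a spike
V_RESET = 0.0       # potential after a spike

def lif_step(v: float, input_current: float) -> tuple[float, bool]:
    """Advance the neuron one time step; return (new potential, spiked?)."""
    v = LEAK * v + input_current
    if v >= THRESHOLD:
        return V_RESET, True       # emit a spike and reset
    return v, False

v, spikes = 0.0, []
for input_current in [0.3, 0.3, 0.3, 0.3, 0.0, 0.6, 0.6]:
    v, fired = lif_step(v, input_current)
    spikes.append(fired)
print(spikes)   # the neuron fires only once enough input has accumulated
```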
Within the framework of NC, analog memristors are used to replicate the function of synapses, storing information by adjusting their resistance.
The rapid, spike-based communication between neurons is typically modeled using Spiking Neural Networks (SNNs).
These SNNs link spiking neurons using artificial synaptic devices, such as memristors, which employ analog circuits to mimic brain-like electrical signals.
These analog circuits offer significantly higher energy efficiency compared to the conventional Von Neumann architecture.
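To illustrate the idea, here is a toy model of compute-in-memory on a memristor crossbar: programmed conductances act as synaptic weights, applied voltages are inputs, and the column currents read out a matrix-vector product right where the weights are stored. The values are purely illustrative:

```python
# Toy model of compute-in-memory on a memristor crossbar. Each memristor's
# conductance G[i][j] acts as a stored synaptic weight; applying input
# voltages V[i] to the rows yields column currents I[j] = sum_i G[i][j]*V[i]
# (Ohm's law per device, Kirchhoff's current law per column). The
# multiply-accumulate happens where the weights live, with no memory fetch.
G = [  # conductances (siemens) programmed into the array; illustrative values
    [0.10, 0.05],
    [0.02, 0.08],
    [0.07, 0.01],
]
V = [1.0, 0.5, 0.2]  # input voltages applied to the three rows

I = [sum(G[i][j] * V[i] for i in range(len(V))) for j in range(len(G[0]))]
print(I)  # column currents = the matrix-vector product, read out in one step
```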
Neuromorphic Technologies
The rise of AI is boosting the demand for neuromorphic computing.
The global neuromorphic computing market is expected to grow from USD 31.2 million in 2021 to around USD 8,275.9 million by 2030, an impressive CAGR of 85.73%. In response, companies are advancing neuromorphic technologies, such as:
• IBM’s TrueNorth: Introduced in 2014, it’s a neuromorphic CMOS integrated circuit with 4,096 cores, over a million neurons, and 256 million synapses. TrueNorth sidesteps the Von Neumann bottleneck while consuming only 70 milliwatts.
• Intel’s Loihi: Unveiled in 2017, Loihi features 131,072 simulated neurons and has demonstrated energy efficiency 30 to 1,000 times greater than CPUs/GPUs on certain neural network workloads.
• BrainChip’s Akida NSoC: Using a spiking neural network architecture, it integrates 1.2 million neurons and 10 billion synapses. Akida supports real-time, low-power AI applications like video object detection and speech recognition.
These innovations signal the rapid evolution of neuromorphic computing to meet AI demands.
Challenges of Neuromorphic Computing
Realizing the potential of NC in AI demands addressing specific challenges.
Firstly, the development of efficient algorithms compatible with neuromorphic hardware is crucial. This requires a deep understanding of hardware operations and tailored adaptations.
Secondly, NC must scale to larger, more intricate datasets; present NC experiments involve relatively modest ones.
As dataset size and complexity expand, NC’s computational demands increase.
The challenge lies in designing NC systems capable of meeting these demands while delivering precise and effective solutions.
Despite encouraging outcomes from smaller-scale tests, NC’s performance with larger and more intricate datasets remains untested.
Further research and development are essential to optimize the technology for practical applications.
The Bottom Line
Neuromorphic Computing (NC) draws inspiration from the brain’s neural networks to revolutionize AI with energy efficiency.
As AI advances bring environmental concerns, NC offers an alternative by mimicking the brain’s parallel processing.
Unlike the Von Neumann architecture, which hampers efficiency, NC co-locates memory and processing, overcoming bottlenecks.
Innovations like IBM’s TrueNorth, Intel’s Loihi, and BrainChip’s Akida NSoC showcase the potential of neuromorphic technologies.
Challenges persist, including algorithm adaptation and scalability to larger datasets. As NC evolves, it promises energy-efficient AI solutions with sustainable growth potential.