Can the next generation of AI run at 20 watts?
Scientists are targeting neuromorphic computing
Faced with the AI energy crisis, scientists are looking to neuromorphic computing for a breakthrough, aiming to make AI run as efficiently as the human brain. By modeling the structure of the brain, researchers are designing a supercomputer that would consume only 10 kilowatts of power and occupy two square meters, yet run 250,000 to 1 million times faster than a biological brain. Neuromorphic computing mimics the structure of the brain's neural networks and uses event-driven communication and in-memory computing to achieve efficient, energy-saving operation.
"Westworld" is really coming! Scientists are trying to equip AI with human brains.
The latest progress comes from a US national laboratory, where scientists are trying to bring science fiction into reality: building a supercomputer that occupies only two square meters yet contains a number of neurons comparable to the human cerebral cortex.
Even more remarkable, calculations show that this neuromorphic computer could run 250,000 to 1 million times faster than a biological brain while consuming only 10 kilowatts of power (only slightly more than a household air conditioner), which is undoubtedly a shot in the arm for AI's current predicament.
Artificial intelligence is currently facing an "energy crisis". With the explosive development of technologies such as large language models, its astonishing power consumption has become a heavy burden that cannot be ignored.
Projections suggest that by 2027, the electricity bill alone for running these models could reach $25 trillion, more than U.S. GDP for that year.
In comparison, the most powerful intelligence in nature, the human brain, runs on only about 20 watts, roughly the power of a household LED light bulb. Scientists can't help but wonder: can AI be as efficient as the human brain?
The answer: neuromorphic computing.
This cutting-edge technology, which aims to simulate the structure and operation of the human brain, is being seen as a key direction for the next generation of AI. One of its core goals is to drive powerful intelligence with "light bulb-level" energy consumption.
Neuromorphic computing: learning from the brain
In the human brain, about 86 billion neurons work together, building a vast signal-transmission network through roughly 100 trillion synapses.
Inspired by their structure and function, neuromorphic computing builds energy-efficient electronic and photonic networks that mimic biological neural networks, namely spiking neural networks (SNNs), and aims to integrate memory, processing, and learning into a unified design.
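To make the idea concrete, here is a minimal sketch of a single leaky integrate-and-fire (LIF) neuron, the basic building block used in most software models of SNNs. The time constant, threshold, and input values are illustrative choices for this example, not parameters of any system mentioned in the article.

```python
import numpy as np

# Minimal leaky integrate-and-fire (LIF) neuron: the membrane potential leaks
# toward its resting value, integrates incoming current, and emits a spike
# when it crosses a threshold. All constants are illustrative, not taken
# from any system mentioned in the article.
def simulate_lif(input_current, dt=1.0, tau=20.0, v_rest=0.0,
                 v_threshold=0.5, v_reset=0.0):
    v = v_rest
    spikes = []
    for i_t in input_current:
        # Leak toward rest and integrate the input current.
        v += (-(v - v_rest) + i_t) * (dt / tau)
        if v >= v_threshold:        # threshold crossed: emit a spike
            spikes.append(1)
            v = v_reset             # reset the membrane after the spike
        else:
            spikes.append(0)
    return np.array(spikes)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    current = rng.uniform(0.0, 1.5, size=200)   # noisy input drive
    spike_train = simulate_lif(current)
    print("spikes emitted:", int(spike_train.sum()))
```

Because such a neuron only does meaningful work when input arrives or a spike is emitted, hardware built around this model can sit idle most of the time, which is where the energy savings described in this article come from.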
Its main features include:
Event-driven communication: only the circuits involved in a given spike or event are activated, which reduces power consumption (see the sketch after this list).
In-memory computing: data is processed where it is stored, reducing transfer delays.
Adaptability: the system learns and evolves on its own over time, without the need for centralized updates.
Scalability: the architecture scales readily to larger and more complex networks without a significant increase in resource requirements.
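As a rough illustration of the event-driven idea, and not a model of any specific chip named here, the sketch below propagates spikes through a small random network and only updates the neurons that actually receive an event in a given step. The network size, fan-out, and threshold are made-up values.

```python
import random

# Toy event-driven spike propagation: in each step only the neurons that
# actually receive a spike are updated, instead of sweeping over all N
# neurons on every clock tick. Network size, fan-out, and threshold are
# made-up illustrative values, not taken from any real chip.
random.seed(0)
N = 10_000                                   # neurons
FANOUT = 3                                   # outgoing synapses per neuron
THRESHOLD = 1.0                              # this toy fires on every incoming spike
synapses = {i: random.sample(range(N), FANOUT) for i in range(N)}
potential = [0.0] * N

active = set(random.sample(range(N), 5))     # a few initial spikes
event_updates = 0
steps = 5
for _ in range(steps):
    next_active = set()
    for src in active:                       # only spiking neurons do any work
        for dst in synapses[src]:
            event_updates += 1               # one synaptic update per event
            potential[dst] += 1.0
            if potential[dst] >= THRESHOLD:
                next_active.add(dst)
                potential[dst] = 0.0         # reset after firing
    active = next_active

dense_updates = N * steps                    # a clock-driven sweep touches every neuron
print(f"event-driven updates: {event_updates}, dense sweep would need: {dense_updates}")
```

The final print line is the point of the exercise: a clock-driven design updates every neuron on every tick, while the event-driven loop only pays for the spikes that actually occur.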
Unlike current AI models, which rely on binary processing on conventional supercomputers, a neuromorphic system can adjust dynamically based on its understanding of the world, making it smarter, more flexible, and less susceptible to interference.
For example, when a tester wearing a T-shirt printed with a stop sign walked in front of an autonomous car, the car controlled by traditional AI stopped because it could not read the context.
A neuromorphic computer, in contrast, processes information through feedback loops and context-driven checks, so it can work out that the stop sign is on the T-shirt and let the car continue driving.
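The article does not describe the actual system behind this anecdote, but the kind of context check it alludes to can be sketched roughly as follows; the detection format and the containment rule are invented purely for illustration.

```python
from dataclasses import dataclass

# Purely illustrative context check, not the system from the anecdote:
# a stop-sign detection that sits inside a detected person is treated as
# printed on clothing rather than as a real roadside sign.
@dataclass
class Detection:
    label: str
    box: tuple  # (x_min, y_min, x_max, y_max) in hypothetical pixel coordinates

def inside(inner, outer):
    return (outer[0] <= inner[0] and outer[1] <= inner[1]
            and inner[2] <= outer[2] and inner[3] <= outer[3])

def real_stop_sign(detections):
    people = [d.box for d in detections if d.label == "person"]
    for d in detections:
        if d.label == "stop_sign":
            # Context check: ignore signs contained in a person's bounding box.
            if not any(inside(d.box, p) for p in people):
                return True
    return False

scene = [Detection("person", (100, 50, 200, 400)),
         Detection("stop_sign", (130, 120, 170, 160))]
print(real_stop_sign(scene))   # False: the sign is printed on the pedestrian's shirt
```

A real system would combine many such cues through feedback rather than a single geometric rule; the sketch only shows the shape of the idea.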
This difference is not surprising. After all, neuromorphic computing simulates the most efficient and powerful reasoning and prediction engine in nature. Scientists therefore believe that the next wave of the AI explosion will have to combine physics and neuroscience.
Prospects for a new round of technological revolution
At present, the research is in full swing. Existing neuromorphic computers already have more than 1 billion neurons connected by more than 100 billion synapses. That is still a drop in the bucket compared to the complexity of the human brain, but it is reasonable evidence that the technology can scale toward brain-level size.
Jeff Shainline of the National Institute of Standards and Technology said:
Once we can implement the complete process of creating a network in a commercial foundry, we can quickly scale up to very large systems. If we can make one neuron, it will be fairly easy to make a million.
Technology companies such as IBM and Intel are at the forefront of this technological revolution.
The TrueNorth chip developed by IBM in 2014 and the Loihi chip launched by Intel in 2018 are both hardware products designed to simulate the neural activity of the brain, paving the way for subsequent new AI models.
In addition, startups focused on neuromorphic computing are beginning to emerge. For example, BrainChip has launched the Akida neuromorphic processor, designed for low-power but capable edge AI, with wide potential use in always-on smart home, factory, or city sensors.
At the same time, according to The Business Research Company, the global neuromorphic computing market is expected to reach US$1.81 billion by 2025, a compound annual growth rate of 25.7%.
In the longer term, scientists hope that neuromorphic computing will transcend the traditional boundaries of artificial intelligence, move closer to human patterns of reasoning, and bring new breakthroughs for the next generation of intelligent systems and even AGI.