Article pasted below Diana Deca’s comment roasting Loihi
[Linked LinkedIn post: "MATRIX - The UTSA AI Consortium for Human Well-Being recently hosted a presentation by Yulia Sandamirskaya with Intel Corporation that discussed some of the…" (www.linkedin.com)]
The article:
Brain-inspired chips promise ultra-efficient AI, so why aren’t they everywhere?
It's not because our AI overlords aren't keen on the idea
Tobias Mann
Mon 12 Sep 2022 // 10:23 UTC
INTERVIEW Every time a chipmaker or researcher announces an advancement in neuromorphics, it's inevitably the same story: a brain-like AI chip capable of stupendous performance-per-watt compared to traditional accelerators.
Intuitively, the idea makes a lot of sense. Our brains are pretty good at making sense of the world, so why wouldn't a chip designed to work like them be good at it too?
Yet, after years of development and the backing of massive tech companies like IBM and Intel, these brain-like chips are still years away from making their way into consumer products.
That hasn't stopped the tech from grabbing headlines over the years. Neuromorphic chips up to 16 times more efficient; brain-like chips potentially powering future supercomputers; Samsung wanting to reverse engineer the brain; IBM recreating a frog brain in silicon. You get the idea.
While the chips show promise, the reality is that the field of neuromorphics is still in a very experimental stage and faces many challenges that must be resolved before these chips are ready for prime time, explains Karl Freund, principal analyst at Cambrian AI Research, in an interview with The Register.
This may be one of the reasons many of the more promising neuromorphic processors have seemingly stalled.
IBM, for example, hasn't given an update on its TrueNorth neuromorphic chips, which are capable of simulating more than a million neurons, in more than four years. SpiNNaker, another promising spiking neural network processor, received an €8 million (c. $8.15 million) grant in 2019 to develop a second-gen chip based on the design. However, the company behind the chip, Dresden, Germany-based SpiNNcloud, is only now getting off the ground.
Intel's Loihi and Loihi 2 processors have come the closest to a commercial launch, insofar as Intel has made development boards available to outside researchers alongside its Lava software development kit.
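For a sense of what programming these chips actually involves, here is a minimal sketch in the style of Lava's public tutorials: two small populations of leaky integrate-and-fire (LIF) neurons joined by a dense weight matrix, run on Lava's CPU simulator rather than on Loihi silicon. It is an illustration only; process and parameter names can shift between Lava releases, and the values here are not tuned for any real workload.

```python
# A minimal spiking network in Intel's open-source Lava framework,
# modeled on Lava's public LIF tutorials. This runs on the CPU
# simulator backend, not on Loihi hardware, and all parameter values
# are illustrative rather than tuned for any real task.
import numpy as np
from lava.proc.lif.process import LIF        # leaky integrate-and-fire neurons
from lava.proc.dense.process import Dense    # dense synaptic connections
from lava.magma.core.run_conditions import RunSteps
from lava.magma.core.run_configs import Loihi1SimCfg

# Three LIF neurons driven by a constant bias, projecting onto two more.
pre = LIF(shape=(3,), du=0, dv=0, bias_mant=3, vth=10)
conn = Dense(weights=np.eye(2, 3))           # 3-to-2 synaptic weight matrix
post = LIF(shape=(2,), du=0, dv=0, vth=10)

# Route spikes out of `pre` through the synapses into `post`.
pre.s_out.connect(conn.s_in)
conn.a_out.connect(post.a_in)

# Simulate 20 timesteps, then inspect the downstream membrane voltages.
pre.run(condition=RunSteps(num_steps=20), run_cfg=Loihi1SimCfg())
print(post.v.get())
pre.stop()
```

The event-driven style, in which neurons communicate only when they spike, is where the efficiency claims come from; it is also a very different programming model from the dense matrix math that today's ML tooling is built around.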
The Department of Energy's Sandia National Laboratories, for example, is exploring how these chips could be used to accelerate supercomputers. In a paper published in the journal Nature Electronics, researchers at Sandia demonstrated how Intel's Loihi chips "can solve more complex problems than those posed by artificial intelligence and may even earn a place in high-performance computing."
Yet, at least as of April, Intel has no plans to productize its Loihi chips anytime soon.
What's the holdup?
So what gives? Why haven't these chips, which show such promise in the lab, matured faster, given the insatiable demand for AI/ML?
According to Freund, one of the biggest problems comes down to funding.
"I tried to connect some venture capitalists in both neuromorphic and analog [computing] and a fairly consistent response even before the current capital crunch was 'we don't invest in research'," he says. "Their take is pretty much the same as mine, which is most of the technologies, perhaps all, are still in the research phase."
As a result, progress in productizing neuromorphic computing has been limited to large companies with deep R&D budgets, he said.
But it's not just funding that's getting in the way. Freund argues the scope of the problem for neuromorphics has only gotten larger as the tech has grown more mature.
With the first neuromorphic test chips, scientists were primarily focused on getting to a point where they could do useful work, he explains.
However, productizing such a chip means solving other problems, like how you get data in and out of the chip effectively.
This isn't a problem unique to neuromorphics by any means. It's one associated with quantum computing and even traditional accelerators, which have accumulated bottlenecks in recent generations due to the speed at which data can be pre- and post-processed and/or ingested and egressed from the chip, Freund explained.
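To see why the interface can dominate, a generic back-of-envelope comparison helps. The figures below are invented for illustration; they are not measurements of Loihi or any other part.

```python
# Back-of-envelope sketch of the data-movement bottleneck described
# above. Both figures are assumptions chosen for illustration, not
# specs of any real accelerator.
PEAK_OPS_PER_S = 100e12       # assume a 100 TOPS accelerator
IO_BYTES_PER_S = 50e9         # assume 50 GB/s host-to-chip bandwidth

def bound(ops: float, bytes_moved: float) -> str:
    """Report whether a workload is limited by compute or by I/O."""
    t_compute = ops / PEAK_OPS_PER_S
    t_io = bytes_moved / IO_BYTES_PER_S
    verdict = "I/O-bound" if t_io > t_compute else "compute-bound"
    return f"compute {t_compute * 1e3:.2f} ms, I/O {t_io * 1e3:.2f} ms -> {verdict}"

# 2 GOPs over 400 MB of data: the chip mostly waits on the interface,
# so the headline performance-per-watt never materializes in practice.
print(bound(ops=2e9, bytes_moved=400e6))    # -> I/O-bound
print(bound(ops=2e12, bytes_moved=4e6))     # -> compute-bound
```

The same arithmetic applies whether the engine is a GPU, a quantum machine's classical front end, or a neuromorphic mesh: efficiency gains inside the chip buy little if the interface can't keep it fed.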
Finally, there's the issue of developing software that can take advantage of these accelerators.
"It's really going to take a whole community of researchers to solve the programmability problem of neuromorphic computing," Freund says.
Traditional accelerators are good enough
Perhaps the biggest reason that neuromorphic computers haven't taken over is that traditional accelerators are simply getting more powerful and more efficient quickly enough.
"What they're finding is that platforms, like Nvidia Jetson Orin or some new novel platforms from startups, are solving the problem really quickly. So the need to do something super exotic is continuing to lessen as the state of the art in existing technologies evolves," Freund says. "If you look at what Qualcomm has done with their AI engine, you're talking milliwatts… and what it does when you take a photograph is astounding."
As a result, meaningful problems can be solved within the power envelope of existing digital technologies.
While neuromorphics may not be ready to replace traditional accelerators anytime soon, Freund believes the technology will eventually reach the mainstream.
"These things do take time to mature," he says, citing the rise of Arm processors in the datacenter as something that took more than 10 years to achieve. "And that was for CPUs; CPUs are easy compared to things like quantum and neuromorphic computing." ®