DingoBorat
Slim
From 9:45 in the video they talk about the new KAIST chip, still calling it "The World's First".
This video was posted just over 20min ago.
If someone had written that about me, I would have used the…
A horrible and dumb thing to say that makes me also feel like I don't want to spend much time on this forum.
Interesting development.
If this is accurate, and our lead in the neuromorphic SNN space is correct…?
Then we would be part of this…?
If this is accurate and we are not part of it…
KAIST researchers develop world's first 'neuromorphic' AI chip
koreajoongangdaily.joins.com
Neuromorphic computing technology aims to develop integrated circuits that mimic the human nervous system so that chips can perform more sophisticated tasks. [SHUTTERSTOCK]
A research team at KAIST has developed the world’s first AI semiconductor capable of processing a large language model (LLM) with ultra-low power consumption using neuromorphic computing technology.
The technology aims to develop integrated circuits that mimic the human nervous system so that chips can perform more sophisticated tasks requiring adaptation and reasoning, with far less energy consumption.
The Science Ministry said Wednesday that the team, led by Prof. Yoo Hoi-jun at the KAIST PIM Semiconductor Research Center, developed a “Complementary-Transformer” AI chip that processes GPT-2 with an ultra-low power consumption of 400 milliwatts and a processing time of just 0.4 seconds.
A rendered image comparing the performance of different types of processors [YONHAP]
The 4.5-millimeter-square chip, developed using Korean tech giant Samsung Electronics' 28-nanometer process, consumes 625 times less power than global AI chip giant Nvidia's A100 GPU, which requires 250 watts to process LLMs, the ministry explained.
The chip is also 41 times smaller in area than the Nvidia model, enabling it to be used in devices like mobile phones and thereby better protecting user privacy.
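For what it's worth, those two ratios are internally consistent with the quoted figures. A quick sanity check in Python, assuming the A100's die is roughly 826 mm² (a commonly cited figure that the article doesn't state) and reading "4.5-millimeter-square" as a 4.5 mm × 4.5 mm die:

```python
# Rough sanity check of the article's comparison figures, not an official calculation.
a100_power_w = 250.0          # quoted A100 power draw
kaist_power_w = 0.4           # 400 milliwatts, as reported

a100_area_mm2 = 826.0         # assumed A100 (GA100) die area, not from the article
kaist_area_mm2 = 4.5 ** 2     # 4.5 mm x 4.5 mm = 20.25 mm^2

print(f"power ratio: {a100_power_w / kaist_power_w:.0f}x")   # ~625x, matching the article
print(f"area ratio:  {a100_area_mm2 / kaist_area_mm2:.0f}x") # ~41x, matching the article
```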
The KAIST team has demonstrated various language processing tasks with its LLM accelerator on Samsung's latest smartphone, the Galaxy S24, Kim Sang-yeob, a researcher on the team, told reporters at a press briefing. The S24 is the world's first smartphone with on-device AI, featuring real-time translation for phone calls and improved camera performance.
The ministry said the utilization of neuromorphic computing technology, which functions like a human brain, specifically spiking neural networks (SNNs), is essential to the achievement.
Previously, the technology was less accurate than deep neural networks (DNNs) and mainly capable of simple image classification, but the research team succeeded in raising its accuracy to match that of DNNs so it could be applied to LLMs.
The team said its new AI chip optimizes computational energy consumption while maintaining accuracy by using a unique neural network architecture that fuses DNNs and SNNs and by effectively compressing the large parameters of LLMs.
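The article doesn't explain how the Complementary-Transformer splits work between the two styles of network, but the general intuition behind spiking layers is that work is only done for neurons that actually fire. A toy sketch of that idea, purely illustrative and not KAIST's actual design:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(512)            # dense input activations
w = rng.standard_normal((512, 512))     # weight matrix

# DNN-style layer: every weight takes part in a multiply-accumulate.
dense_out = w @ x
dense_macs = w.size                     # 512 * 512 = 262,144 MACs

# SNN-style layer: inputs are binary spikes, so only columns whose input neuron
# fired contribute, and each contribution is an addition rather than a multiply.
spikes = x > 1.0                        # ~16% of neurons fire at this threshold
snn_out = w[:, spikes].sum(axis=1)      # equivalent to w @ spikes for 0/1 inputs
snn_adds = int(spikes.sum()) * w.shape[0]

print(f"dense MACs: {dense_macs}, spiking adds: {snn_adds} "
      f"({snn_adds / dense_macs:.0%} of the dense work)")
```

The catch, per the article, is that spiking networks have historically been less accurate than DNNs, which is what the KAIST team says it fixed.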
“Neuromorphic computing is a technology global tech giants, like IBM and Intel, failed to realize. We believe we are the first to run an LLM with an ultra-low-power neuromorphic accelerator,” Yoo said.
BY PARK EUN-JEE, YONHAP [park.eunjee@joongang.co.kr]
Well, there was mention from memory that a "Korean company had dissected and experimented with the Akida more than even BrainChip"... anyone remember that comment?
Has RT liked it?
I reckon this is awesome news....
"Since Lou told us years ago that a South Korean company has spent possibly more time in validating the technology than BRN itself we have speculated a lot who it could have been."Has RT liked it?
Samsung is certainly a big, spicy cabbage, but some prefer fruit...
It was peaceful here for a while, so why don't you go back to where you belong.
Like some of us including me said before getting absolutely bashed in the forum for simple facts.
If we're not involved in the 2024 product roadmap of Samsung, then we're screwed.
No more lead for us. The biggest company that designs and uses its own chipsets does neuromorphic now.
Can't wait for the comments. Is someone gonna write a 3 pager about haunting and ridiculing me again, or will it just be reported?
Yep, great post @luvlifetravel. 2024 will be our year imo... is everybody ready?
https://www.cnet.com/tech/mobile/on...-way-of-experiencing-artificial-intelligence/
An interesting 7-minute read. We're definitely amongst the big players.
Probably even powering most of the up-and-coming AI features.
On-Device AI Is a Whole New Way of Experiencing Artificial Intelligence
At MWC 2024, I saw firsthand how AI is fundamentally reshaping current and future devices, from phones to robots.
At Mobile World Congress last week, the show floor was abuzz with AI. It was the same at CES two months earlier: The biggest theme of the biggest consumer tech show was that AI suddenly seemed to be part of every single product. But the hype can make it hard to know what we should be excited about, what we should fear and what we should dismiss as a fad.
"Omnipresent ... but also overwhelming." That's how CCS Insight Chief Analyst Ben Wood described the MWC moment. "For many attendees, I felt it was rapidly reaching levels that risked causing AI fatigue."
But there was a positive side as well. Said Wood: "The most impressive demos were from companies showing the benefits AI could offer rather than just describing a service or a product as being AI-ready."
At last year's MWC, the popular generative AI tool ChatGPT was only around 3 months old, and on-device AI was mostly a twinkle in the eye of the tech companies present. This year, on-device was a reality, and attendees — like me — could experience it on the show floor.
I got to experience several demos featuring AI on devices, and the best of them brought artificial intelligence to life in ways I'd never seen before. In many cases, I could see that products we're already familiar with — from smartphones to cars — are getting a new lease on life thanks to AI, with some offerings using the technology in unique ways to set themselves apart from rivals. In other cases, new types of products, like AI-focused wearables and robots, are emerging that have the potential to displace what we know and love.
Above all, it was clear that on-device AI isn't a technology for tomorrow's world. It's available right here, right now. And it could impact your decision as to what piece of technology you buy next.
The age of AI phones has arrived
One of my biggest takeaways from MWC was that while all tech companies now have a raft of AI tools at their disposal, most are choosing to deploy them in different ways.
Take smartphones. Samsung has developed Gauss, its own large language model (the tech that underlies AI chatbots), to focus on translation on the Galaxy S24, whereas Honor uses AI to include eye tracking on its newly unveiled Magic 6 Pro — which I got to try out at its booth. Oppo and Xiaomi, meanwhile, both have on-device generative AI that they're applying to phone cameras and photo editing tools.
It goes to show that we're entering a new period of experimentation as tech companies figure out what AI can do, and crucially how it can improve our experience of using their products.
Samsung's Y.J. Kim, an executive vice president at the company and head of its language AI team, told reporters at an MWC roundtable that Samsung thought deeply about what sort of AI tools it wanted to deliver to users that would elevate the Galaxy S24 above the basic smartphone experience we've come to expect. "We have to make sure that customers will see some tangible benefits from their day-to-day use of the product or technologies that we develop," he said.
Conversely, there's also some crossover in AI tools between devices because of the partners these phone-makers share. As the maker of Android, the operating system used by almost all non-Apple phones, Google is experimenting heavily with AI features. These will be available across phones made by Samsung, Xiaomi, Oppo, Honor and a host of others.
Google used its presence at MWC this year to talk about some of its recently introduced AI features, like Circle to Search, a visual search tool that lets you draw a circle around something you see on screen to search for it.
The other, less visible partner that phone-makers have in common is chipmaker Qualcomm, whose chips were in an entire spectrum of devices at MWC this year. Its Snapdragon 8 Gen 3 chip, announced late in 2023, can be found in many of the phones that are now running on-device generative AI.
It's been only a year since Qualcomm first showed a basic demo of what generative AI on a phone might look like. Now phones packing this technology are on sale, said Ziad Asghar, who leads the company's AI product roadmap.
"From our perspective, we are the enablers," said Asghar. "Each and every one of our partners can choose to commercialize with unique experiences that they think are more important for their end consumer."
At MWC, the company launched its AI Hub, which gives developers access to 75 plug-and-play generative AI models that they can pick and choose from to apply to their products. That number will grow, and it means any company making devices with Qualcomm chips will be able to add all sorts of AI features.
As well as deciding which AI features to develop, one of the next big challenges phone-makers will have to tackle is how to get AI onto their cheaper devices. For now AI is primarily reserved for the top-end phones — the Galaxy S24s of the world — but over time this will change. There will be a trickle-down effect where this tech ends up on a wider range of a company's devices.
There will naturally be a difference in quality and speed between what the most expensive and the cheapest devices can do, said Asghar, as is currently the case with a phone's camera tech.
AI is changing how we interact with our devices
AI enhancements to our phones are all well and good, but already we're seeing artificial intelligence being used in ways that have the power to totally change how we interact with our devices — as well as potentially changing what devices we choose to own.
In addition to enabling companies to bring AI to their existing device lines, Qualcomm's tech is powering concept phones like the T Phone, created by Deutsche Telekom and Brain.AI. Together, these two have tapped Qualcomm's chipset to totally reimagine your phone's interface, creating an appless experience that responds to you based on your needs and the task you're trying to accomplish and generates, on the fly, whatever you see on screen as you go.
In the demo I saw at MWC, AI showed it has the potential to put an end to the days of constant app-swapping as you're trying to make a plan or complete a task. "It really changes the way we interface with devices and becomes a lot more natural," said Asghar.
But, he said, that's only the beginning. He'd like to see the same concept applied to mixed reality glasses. He sees the big benefit of the AI in allowing new inputs through gesture, voice and vision that don't necessarily rely on us tapping on a screen. "Technology is much more interesting when it's not really in your face, but it's solving the problems for you in an almost invisible manner," he said.
His words reminded me of a moment in the MWC keynote presentation when Google DeepMind CEO Demis Hassabis asked an important question. "In five-plus years time, is the phone even really going to be the perfect form factor?" said Hassabis. "There's all sorts of amazing things to be invented."
As we saw at CES with the Rabbit R1 and at MWC with the Humane AI Pin, these things are starting to become a reality. In my demo with the AI Pin — a wearable device with no screen that you interact with through voice and touch — it was clear to me that AI is creating space for experimentation. It's allowing us to ask what may succeed the phone as the dominant piece of technology in our lives.
It's also opening up new possibilities for tech that's been around awhile but for whatever reason hasn't quite struck a chord with consumers and found success outside of niche use cases.
Many of us have now played around with generative AI chatbots such as ChatGPT, and we're increasingly growing familiar with the idea of AI assistants. One company, Integrit from South Korea, brought a robot to the show that demonstrated how we may interact with these services in public settings, such as hotels or stores. Its AI and robotics platform, Stella AI, features a large, pebble-shaped display on a robotic arm that can swivel to address you directly.
Where this differs from previous robots I've encountered in customer service settings, such as the iconic Pepper, is that Stella is integrated with the latest AI models, including OpenAI's GPT-4 and Meta's Llama. This means it's capable of having sophisticated conversations with people in many different languages.
Rather than featuring a humanoid robot face like Pepper does, Stella uses generative AI to present a photorealistic human on its display. It's entirely possible that people will feel more comfortable interacting with a human, even one that isn't real, than with a humanoid robot, but it feels too early to know this for sure.
What is clear is that this is just the beginning. This is the first generation of devices to really tap into the power of generative and interactive AI, and the floodgates are now well and truly open.
"I think we'll look back at MWC 2024 as being a foundational year for AI on connected devices," said Wood, the CCS Insight analyst. "All the pieces of the jigsaw are falling into place to enable developers to start innovating around AI to deliver new experiences which will make our interactions with smartphones and PCs more intuitive."
If this is the beginning, I'm intrigued to check back a year from now to see how AI continues to change our devices. Hype aside, there's a lot already happening to be excited about.
Editors' note: CNET is using an AI engine to help create some stories. For more, see this post.