Food 4 thought
LinkedIn is pumping robots and AI like crazy. This first half of the new financial year is going to be a cracker, I think.
Hopefully we see the fruit
Good chance it's independent AI of the neuromorphic kind; the cloud is too slow and prone to interruptions.
Teksun Inc on LinkedIn: #technology #teksun #forwardcollisionwarning #safetyfirst #smartdriving…
Introducing Teksun Inc's Forward Collision Warning system – your new co-pilot for safer journeys. Our advanced #technology detects potential collisions and… (www.linkedin.com)
Are we here ……….
View attachment 65845
Thank you… got drilled by zeebot for being computer dumb.

Imagine your IoT devices had the intelligence of a human brain. This isn't just futuristic; it's achievable through neuromorphic computing! Neuromorphic computing mimics the behavior of brain neurons, offering human-like adaptability and learning capabilities. Why is this beneficial for IoT (Internet of Things)?
Currently, IoT devices mainly use the Von Neumann architecture, which processes data and commands sequentially in the same memory space. This method can lead to data bottlenecks and high power consumption, problematic for often battery-operated IoT devices 🪫.
Neuromorphic computing offers revolutionary advantages:
- Parallel processing: unlike sequential processing, neuromorphic systems can perform many tasks simultaneously.
- Lower power consumption: ideal for energy-efficient IoT devices.
- Real-time processing: ensures quick responses and low latency.
- On-device adaptive learning: IoT devices can adapt and learn without constantly relying on central servers.
In what other areas could neuromorphic computing be useful? Share your thoughts in the comments.
#neuromorphiccomputing #AI #InternetofThings #VodafoneBusiness
(Sources: The Digital Speaker 2023, The Financial Express 2024)
Thank you
Since the following article is branded content by WIRED Consulting and QinetiQ (a UK-based Defense and Space Manufacturing company), chances are QinetiQ researchers have already been experimenting with neuromorphic technology - both Loihi 2 and Akida are explicitly referred to.
Also take a look at the illustration.
Taking inspiration from the human brain to create next-gen chips (www.wired.com)
BRANDED CONTENT BY
Neuromorphic Chips Could Solve Some Of AI’s Problems
As AI becomes the tech world’s center of gravity, the virtues of neuroscience-inspired computing systems are turning heads. These are now seen as a critical enabler for a raft of new innovations, not only on Earth, but in space…
WHAT IF WE told you there was a computer that can analyze and respond almost instantaneously to data from any of its thousands of sensors? That it can also match patterns in that data, recognizing faces, objects, and spoken words, and put that information into context, swiftly learning from what's happening around it to predict what may happen next? Now, what if we told you it could do all that on around 20 watts of power, less than the average lightbulb—would that sound like a stretch?
As it happens, there are around eight billion of these miraculous devices operating in the world today. They’re called human brains, and they’re inspiring the relatively recent science of neuromorphic computing, a completely new approach to computer design in which elements of the system are modeled on the complex network of neurons and synapses that allow us to think and function.
“At the very basic level, neuromorphic computing is throwing out everything we think we know about computers and processors and looking at how biological brains compute,” says Mike Davies, Director of the Neuromorphic Computing Lab at Intel. “The guiding light is not just to achieve the levels of biological intelligence that we see in brains, but also the incredible efficiency and speed that we’re still very far from attaining with conventional technology.”
Traditional computers are based upon the classic Von Neumann architecture, in which data is constantly shuttled between a processing unit and a memory unit. This can create a bottleneck when large volumes of data are being processed. As a result, conventional computers are now approaching their limits, while also using mind-boggling quantities of energy to operate at those extremes. That’s been thrown into relief by the rise of “large language models” in AI, which require vast amounts of data and compute. It’s estimated that as much as 15 percent of the world’s total energy is now spent on some form of data manipulation, such as transmission or processing, and this figure is only likely to rise with the predicted millions of sensors that are necessary to enable a fully-fledged Internet of Things (IoT).
The neuromorphic approach offers a solution. It covers a range of different ways of mimicking neuroscience principles, and can apply to both hardware and software.
In Intel’s case, the company’s second-generation neuromorphic chip, Loihi 2, physically emulates the way that the brain processes data. Right now, as you read this sentence, your neurons are exchanging information with each other in a rush of electronic pulses (or “spikes”). The chip works in a similar way. It contains tens of thousands of silicon artificial neurons that also communicate through spiking electronic signals. This arrangement is known as a “spiking neural network” (SNN).
Whereas traditional chips incorporate a clock and work on the basis of continuously reading a rigid, sequential set of instructions, the neurons on the Loihi 2 chip work in parallel in an asynchronous way, and without any prescribed order. Like the neurons in our brain, its artificial neurons are event-triggered, and process information only after the receipt of an incoming activation signal.
A major benefit of this approach is that, as opposed to the always-on Von Neumann model, a spiking neuron network is effectively in “off” mode most of the time. Once triggered, it can then perform a huge number of parallel interactions.
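The event-triggered, fire-on-threshold behaviour described above can be sketched as a leaky integrate-and-fire (LIF) neuron, the usual building block of spiking neural networks. This is an illustrative toy model, not Loihi 2's actual circuitry; the `beta`, `weight`, and `threshold` values are arbitrary choices for the demo:

```python
def lif_neuron(input_spikes, beta=0.9, weight=0.6, threshold=1.0):
    """Toy leaky integrate-and-fire neuron: leak the membrane potential,
    integrate weighted input spikes, and fire when the threshold is crossed."""
    membrane = 0.0
    output = []
    for spike in input_spikes:
        membrane = beta * membrane + weight * spike  # leak, then integrate
        if membrane >= threshold:                    # event: fire and reset
            output.append(1)
            membrane = 0.0
        else:                                        # below threshold: stay quiet
            output.append(0)
    return output

# Eight time steps of input produce only three sparse output events:
print(lif_neuron([1, 1, 1, 0, 0, 1, 1, 1]))  # → [0, 1, 0, 0, 0, 1, 0, 1]
```

Note that when no spikes arrive, the neuron does essentially nothing, which is where the "off most of the time" power savings come from.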
“It’s exactly the same as the way the brain doesn’t churn every single feature of its incoming data,” says Jason Eshraghian, Assistant Professor at the University of California at Santa Cruz. “Imagine if you were to film a video of the space around you. You could be filming a blank wall, but the camera is still capturing pixels, whereas, as far as the brain is concerned, that’s nothing, so why process it?”
Because neuromorphic computing emulates the brain in this way, it can perform tasks using a fraction of the time and power needed by traditional machines.
Neuromorphic systems are also highly adaptable, because the connections between neurons can change in response to new tasks, making them well suited to AI. Analysts have therefore described neuromorphic computing as a critical enabler of new technologies that could reach early-majority adoption within five years.
Half a decade ago, Intel established a community of researchers around the world to explore the potential applications of neuromorphic computing in specific business use cases, ranging from voice and gesture recognition to image retrieval and robotic navigation. The results so far have been impressive, showing energy efficiency improvements of up to 1,000 times and speed increases of up to 100 times compared to traditional computer processors.
The potential of the neuromorphic approach in enabling less compute-hungry large language models for AI was recently demonstrated by Eshraghian and others at University of California Santa Cruz. Their “SpikeGPT” model for language generation, a piece of software that simulates an SNN through its algorithmic structures, uses approximately 30 times less computation than a similar model using typical deep learning methods.
“Large scale language models rely on ridiculous amounts of compute power,” he says. “Using spikes is a much more efficient way to represent information.”
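One common way spiking systems represent a continuous value is rate coding: fire with probability proportional to the value, then read the value back as the observed firing rate. A minimal sketch of that idea (SpikeGPT's actual encoding scheme differs; `rate_encode` and `rate_decode` are hypothetical names):

```python
import random

def rate_encode(value, steps=1000, seed=0):
    """Encode a value in [0, 1] as a Bernoulli spike train:
    at each time step, fire with probability equal to the value."""
    rng = random.Random(seed)
    return [1 if rng.random() < value else 0 for _ in range(steps)]

def rate_decode(spikes):
    """Estimate the encoded value as the observed firing rate."""
    return sum(spikes) / len(spikes)

train = rate_encode(0.7)
print(rate_decode(train))  # close to 0.7; longer trains give better estimates
```

The efficiency argument is that each spike is a single-bit event, and most time steps can carry no event at all, instead of every value being a dense multi-bit activation.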
Taking it to the edge
One of the major potential future benefits that comes from neuromorphic computing’s greater efficiency and speed is the capability to bring low-power, rapid decision-making to the increasing proliferation of devices that enable the Internet of Things. Think of autonomous vehicles, for instance. A neuromorphic chip negates the need to send signals over an internet connection for remote processing by powerful computers in the cloud. Instead, the device can carry out on-the-spot, AI-based learning in isolation—an approach known as “edge” computing.
“The dimension of remote adaptability and personalization that neuromorphic brings opens the door for all kinds of new capabilities with AI,” adds Davies, who believes the area of smart robots carrying out chores in the home, in particular, is one that’s ripe for development.
The term AIoT has been coined to describe the combination of AI and IoT, and California-based company BrainChip is already commercializing the concept with those new capabilities in mind. Its first-to-market digital neuromorphic processor, called Akida, is billed as “a complete neural processing engine for edge applications”.
Companies currently exploring the use of BrainChip’s technology include a leading car manufacturer that’s using it to boost the efficiency of in-car voice recognition, and a waste company that’s developing “smart bins” that can automatically sort and recycle different types of waste through a combination of AI-powered sensors and robotics—and wants to do it in the most efficient and eco-friendly way.
“We’re also working with space agencies to bring Akida into space, to be able to autonomously control machines on Mars, for instance,” says BrainChip CEO Sean Hehir. “When you have to run on solar power, you have to be very efficient. It also has to be completely autonomous, because there is no fast connection back to Earth. And don't forget that low power means low thermal emission—in space, you can't have a fan to cool something, because there's no air.”
Decentralizing AI from the cloud to a device also creates a desirable side effect: Greater privacy. “If you’re not moving data all around the world, you’re much more secure,” says Hehir. “It’s that simple.”
The Defense and National Security Perspective
Jeremy Baxter | Principal Systems Engineer for UAVs & Autonomy and QinetiQ Fellow
Any technology that can offer very fast reaction times or the ability to minimize power consumption will have real military benefits.
When combined with event-based sensing—which minimizes processing delays by reporting significant changes as soon as they occur—neuromorphic computing could allow us to create platforms that match the reaction times of birds and insects at a fraction of the weight and power of today’s technologies. Imagine a tiny, uncrewed aerial vehicle, for example, able to fly through woodland at high speeds yet avoid collisions.
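Event-based sensing of this kind can be sketched as simple frame differencing: only pixels whose brightness changes by more than a threshold emit an event, so a static scene generates no data at all. This is a hypothetical simulation, not any real event camera's pipeline:

```python
def frame_to_events(prev_frame, frame, threshold=10):
    """Return (x, y, polarity) events for pixels whose brightness
    changed by at least `threshold` between two grayscale frames."""
    events = []
    for y, (prev_row, row) in enumerate(zip(prev_frame, frame)):
        for x, (before, after) in enumerate(zip(prev_row, row)):
            if abs(after - before) >= threshold:
                events.append((x, y, 1 if after > before else -1))
    return events

static = [[100, 100], [100, 100]]
moved = [[100, 100], [100, 150]]
print(frame_to_events(static, static))  # → [] (blank wall: nothing to process)
print(frame_to_events(static, moved))   # → [(1, 1, 1)] (one brightening pixel)
```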
There are two other application areas that could prove to be especially interesting. The first is defensive aid suites. These are military aircraft systems that offer protection from surface-to-air missiles, air-to-air missiles, and guided anti-aircraft artillery, where fast reaction times are crucial for survival.
The other is covert surveillance. Small, lower power devices are easier to conceal and last longer, playing to the strengths of neuromorphic processing.
Explore the other emerging innovation trends in the series…
DOWNLOAD THE FULL REPORT
- Mechanical human augmentation. Whether it’s additional limbs or smart exoskeletons, machinery is helping humans upgrade their natural capabilities.
- Power beaming. Sending power wirelessly over long distances could transform everything from electric vehicles to offshore wind farms.
- Biohybrid robots. Combining artificial and organic parts, biohybrid robots offer advantages such as self-repair and agility.
- Gene-editing and enhancement. Advances in biotech are spurring scientists to explore how genomes can be tweaked to make ecosystems more sustainable.
- Hyperspectral imaging. Hyperspectral cameras don’t merely record what something looks like, they can tell you what that thing is made from and help you see what the human eye cannot.
To find out more about QinetiQ, click here
To find out more about WIRED Consulting, click here
View attachment 65829
View attachment 65828
QinetiQ – Science and Technology Vision (www.qinetiq.com)
Here is a link to QinetiQ’s full report on emerging innovation trends:
I also got a warning… I don't get it why.
They are partnered with Renesas, so there may be a tie with us there, but an IP licence is sounding pretty damn good, so I will run with your thoughts, especially after reading the Tata Elxsi report with our mention in it.

Price sensitive IP signing today? C'mon!
TATA?
Fixed it!
MosChip® on LinkedIn: #aum #future #highperformance #computing #collaboration #industry…
✨ Exciting News ✨ Centre for Development of Advanced Computing (C-DAC) Teams Up with MosChip® and Socionext Inc. to Develop Groundbreaking Arm-Based HPC… (www.linkedin.com)
Nice nice Anil ……
View attachment 65867
Sean did say we should have a couple of deals this year in the bag at the AGM, if I remember correctly.

I can't remember that comment. He was excited and said the BOD wouldn't be happy if it took until 2026, but I can't recall a couple this year. Hope you're right. Once we get one, I think it could be a steady flow.
I don't recall the exact words either; however, there was some reference to getting a couple of deals this year, which he thought was possible.