No concrete link, but Cambridge Consultants are definitely right up our alley…
The article below confirms the above. What I found particularly intriguing was the paragraph on parametric insurance, which I have marked in blue. An application for neuromorphic technology that doesn’t immediately spring to mind, at least not to mine…
Cambridge Consultants (not to be confused with the now defunct Cambridge Analytica and their data scandal!) are part of Capgemini, a French multinational IT and consulting company headquartered in Paris, which I personally suspect to be an investor hidden behind one of the nominee accounts, most likely BNP Paribas. Just my gut feeling.
A gentle reminder: Capgemini’s Head of Center of Excellence, Silicon Engineering, Loïc Hamon, moderated a panel at the Munich GSA Executive Forum in mid-June titled “New Silicon Design Paradigms”, with Sean Hehir as one of the panelists, discussing the effects of the “chiplet revolution”. Capgemini were also sponsor and VIP dinner host at said event.
Event: Reimagining the future of the semiconductor industry – GSA European Executive Forum, June 17-19, 2024, Munich, Germany (www.capgemini.com)
The High Performance, Low Power Promise of Neuromorphic Computing
www.silicon.co.uk, September 20, 2023
Learn how neuromorphic computing could transform how many enterprises approach their large-scale compute needs in the future.
Society, and in turn business, needs to find more sustainable alternatives to the power-hungry computation that has got us to where we are today. I believe it is time to turn for inspiration to the most efficient and powerful computer of all – the human brain. Excitingly, the emergence of neuromorphic computing, which mimics our neural systems, promises both extraordinary performance and transformative energy efficiency.
Before I turn to its wider benefits and potential applications, let me put that energy saving in perspective. Conventional computing technology is based on the so-called von Neumann architecture, where data processing and transfer are carried out intensively and continuously. Next-generation computers are expected to operate at the exascale, with 10¹⁸ calculations per second. But the downside is power consumption.
Data computation and transfer are responsible for a large part of this consumption, and the rapid development of machine learning and AI neural network models is adding even more demand. As much as 10 megawatts of power could be used for some AI learning algorithms on an exascale computer.
Data-centric computing requires a hardware system revolution. The performance of the computing system, in particular its energy efficiency, sets the fundamental limit of AI/ML capability. As for neuromorphic computing? It has the potential to deliver high-performance computing while consuming just 1/1000th of the energy.
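To put those two figures together (a back-of-the-envelope combination of the claims above, not a measured result):

$$10\ \text{MW} \times \frac{1}{1000} = 10\ \text{kW}$$

In other words, an AI workload that would draw around 10 megawatts on a conventional exascale machine could, in principle, run in the region of 10 kilowatts on neuromorphic hardware.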
The neuromorphic approach uses silicon artificial neurons to form a spiking neural network (SNN) that performs event-triggered computation. This is the key difference between an SNN and other networks, such as the convolutional neural network (CNN): spiking neurons process input information only on receipt of an incoming spike signal. In effect, spiking neural networks attempt to make artificial neurons behave more like real neurons.
The process does not work in discrete time steps. Instead, it takes in events over a time series to help build up signals within the neurons. These signals accumulate inside the neurons until a threshold is passed, at which point computation is triggered.
Ultra-low power operation can be achieved thanks to SNNs being effectively in an ‘off’ mode most of the time and only kicking into action when a change, or ‘event’, is detected.
Once in action, it can achieve fast computation without running an energy-consuming fast clock by triggering a huge number of parallel operations (equivalent to thousands of CPUs running in parallel). As a result, it consumes only a fraction of the power of a CPU or GPU for the same workload.
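To make that event-driven behaviour concrete, here is a minimal sketch in Python (my own illustration, not Cambridge Consultants’ or any vendor’s implementation) of a leaky integrate-and-fire neuron: it sits idle between events, accumulates incoming spikes, and only triggers downstream computation once its potential crosses a threshold. The class name, weights and threshold values are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class LIFNeuron:
    threshold: float = 1.0   # firing threshold
    leak: float = 0.95       # per-step decay of the accumulated potential
    potential: float = 0.0   # current membrane potential

    def receive(self, spike_weight: float, dt_steps: int) -> bool:
        """Process one spike event arriving dt_steps after the previous one.

        Returns True if the neuron fires, i.e. triggers downstream computation.
        """
        # The leak is applied only when an event arrives; between events the
        # neuron does no work at all, mirroring the 'off most of the time' idea.
        self.potential *= self.leak ** dt_steps
        self.potential += spike_weight
        if self.potential >= self.threshold:
            self.potential = 0.0   # reset after firing
            return True
        return False

# Sparse, event-driven input: (spike weight, time steps since the last event).
neuron = LIFNeuron()
events = [(0.3, 1), (0.4, 2), (0.5, 1), (0.2, 10)]
for weight, gap in events:
    if neuron.receive(weight, gap):
        print("threshold crossed -> downstream computation triggered")
```

The point of the sketch is the control flow: work only happens inside the branch where the threshold is crossed, which is the software analogue of the hardware staying ‘off’ most of the time.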
This is why neuromorphic computing is so well suited to edge AI – implementing low-power AI on end devices without connecting to the cloud. This is especially so for TinyML applications, which tend to focus on battery-operated sensors, IoT devices and so on.
Next-generation neuromorphic systems are expected to have intrinsic capabilities to learn or deal with complex data just as our brain does. They have the potential to process large amounts of digital information with much lower power consumption than conventional processors.
In the medium term, hybrid machines combining traditional computers with neuromorphic chips could vastly improve performance over conventional machines. In the longer term, fully neuromorphic computers will be fundamentally different and designed for specific applications, from natural language processing to autonomous driving.
When it comes to design, instead of the conventional architecture of partitioning chips into processor and memory, the computer may be built with silicon ‘neurons’ performing both functions.
Building extensive ‘many-to-many’ neuron connectivity will allow an efficient pipeline for signal interaction and facilitate massive parallel operation. There is a trend to integrate ever-increasing numbers of electronic neurons, synapses and so on in a single chip.
The design approaches of neuromorphic processor chips broadly follow one of a number of distinct paths. The ASIC-based digital neuromorphic chip offers highly optimised computation performance tailored for application requirements. For AI applications, it can potentially perform both inference and real-time learning.
The FPGA-based chip is similar to ASIC-based digital design but also offers portability and reconfigurability. Due to its highly reconfigurable nature and parallel speed, FPGA is considered to be a suitable platform for mimicking, to some degree, the natural plasticity of biological neural networks.
Analogue neuromorphic chips, which include so-called ‘in-memory-computing’, have the potential to achieve the lowest power consumption. They’d mainly be suited for machine learning inference rather than real-time learning.
The photonic integrated circuit (PIC) based neuromorphic chip offers photonic computation that can achieve very high speed at very low power consumption. Mixed-signal NSoC (Neuromorphic System-on-Chip) designs, meanwhile, combine extremely low-power analogue circuitry for ML inference with a digital SNN architecture processor for real-time learning.
I expect that neuromorphic computing will generate development opportunities in several technological areas, such as materials, devices, neuromorphic circuits and new neuromorphic algorithms and software development platforms – all crucial elements for the success of neuromorphic computing.
There are countless potential applications. Applying neuromorphic techniques to vision applications represents a large market opportunity for many different sectors, including smart vision sensors and gesture control applications in smart homes, offices and factories.
Another use case is neuromorphic computing for myoelectric prosthetics control. Myoelectric prosthetics assist people with reduced mobility by sensing and processing muscle spikes. However, inefficiencies must be addressed to enhance the user experience, such as increasing the granularity of movement classification and reducing the computational resources required in order to cut energy consumption.
Low-power edge computing represents a key area of high commercial potential. As IoT applications in smart homes, offices, industries and cities proliferate, there is an increasing need for more intelligence on the edge as control is moved from data centres to local devices. Applications such as autonomous robots, wearable healthcare systems, security and IoT all share the common characteristics of battery-operated, ultra-low power, standalone operation.
One potential application that I find particularly fascinating is that of “Parametric Insurance”. With global attention increasingly turning to climate-related issues, this unconventional form of ‘disaster insurance’ is playing an increasingly significant role. It is a product that offers pre-specified pay-outs based on a trigger event – and can help to provide protection when standard policies are harder to get.
For me, the correlation to neuromorphic computing is clear. Parametric insurance can be tied to a catastrophe (CAT) bond for events such as hurricanes, earthquakes and so on. Neuromorphic-powered edge computing has a big role to play, as it would allow for very granular and sophisticated risk analysis, adjudication and payment settlement. All would be at the edge – with an associated low cost.
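As a purely illustrative sketch of how such a trigger might look in code (a hypothetical Python example of my own; the policy, threshold and payout figures are invented and the article describes no specific implementation), a parametric product reduces to evaluating a pre-agreed condition against sensor readings – exactly the kind of simple, event-driven check a low-power edge device could perform:

```python
from dataclasses import dataclass

@dataclass
class ParametricPolicy:
    parameter: str          # measured quantity, e.g. peak wind speed
    trigger_value: float    # contractual trigger level
    payout: float           # pre-specified pay-out if the trigger is met

    def evaluate(self, measured_value: float) -> float:
        """Return the pay-out owed for one measured event (0.0 if no trigger)."""
        return self.payout if measured_value >= self.trigger_value else 0.0

# Hypothetical hurricane policy: pay out if local peak wind exceeds 120 km/h.
policy = ParametricPolicy(parameter="peak_wind_kmh", trigger_value=120.0, payout=50_000.0)
for reading in (85.0, 101.0, 132.0):   # readings from an edge sensor
    owed = policy.evaluate(reading)
    if owed:
        print(f"Trigger met at {reading} km/h -> settle {owed:,.0f}")
```

In the article’s vision, the sensing and the granular risk analysis feeding such a trigger would run on neuromorphic hardware at the edge, keeping both latency and energy cost low.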
About the author
Dr Aidong Xu, Head of Semiconductor Capability, Cambridge Consultants
Aidong has over 30 years of experience across diverse industries, including at some of the leading semiconductor companies. He has managed large, internationally based engineering teams and brought innovative, industry-leading products to the global market that have achieved rapid and sustained business growth. Aidong holds a PhD in power electronics and power semiconductors.