This started out as something else, but it's turned out to be about Rain AI again.
It's easy to understand the allure of analog, as, in theory, an analog circuit is a direct "analog" of the neuronal system. Hence many of the early attempts at electronic neurons, and much of the academic research, focused on analog. Some small 1-bit analog systems may have achieved tantalizingly passable results, but accumulating error limits their development.
Analog neurons work by summing the currents through controllable impedances called ReRAMs. The currents are applied to a summing impedance, so the voltage across that impedance is proportional to the sum of the ReRAM currents. Measuring that voltage is supposed to provide a count of the number of ReRAMs through which current has flowed into the summing impedance. The problem is that neither the ReRAM impedances nor the summing impedance are uniform; they suffer from defects such as manufacturing variability.
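The effect of that variability on the readout can be sketched numerically. The following toy model is an assumption for illustration, not any real device's specification: each ON ReRAM nominally contributes one unit of current, but a random per-device spread perturbs it, and the readout rounds the summed current back to a device count. Because the spread of the sum grows with the square root of the fan-in, large sums misread the count far more often than small ones.

```python
import random

random.seed(0)

def count_active_rerams(n_active, variability):
    """Toy readout: each ON ReRAM sources a nominal 1.0 unit of current,
    perturbed by per-device manufacturing spread. The summing impedance
    turns total current into a voltage, which is read back (rounded) as
    an estimate of how many devices were ON."""
    currents = [1.0 + random.gauss(0, variability) for _ in range(n_active)]
    total = sum(currents)   # voltage across summing impedance ~ total current
    return round(total)     # ideal readout would equal n_active exactly

# With a 5% per-device spread, small sums almost always read back correctly...
errors_small = sum(count_active_rerams(8, 0.05) != 8 for _ in range(1000))

# ...but the error of the sum scales as sqrt(N), so with 512 active devices
# the rounded readout is frequently off by one or more counts.
errors_large = sum(count_active_rerams(512, 0.05) != 512 for _ in range(1000))

print(errors_small, errors_large)
```

This is the accumulating-error problem in miniature: the per-device spread doesn't average away at the readout, it compounds as the fan-in grows.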
One notable failure of an attempt to even more closely mimic the wetware neuron using analog is Rain AI, which initially sought to reproduce the complexity of neuronal connexions by the use of insulated nanowires with random connections.
https://www.eetimes.com/rain-neuromorphics-tapes-out-demo-chip-for-analog-ai/
https://www.eetimes.com/rain-demonstrates-ai-training-on-analog-chip/
Neurons can have 10,000 connexions. The human brain's configuration, the neuronal connexions, is determined by DNA and sensory experience. For example, the autonomous activations of breathing and heartbeat are pre-programmed into the brain as it develops. The sensory wiring is in part preprogrammed, determined by genetic code, not by randomness.
The Rain inventors hoped that the system could be trained to recognize coherent data by feeding large amounts of activations into the system until it learned something, thereby inventing infinite-shot learning.
Because the brain can learn instantaneously, it is clear that new physical connexions cannot be grown instantaneously, so existing connexions must be involved in learning. This suggests that neurons include dormant connexions, some of which are activated when new data is learned. This is imitated by the "weights" of the artificial neuron. In an electronic brain, the weights can be electronically switched on and off, allowing the neuron to be repurposed when the neural network is programmed with a new model, whereas, in a natural brain, the group of activated weight synapses must be retained to store the data in memory. Assuming that neurons can be involved in remembering more than one item, this suggests that different groups of synapses connected to a neuron are activated when different data is being recognized. So a programmed synapse has a permanent ON memory function and a switchable ON/OFF control responsive to activations. The activation data determines which group of synapses and their weights are switched on.
An interesting thing is that the brain has the capability to stimulate memories without external activation.
At the beginning of 2022, Rain still planned to go ahead with the analog random nanowire synapses:
"R
ain’s vision is a fully analog, asynchronous, ultra-low power, tileable, scalable chip with capacity for 100 billion parameters that can imitate the human brain. While this work used a crossbar memristor array, Rain’s hardware roadmap still includes migration to randomly connected ReRAM cells as the technology matures."
Rain Demonstrates AI Training on Analog Chip - EE Times
Remember that, in 2019, Sam Altman agreed to buy $51M of Rain's chips off the drawing board:
OpenAI Agreed to Buy $51 Million of AI Chips From a Startup Backed by CEO Sam Altman | WIRED
Documents show that OpenAI signed a letter of intent to spend $51 million on brain-inspired chips developed by startup Rain. OpenAI CEO Sam Altman previously made a personal investment in Rain.
OpenAI in 2019 signed a nonbinding agreement to spend $51 million on the chips when they became available, according to a copy of the deal and Rain disclosures to investors this year, seen by WIRED. Rain told investors that Altman had personally invested more than $1 million in the company. The letter of intent has not been previously reported.
Having spent, no doubt, many hours unsuccessfully attempting to achieve the Frankenstein moment, Rain now boasts the virtues of its yet-to-see-daylight digital NN:
https://rain.ai/products ,
https://rain.ai/
"
We're building the most efficient hardware for AI."
https://rain.ai/products
"
We offer IP licensing opportunities for Rain’s digital in-memory compute tile and software stack across a range of compute use cases.
The IP is tailored to on-device AI workloads requiring ultra-low latency and high energy efficiency."
and licenses in interconnection IP from Arteris:
https://ir.arteris.com/news-release...teris-selected-rain-ai-use-next-generation-ai
Arteris Selected by Rain AI for Use in the Next Generation of AI
Jan 30, 2024 at 8:45 AM EST
Optimizing on-chip mesh connectivity with Arteris’ FlexNoC 5 physically aware network-on-chip enables Rain AI to realize faster data transfers at ultra-low power to achieve record performance for Generative AI and Edge AI computing at scale
It seems their much-vaunted first generation analog random nanowire experiment sank without trace and it's all hands to the pumps in the digital lifeboat.