Interesting patent coming out of Technion. Doesn't reference Haick but talks about neurons and unsupervised STDP. I don't recall seeing Technion previously looking into this sort of work.
Pure speculation, DYOR
appft.uspto.gov
United States Patent Application: 20220058492
Kind Code: A1
DANIAL; Loai; et al. (February 24, 2022)
DELTA-SIGMA MODULATION NEURONS FOR HIGH-PRECISION TRAINING OF MEMRISTIVE SYNAPSES IN DEEP NEURAL NETWORKS
Abstract
A neural network comprising: a plurality of interconnected neural network elements, each comprising: a neuron circuit comprising a delta-sigma modulator, and at least one synapse device comprising a memristor connected to an output of said neuron circuit; wherein an adjustable synaptic weighting of said at least one synapse device is set based on said output of said neuron circuit.
Inventors: | DANIAL; Loai; (Nazareth, IL) ; KVATINSKY; Shahar; (Hanaton, IL) |
Applicant: TECHNION RESEARCH & DEVELOPMENT FOUNDATION LIMITED, Haifa, IL
Family ID: 1000005999340
Appl. No.: 17/299102
Filed: December 4, 2019
PCT Filed: December 4, 2019
PCT NO: PCT/IL2019/051328
371 Date: June 2, 2021
Related U.S. Patent Documents
Application No. 62774933, filed Dec 4, 2018
Current U.S. Class: 1/1
Current CPC Class: G06N 3/063 20130101; G06N 3/049 20130101; G06N 3/088 20130101
International Class: G06N 3/08 20060101 G06N003/08; G06N 3/04 20060101 G06N003/04; G06N 3/063 20060101 G06N003/063
Claims
1. A neural network comprising: a plurality of interconnected neural network elements, each comprising: a neuron circuit comprising a delta-sigma modulator, and at least one synapse device comprising a memristor connected to an output of said neuron circuit; wherein an adjustable synaptic weighting of said at least one synapse device is set based on said output of said neuron circuit.
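The neuron circuit in claim 1 is a delta-sigma modulator, which encodes an analog amplitude as a binary pulse stream whose duty cycle tracks the input. A minimal first-order sketch (the 0.5 threshold and unipolar 0/1 output are my assumptions for illustration, not details from the patent):

```python
# First-order delta-sigma modulator: encodes an analog input in [0, 1]
# as a binary pulse stream whose average (duty cycle) tracks the input.
def delta_sigma(x, n_steps=1000):
    integrator = 0.0
    out = []
    for _ in range(n_steps):
        # Integrate the error between the input and the fed-back output bit.
        integrator += x - (out[-1] if out else 0)
        # 1-bit quantizer with an assumed 0.5 threshold.
        out.append(1 if integrator > 0.5 else 0)
    return out

bits = delta_sigma(0.3)
print(sum(bits) / len(bits))  # duty cycle, close to 0.3
```

The quantization error stays bounded in the integrator, so the long-run pulse density converges to the input amplitude, which is what lets the pulse stream drive a synapse.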
2. The neural network of claim 1, wherein said plurality of interconnected neural elements form a trainable single-layer neural network, arranged as a memristive crossbar array comprising a synaptic weightings matrix.
3. (canceled)
4. The neural network of claim 2, wherein an output vector of said neural network is calculated as a weighted sum of said outputs of said neuron circuits multiplied by said synaptic weightings matrix.
5. The neural network of claim 4, further comprising an output circuit comprising at least one delta-sigma modulator, wherein said output circuit encodes said output vector.
6. The neural network of claim 1, wherein, at a training stage, said neural network is trained by an iterative process comprising: (i) inputting analog inputs into said neuron circuits of said neural network; (ii) calculating an output vector as a weighted sum of said outputs of said neuron circuits, based on said synaptic weightings matrix; and (iii) comparing said output vector to a training dataset input, wherein said comparing leads to an adjustment of said synaptic weightings matrix.
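The iterative process in claim 6 (feedforward weighted sum, compare to a training signal, adjust the weightings) reads like a delta-rule/gradient-descent loop. A sketch with a NumPy matrix `W` standing in for the crossbar conductances; the learning rate, dimensions, and target weights are illustrative assumptions:

```python
import numpy as np

# Delta-rule sketch of claim 6's iterative training. W stands in for the
# memristive synaptic weightings matrix; target_W generates the training
# signal the output vector is compared against.
rng = np.random.default_rng(0)
W = rng.normal(0, 0.1, (2, 4))          # synaptic weightings matrix
target_W = np.array([[1.0, 0.0, -1.0, 0.5],
                     [0.0, 2.0,  0.0, -0.5]])

lr = 0.05
for _ in range(2000):
    x = rng.normal(size=4)              # (i) analog inputs to the neurons
    y = W @ x                           # (ii) weighted-sum output vector
    error = target_W @ x - y            # (iii) compare to training input
    W += lr * np.outer(error, x)        # adjust the weightings matrix

print(np.abs(W - target_W).max())       # small after training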
7. The neural network of claim 6, wherein said adjustment minimizes a cost function based on a gradient descent algorithm using said delta-sigma modulators as an activation function.
8. The neural network of claim 6, wherein said iterative process continues until said output vector corresponds to said training dataset input.
9. The neural network of claim 6, wherein said training dataset input is an output of a delta-sigma modulator.
10. The neural network of claim 2, wherein said neural network comprises two or more of said single-layer neural networks arranged as a multi-layer neural network.
11. The neural network of claim 1, further comprising a plurality of input neuron circuits, a plurality of synapse devices, and at least one output neuron circuit, wherein, at a training stage, said neural network is trained by an unsupervised spike-time-dependent plasticity (STDP) process, wherein outputs of said neuron circuits reflect spikes encoded in time.
12. The neural network of claim 11, wherein said STDP process comprises comparing pre-synaptic and post-synaptic outputs of said neuron circuits, wherein a difference detected in said comparison leads to long-term potentiation or long-term depression.
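The comparison in claim 12 maps the sign of the pre/post spike-time difference to potentiation or depression. A standard pair-based STDP kernel illustrates the idea; the exponential window and the constants `a_plus`, `a_minus`, `tau` are conventional textbook choices, not values from the patent:

```python
import math

# Pair-based STDP rule: the sign of the pre/post spike-time difference
# selects long-term potentiation (LTP) or long-term depression (LTD),
# with an exponentially decaying window of assumed width tau.
def stdp_dw(t_pre, t_post, a_plus=0.1, a_minus=0.12, tau=20.0):
    dt = t_post - t_pre
    if dt > 0:   # pre fires before post -> strengthen (LTP)
        return a_plus * math.exp(-dt / tau)
    else:        # post fires before (or with) pre -> weaken (LTD)
        return -a_minus * math.exp(dt / tau)

print(stdp_dw(10.0, 15.0))   # positive weight change (LTP)
print(stdp_dw(15.0, 10.0))   # negative weight change (LTD)
```

Because the update depends only on locally observable spike timing, no labeled targets are needed, which is what makes the process in claim 11 unsupervised.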
13. A method comprising: providing a neural network comprising a plurality of interconnected neural network elements, each of said neural network elements comprising: a neuron circuit comprising a delta-sigma modulator, and at least one synapse device comprising a memristor connected to an output of said neuron circuit, wherein an adjustable synaptic weighting of said at least one synapse device is set based on said output of said neuron circuit; and at a training stage, training said neural network by an iterative process comprising: (i) inputting analog inputs into said neuron circuits, (ii) calculating an output vector of said neural network as a weighted sum of said outputs of said neuron circuits, based on said synaptic weightings, and (iii) comparing said output vector to a training dataset input, wherein said comparing leads to an adjustment of said synaptic weightings.
14. The method of claim 13, wherein said plurality of interconnected neural elements form a trainable single-layer neural network, arranged as a memristive crossbar array comprising a synaptic weightings matrix.
15. (canceled)
16. The method of claim 14, wherein said output vector is calculated as a weighted sum of said outputs of said neuron circuits multiplied by said synaptic weightings matrix.
17. The method of claim 16, wherein said neural network further comprises an output circuit comprising at least one delta-sigma modulator, wherein said output circuit encodes said output vector.
18-30. (canceled)
31. A method comprising: providing a memristor driver circuit representing a trainable neural network circuit, wherein said memristor driver circuit comprises: a delta-sigma modulator configured to receive an input voltage and output a binary sequence representing an amplitude of said input voltage; a memristive device; and at least one subtractor; training said memristor driver circuit by an iterative process comprising: (i) a read stage wherein said input voltage is a read voltage selected to produce a desired duty cycle of said delta-sigma modulator, and (ii) an update stage wherein said input voltage is an updating voltage reflecting a subtraction operation between a reference voltage and an output signal of said memristive device.
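The two-stage cycle in claim 31 can be sketched as a feedback loop: the read stage senses the memristor's state, and the update stage drives it toward a reference via the subtractor. The linear conductance update below is a simplifying assumption (real memristor kinetics are nonlinear), and the constants are illustrative:

```python
# Sketch of claim 31's read/update cycle for programming one memristor.
# Assumed linear device model: output current/voltage scales with a
# normalized conductance g, and each update nudges g proportionally.
g = 0.2                              # memristive conductance (normalized)
g_target = 0.7                       # state encoded by the reference voltage
v_read = 1.0                         # read voltage (fixed duty cycle)
eta = 0.1                            # programming gain, assumed

for _ in range(100):
    v_out = g * v_read               # read stage: sense the device output
    dv = g_target * v_read - v_out   # subtractor: reference minus output
    g += eta * dv                    # update stage: program toward reference

print(round(g, 3))  # converges to ~0.7
```

Per claim 34, the read stage plays the role of the feedforward pass and the update stage the role of error backpropagation, so this single-device loop is the building block of the crossbar training in claim 32.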
32. The method of claim 31, wherein said memristor driver circuit comprises a plurality of interconnected said memristor driver circuits arranged as a trainable single-layer neural network, arranged as a memristive crossbar array comprising a synaptic weightings matrix.
33. (canceled)
34. The method of claim 31, wherein said read stage reflects a feedforward operation of the neural network, and said update stage reflects an error backpropagation operation of the neural network.
35. The method of claim 31, wherein said iterative process minimizes a cost function based on a gradient descent algorithm using said delta-sigma modulator as an activation function.
36. The method of claim 31, wherein said memristor driver circuit further comprises at least one operational amplifier configured to amplify said output signal of said memristive device.