Hosung Choi1, Woojong Yu1
Sungkyunkwan University1
Spiking neural networks (SNNs) are regarded as a more biologically faithful form of artificial neural network, distinguishing them from conventional architectures. Although several studies have modeled SNNs in hardware, incorporating multiple connections between neurons has proven challenging because memristors offer only a limited number of terminals. In this work, we propose a novel spiking neurosynaptic network built on a multi-terminal floating-gate memristor.<br/>Our memristor, named TRAM (Tunneling Random Access Memory), exploits the shift in the Fermi energy level (E<sub>f</sub>) of graphene and exhibits desirable memory characteristics: a high on/off ratio (>10^5), long retention (>10,000 s), strong endurance (>100,000 cycles), and low energy consumption (120 pJ). To emulate synapses and neurons, we adjusted the thickness of the insulating Al<sub>2</sub>O<sub>3</sub> layer: a thin layer (3 nm) yields the leaky integrate-and-fire (LIF) characteristic required for neurons, while a thicker layer (7.5 nm) produces spike-timing-dependent plasticity (STDP) in synapses.<br/>When voltage inputs are applied to multiple synapses, all inputs are transmitted to the neuron. If the cumulative synaptic input fails to surpass the neuron's threshold voltage, the neuron slowly discharges in accordance with its leaky behavior; if the threshold is exceeded, the neuron fires and generates an output signal that updates the synaptic weight (conductance) of the synapse. The neuron's threshold and feedback voltages are generated with a comparator.<br/>With this multi-terminal spiking neurosynaptic network, we achieved a learning accuracy of up to 83.08% on the unlabeled MNIST handwritten-digit dataset.
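The neuron dynamics described above (accumulate inputs, leak below threshold, fire and reset above it) can be sketched as a minimal LIF model. This is an illustrative software analogue only, not a model of the TRAM device; the parameter values (`v_th`, `leak`, `v_reset`) are hypothetical.

```python
def lif_step(v, i_in, v_th=1.0, leak=0.95, v_reset=0.0):
    """One leaky integrate-and-fire update.

    v       -- current membrane potential (accumulated voltage)
    i_in    -- summed input from all connected synapses this step
    v_th    -- firing threshold (hypothetical value)
    leak    -- per-step decay factor modeling the slow discharge
    v_reset -- potential after a spike

    Returns the updated potential and whether the neuron fired.
    """
    v = leak * v + i_in          # leaky integration of synaptic input
    fired = v >= v_th            # compare against the threshold voltage
    if fired:
        v = v_reset              # firing resets the membrane potential
    return v, fired

# Example: repeated sub-threshold inputs accumulate until the neuron fires.
v, spikes = 0.0, 0
for step in range(5):
    v, fired = lif_step(v, 0.4)
    spikes += int(fired)
    print(f"step {step}: v={v:.3f} fired={fired}")
```

In hardware, the comparator plays the role of the threshold test: it detects when the integrated voltage exceeds `v_th` and emits the feedback pulse that both resets the neuron and triggers the synaptic weight update.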
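Likewise, the weight (conductance) update triggered by the neuron's output signal can be illustrated with a standard pair-based STDP rule: the weight is potentiated when the presynaptic spike precedes the postsynaptic one and depressed otherwise. The rule and its constants (`a_plus`, `a_minus`, `tau`) are a generic textbook form, not the measured response of the 7.5 nm device.

```python
import math

def stdp_update(w, dt, a_plus=0.01, a_minus=0.012, tau=20.0,
                w_min=0.0, w_max=1.0):
    """Pair-based STDP weight update (illustrative constants).

    w  -- current synaptic weight (conductance, normalized)
    dt -- t_post - t_pre in ms; dt > 0 means pre fired before post
    """
    if dt > 0:
        w += a_plus * math.exp(-dt / tau)   # potentiation: pre before post
    else:
        w -= a_minus * math.exp(dt / tau)   # depression: post before pre
    return min(max(w, w_min), w_max)        # clamp to the device's range

# Example: the same synapse is strengthened or weakened by spike order.
print(stdp_update(0.5, 5.0))    # pre leads post -> weight increases
print(stdp_update(0.5, -5.0))   # post leads pre -> weight decreases
```

Unsupervised learning of this kind, driven only by spike timing, is what allows the network to reach its reported accuracy on the unlabeled MNIST data without explicit labels during training.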