Reliability of Neural Networks Based on Spintronic Neurons
Eleonora Raimondo; Anna Giordano; Andrea Grimaldi; Giovanni Finocchio
2021-01-01
Abstract
Spintronic technology is emerging as a promising direction for the hardware implementation of neurons and synapses in neuromorphic architectures. In particular, a single spintronic device can implement the nonlinear activation function of a neuron. Here, we show how to implement spintronic neurons with sigmoidal and rectified linear unit (ReLU)-like activation functions. We then perform a numerical experiment demonstrating the reliability of neural networks made of spintronic neurons, where each neuron has a different activation function, emulating the device-to-device variations of a possible hardware implementation of the network. To this end, we consider a 'vanilla' neural network trained to recognize the categories of the Modified National Institute of Standards and Technology (MNIST) database, and we obtain an average accuracy of 98.87% on the test dataset, very close to the 98.89% obtained for the ideal case (all neurons having the same sigmoid activation function). Similar results are obtained with neurons having a ReLU-like activation function.
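The device-to-device variation experiment described above can be illustrated with a minimal sketch: each hidden neuron applies its own, slightly different sigmoid during the forward pass. The NumPy example below is illustrative only; the Gaussian spread on the sigmoid slope, the layer sizes, and all function names are our assumptions, not the authors' measured spintronic device model.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_device_sigmoids(n_neurons, slope_std=0.1):
    """Draw one sigmoid slope per neuron to emulate device-to-device
    variation (Gaussian spread around a nominal slope of 1 is an
    assumption, not the paper's device model)."""
    return rng.normal(loc=1.0, scale=slope_std, size=n_neurons)

def forward_hidden(x, W, b, slopes):
    """Hidden-layer forward pass where neuron i applies its own
    activation: sigma_i(z) = 1 / (1 + exp(-slope_i * z))."""
    z = x @ W + b
    return 1.0 / (1.0 + np.exp(-slopes * z))

# Toy usage: a flattened 28x28 MNIST-style input and 100 hidden neurons.
n_in, n_hidden = 784, 100
W = rng.normal(scale=0.05, size=(n_in, n_hidden))
b = np.zeros(n_hidden)
slopes = make_device_sigmoids(n_hidden)
x = rng.random(n_in)  # stand-in for one flattened MNIST image
h = forward_hidden(x, W, b, slopes)
```

In this sketch the variation is frozen per neuron, mirroring the idea that each fabricated device has a fixed, slightly different transfer curve; the network is then trained and evaluated with those fixed per-neuron activations.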