Reliability of Neural Networks Based on Spintronic Neurons

Eleonora Raimondo; Anna Giordano; Andrea Grimaldi; Giovanni Finocchio
2021-01-01

Abstract

Spintronic technology is emerging as a direction for the hardware implementation of neurons and synapses in neuromorphic architectures. In particular, a single spintronic device can be used to implement the nonlinear activation function of a neuron. Here, we present how to implement spintronic neurons with sigmoidal and rectified linear unit (ReLU)-like activation functions. We then perform a numerical experiment showing the reliability of neural networks made of spintronic neurons, each having a different activation function to emulate device-to-device variations in a possible hardware implementation of the network. To this end, we consider a 'vanilla' neural network trained to recognize the categories of the Modified National Institute of Standards and Technology (MNIST) database, and we show an average accuracy of 98.87% on the test dataset, very close to the 98.89% obtained in the ideal case (all neurons having the same sigmoid activation function). Similar results are obtained with neurons having a ReLU-like activation function.
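
The following is a minimal sketch (not the authors' code) of the kind of numerical experiment the abstract describes: a forward pass of a 'vanilla' network in which each hidden neuron applies its own perturbed sigmoid, emulating device-to-device variation. The network size, the per-neuron slope/offset parameters a and b, and the spread sigma are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_device_sigmoids(n_neurons, sigma=0.05):
    """Build a per-neuron sigmoid: each 'device' i gets its own slope a[i]
    and offset b[i], drawn at random to emulate device-to-device variation.
    (Parameterization and sigma are illustrative assumptions.)"""
    a = rng.normal(1.0, sigma, size=n_neurons)  # per-neuron slope
    b = rng.normal(0.0, sigma, size=n_neurons)  # per-neuron offset
    def activation(z):
        # z: (batch, n_neurons); broadcasting applies each neuron's own
        # transfer curve to its own pre-activation.
        return 1.0 / (1.0 + np.exp(-(a * z + b)))
    return activation

# Forward pass of a hypothetical 784-100-10 network with per-neuron activations.
n_in, n_hid, n_out = 784, 100, 10
W1 = rng.normal(0, 0.05, (n_in, n_hid))
W2 = rng.normal(0, 0.05, (n_hid, n_out))
act = make_device_sigmoids(n_hid)

def forward(x):
    h = act(x @ W1)   # each hidden neuron uses its own sigmoid curve
    return h @ W2     # linear readout (softmax/argmax applied downstream)

x = rng.random((32, n_in))  # stand-in for a batch of flattened MNIST images
print(forward(x).shape)     # (32, 10)
```

The ideal-case baseline reported in the abstract corresponds to sigma = 0 in this sketch, i.e., all neurons sharing the same sigmoid; a ReLU-like variant would replace the transfer curve accordingly.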
Files for this record:
No files are associated with this record.

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11570/3240367
Note: the displayed data have not been validated by the university.

Citations
  • PMC: n/a
  • Scopus: 9
  • Web of Science (ISI): 8