Neural networks are increasing in scale and sophistication, catalyzing the need for efficient hardware. An inevitability when transferring neural networks to hardware is that non-idealities impact performance. Hardware-aware training, where non-idealities are accounted for during training, is one way to recover performance, but at the cost of generality. In this work, we demonstrate a binary neural network consisting of an array of 20,000 magnetic tunnel junctions (MTJs) integrated on complementary metal-oxide-semiconductor (CMOS) chips. Across 36 dies, we show that even a few defects can degrade the performance of neural networks. We demonstrate hardware-aware training and show that performance recovers to near that of ideal networks. We then introduce a more robust method – statistics-aware training – that compensates for defects regardless of their specific configuration. When evaluated on the MNIST dataset, statistics-aware solutions differ from software baselines by only 2 %. We quantify the sensitivity of networks trained with statistics-aware and conventional methods and demonstrate that the statistics-aware solution is less sensitive to defects when sampling the network loss function.
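The core idea of statistics-aware training, as described above, is that the network is optimized against the statistics of defects rather than one measured defect map. A minimal software sketch of this idea is given below; the defect model (a fraction of weights stuck at a fixed value), the toy linear classifier, and all function names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_random_defects(weights, defect_rate=0.05, stuck_value=0.0):
    """Force a random fraction of weights to a stuck value, emulating
    defective cells (hypothetical defect model, not measured data)."""
    mask = rng.random(weights.shape) < defect_rate
    w = weights.copy()
    w[mask] = stuck_value
    return w

def train(X, y, epochs=200, lr=0.1, defect_rate=0.05):
    """Statistics-aware training of a toy softmax classifier: every
    gradient step sees a freshly sampled defect configuration, so the
    learned weights must work on average over defects rather than for
    any single configuration."""
    W = np.zeros((X.shape[1], y.shape[1]))
    for _ in range(epochs):
        W_eff = apply_random_defects(W, defect_rate)
        logits = X @ W_eff
        probs = np.exp(logits - logits.max(axis=1, keepdims=True))
        probs /= probs.sum(axis=1, keepdims=True)
        grad = X.T @ (probs - y) / len(X)
        W -= lr * grad
    return W
```

Because no single defect map is privileged during training, the resulting weights should tolerate whichever defect configuration a given die presents, which is the generality that per-die hardware-aware training gives up.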
Superparamagnetic tunnel junctions (SMTJs) and spin-torque nano-oscillators (STNOs) show promise for use in energy-efficient unconventional computing schemes based on stochastic information encodings, operating on nanosecond to microsecond time scales. We demonstrate electrical coupling of SMTJs to emulate neuro-synaptic connections, and we leverage the phase dynamics of STNOs for innovative approaches to unbiased random number generation, with the potential to mimic fast stochastic binary neurons and pave the way for low-energy, hardware-based stochastic neural networks.
Many probabilistic computing frameworks have been developed in recent years due to their potential as faster, energy-efficient alternatives to von Neumann computers for combinatorial optimization problems. In this work, we study the dynamics of a two-spin analog Ising computer implemented with superparamagnetic tunnel junctions (SMTJs). The operational-amplifier-based circuit features polarity selection and a programmable gain parameter, allowing us to achieve both positive and negative coupling and to perform simulated annealing by treating the gain as an inverse temperature. Experiments show that the correlation between coupled SMTJs approaches 1 in the high-gain limit. Scaling this design requires only trivial modifications to the circuit; however, scaling up to large networks of spins requires the development of SMTJs with enhanced properties, suggesting that a co-design approach between devices, architectures, and algorithms is necessary.
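The gain-as-inverse-temperature behavior described above can be illustrated with a small software stand-in: Gibbs sampling of two coupled binary spins, where the circuit gain plays the role of inverse temperature. The sampler below is a sketch under that assumption, not a model of the actual op-amp circuit; all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def spin_correlation(J, gain, steps=20000):
    """Gibbs-sample two coupled spins with energy E = -gain*J*s0*s1 and
    return the measured spin-spin correlation <s0*s1>. The sign of J
    selects ferromagnetic (J > 0) or antiferromagnetic (J < 0) coupling,
    mirroring the circuit's polarity selection."""
    s = np.array([1.0, 1.0])
    samples = []
    for _ in range(steps):
        for i in (0, 1):
            h = gain * J * s[1 - i]              # field from the other spin
            p_up = 1.0 / (1.0 + np.exp(-2.0 * h))
            s[i] = 1.0 if rng.random() < p_up else -1.0
        samples.append(s.copy())
    samples = np.array(samples)
    return float(np.mean(samples[:, 0] * samples[:, 1]))
```

In this toy model the exact two-spin correlation is tanh(gain × J), so the correlation approaches 1 as the gain grows, matching the high-gain limit observed experimentally; ramping the gain upward over time implements simulated annealing.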
Due to their interesting physical properties, myriad operational regimes, small size, and industrial fabrication maturity, magnetic tunnel junctions are uniquely suited for unlocking novel computing schemes for in-hardware neuromorphic computing. In this paper, we focus on the stochastic response of magnetic tunnel junctions, illustrating three different ways in which the probabilistic response of a device can be used to achieve useful neuromorphic computing power.
Magnetic tunnel junctions (MTJs) provide an attractive platform for implementing neural networks because of their simplicity, nonvolatility, and scalability. In a hardware realization, however, device variations, write errors, and parasitic resistance will generally degrade performance. To quantify such effects, we perform experiments on a 2-layer perceptron constructed from a 15 × 15 passive array of MTJs, examining classification accuracy and write fidelity. Despite imperfections, we achieve accuracy of up to 95.3 % with proper tuning of network parameters. The success of this tuning process shows that new metrics are needed to characterize and optimize networks reproduced in mixed-signal hardware.
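The non-idealities listed above (device variation and write errors in a passive array) can be sketched as perturbations to an analog matrix-vector multiply. The model below is purely illustrative: the lognormal conductance spread, the sign-flip write-error model, and the function name are assumptions, not the paper's measured device characteristics, and parasitic resistance is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(2)

def crossbar_mvm(weights, x, variation=0.05, write_error_rate=0.0):
    """Matrix-vector multiply through an idealized crossbar.
    Each stored weight is scaled by a lognormal factor (device-to-device
    variation), and a random fraction of cells is mis-written with the
    wrong sign (toy write-error model)."""
    G = weights * rng.lognormal(mean=0.0, sigma=variation, size=weights.shape)
    flips = rng.random(weights.shape) < write_error_rate
    G[flips] *= -1.0          # cells whose write operation failed
    return G @ x
```

Comparing `crossbar_mvm(W, x, variation=0.05)` against the ideal product `W @ x` gives a simple way to study how output error, and hence classification accuracy, degrades with variation, which is the kind of sensitivity the abstract's proposed new metrics would need to capture.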