16 September 2019 Semi-supervised learning and inference in domain-wall magnetic tunnel junction (DW-MTJ) neural networks
Abstract
Advances in machine intelligence have sparked interest in hardware accelerators for these algorithms, yet embedded electronics impose stringent power and area budgets and speed requirements that may limit non-volatile memory (NVM) integration. In this context, fast nanomagnetic neural networks that require minimal training data are attractive. Here, we extend an inference-only proposal that uses the intrinsic physics of domain-wall magnetic tunnel junction (DW-MTJ) neurons for online learning, implementing fully unsupervised pattern recognition with winner-take-all networks that contain either random or plastic synapses (weights). Meanwhile, a read-out layer trains in a supervised fashion. We find that our proposed design can approach state-of-the-art success on the task relative to competing memristive neural network proposals, while eliminating much of the area and energy overhead that would typically be required to build the neuronal layers with CMOS devices.
© (2019) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Christopher H. Bennett, Naimul Hassan, Xuan Hu, Jean Anne C. Incorvia, Joseph S. Friedman, and Matthew J. Marinella "Semi-supervised learning and inference in domain-wall magnetic tunnel junction (DW-MTJ) neural networks", Proc. SPIE 11090, Spintronics XII, 110903I (16 September 2019); https://doi.org/10.1117/12.2530308
PROCEEDINGS
7 PAGES + PRESENTATION