Presentation + Paper
Circuits that train themselves: decentralized, physics-driven learning
15 March 2023
Sam Dillavou, Benjamin Beyer, Menachem Stern, Marc Z. Miskin, Andrea J. Liu, and Douglas J. Durian
Proceedings Volume 12438, AI and Optical Data Sciences IV; 124380G (2023) https://doi.org/10.1117/12.2648618
Event: SPIE OPTO, 2023, San Francisco, California, United States
Abstract
In typical artificial neural networks, neurons adjust according to global calculations performed by a central processor, but in the brain, neurons and synapses self-adjust based on local information. A man-made self-adjusting (distributed) system capable of solving machine-learning tasks would have substantial scaling advantages over typical computational neural networks in power consumption, speed, and robustness to damage. Furthermore, such a system would allow us to study physical learning without the added complexity of biology. Here we unveil the second-generation design of such a system: a transistor-based self-adjusting analog network that trains itself to perform a wide variety of tasks. We demonstrate basic features of the system, including the ability to monitor all internal states. This platform is already faster than a simulation of itself, and is thus an exciting testbed for the investigation of physical learning.
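To make concrete how a network can "self-adjust based on local information," the sketch below implements coupled learning, the decentralized rule this group demonstrated in earlier resistor-network work: each edge compares its own voltage drop in a "free" state (inputs applied) and a "clamped" state (outputs nudged toward their targets) and updates its conductance from that purely local signal. This is a hypothetical toy simulation in Python; the network topology, parameters, and all function names are illustrative assumptions, not the transistor circuit described in this paper.

```python
# Minimal coupled-learning sketch on a linear resistor network.
# Hypothetical toy example; NOT the authors' transistor circuit.
import numpy as np

rng = np.random.default_rng(0)

# Small network: nodes 0..N-1, edges with trainable conductances k.
N = 6
edges = [(0, 2), (0, 3), (1, 2), (1, 3), (2, 4), (3, 4), (2, 5), (3, 5)]
k = rng.uniform(0.5, 1.5, len(edges))

inputs = {0: 0.3, 1: 0.9}  # imposed input voltages (illustrative values)
outputs = [4]              # node whose voltage should match a target
target = 0.5
eta = 0.1                  # nudge amplitude toward the target
alpha = 0.05               # learning rate

def solve(k, fixed):
    """Solve Kirchhoff's laws with the `fixed` node voltages held."""
    L = np.zeros((N, N))  # weighted graph Laplacian
    for (i, j), kij in zip(edges, k):
        L[i, i] += kij; L[j, j] += kij
        L[i, j] -= kij; L[j, i] -= kij
    V = np.zeros(N)
    held = list(fixed)
    free = [n for n in range(N) if n not in fixed]
    V[held] = list(fixed.values())
    # Free nodes carry no external current: L_ff V_f = -L_fc V_c.
    b = -L[np.ix_(free, held)] @ V[held]
    V[free] = np.linalg.solve(L[np.ix_(free, free)], b)
    return V

def drops(V):
    """Per-edge voltage drops -- the only signal each edge ever sees."""
    return np.array([V[i] - V[j] for i, j in edges])

for step in range(200):
    V_free = solve(k, inputs)                        # free state
    nudge = {o: V_free[o] + eta * (target - V_free[o]) for o in outputs}
    V_clamp = solve(k, {**inputs, **nudge})          # clamped state
    # Local rule: each edge updates from its own two voltage drops;
    # no global gradient computation is involved.
    k += -(alpha / eta) * (drops(V_clamp)**2 - drops(V_free)**2)
    k = np.clip(k, 0.01, None)                       # keep conductances physical

print(f"output voltage: {solve(k, inputs)[outputs[0]]:.3f} (target {target})")
```

Note that the update line touches only per-edge quantities: no error signal is propagated through the network by a central processor, which is what makes rules of this family compatible with a fully distributed analog implementation.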
Conference Presentation
© (2023) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.
Sam Dillavou, Benjamin Beyer, Menachem Stern, Marc Z. Miskin, Andrea J. Liu, and Douglas J. Durian "Circuits that train themselves: decentralized, physics-driven learning", Proc. SPIE 12438, AI and Optical Data Sciences IV, 124380G (15 March 2023); https://doi.org/10.1117/12.2648618
KEYWORDS
Machine learning, Computing systems, Education and training, Analog electronics, Computer architecture, Distributed computing, Neurons