Publication
VLSI Circuits 2019
Conference paper
Energy-efficient continual learning in hybrid supervised-unsupervised neural networks with PCM synapses
Abstract
Artificial neural networks (ANNs) can outperform humans at object recognition through supervised training of synaptic parameters on large datasets. Unlike the human brain, however, ANNs cannot learn continually, i.e., acquire new information without catastrophically forgetting previous knowledge. To address this issue, we present a novel hybrid neural network based on CMOS logic and phase-change memory (PCM) synapses, combining a supervised convolutional neural network (CNN) with bio-inspired unsupervised learning and neuronal redundancy. We demonstrate high classification accuracy on the MNIST and CIFAR10 datasets (98% and 85%, respectively) and energy-efficient continual learning of up to 30% of non-trained classes with 83% average accuracy.
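To illustrate the general scheme described in the abstract, the sketch below shows, in plain NumPy, how a frozen supervised front end can feed a winner-take-all Hebbian layer with redundant neurons so that new classes can be absorbed without backpropagation. Everything in it is an assumption made for illustration: the random-projection feature extractor stands in for the trained CNN, the floating-point weights stand in for PCM synaptic conductances, and the fixed neuron-to-class map, learning rate, and input sizes are hypothetical; it does not reproduce the paper's hardware or training procedure.

```python
# Minimal sketch of a hybrid supervised/unsupervised classifier (assumptions:
# NumPy only; the paper's CNN feature extractor and PCM hardware synapses are
# replaced by a random projection and ordinary floating-point weights).
import numpy as np

rng = np.random.default_rng(0)

FEAT_DIM = 64          # output size of the (frozen) feature extractor
NEURONS_PER_CLASS = 4  # neuronal redundancy: several neurons per class
N_CLASSES = 10
LEARNING_RATE = 0.05   # hypothetical value, for illustration only

# Stand-in for the pretrained, frozen CNN: a fixed random projection.
W_frozen = rng.standard_normal((FEAT_DIM, 28 * 28)) / np.sqrt(28 * 28)

def extract_features(x_flat):
    """Frozen 'supervised' front end (placeholder for the trained CNN)."""
    return np.maximum(W_frozen @ x_flat, 0.0)  # ReLU features

# Unsupervised layer: one weight vector per redundant neuron
# (these weights play the role of PCM synaptic conductances).
W_unsup = rng.standard_normal((N_CLASSES * NEURONS_PER_CLASS, FEAT_DIM)) * 0.01
# Simplified fixed neuron-to-class map (in practice labels would be assigned
# after unsupervised learning).
neuron_label = np.repeat(np.arange(N_CLASSES), NEURONS_PER_CLASS)

def hebbian_update(x_flat):
    """Winner-take-all Hebbian step: pull the winner's weights toward the input."""
    f = extract_features(x_flat)
    winner = int(np.argmax(W_unsup @ f))
    # Simple normalized Hebbian rule; a PCM implementation would instead
    # apply partial-SET pulses to the winning neuron's devices.
    W_unsup[winner] += LEARNING_RATE * (f - W_unsup[winner])
    return winner

def classify(x_flat):
    f = extract_features(x_flat)
    return int(neuron_label[np.argmax(W_unsup @ f)])

# Toy usage: present random "images" and let only the unsupervised layer adapt,
# leaving the frozen front end untouched.
for _ in range(100):
    hebbian_update(rng.random(28 * 28))
print("predicted class:", classify(rng.random(28 * 28)))
```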