Publication
IJCNN 2018
Conference paper

Spiking Neural Networks Enable Two-Dimensional Neurons and Unsupervised Multi-Timescale Learning

Abstract

The capabilities of artificial neural networks (ANNs) are limited by the operations possible at their individual neurons and synapses. For instance, each neuron's activation only represents a single scalar variable. In addition, because neuronal activations may be dominated by a single timescale in the synaptic input, unsupervised learning from data with multiple timescales has not generally been possible. Here we address these limitations by exploiting the continuous-time and asynchronous operation of spiking neural networks (SNNs), i.e. a biologically-inspired type of ANN. First, we demonstrate how input neurons can be two-dimensional (2D), i.e. each can represent two variables. Second, we show unsupervised learning from multiple timescales simultaneously. 2D neurons operate by allocating each variable to a different timescale in their activation: one variable corresponds to the timing of individual spikes, and the other to the spike rate. We show how these can be modulated separately but simultaneously, and we apply this mixed coding technique to encoding images with two modalities, namely colour and brightness. Unsupervised multi-timescale learning is achieved by synapses with spike-timing-dependent plasticity, combined with varying degrees of short-term plasticity. We demonstrate the successful application of this learning scheme to the unsupervised classification of the bimodal images encoded by our 2D neurons. Taken together, our results show that SNNs can increase both the information content of each neuron and the exploitable data in the input. We suggest that, through these unique features, SNNs may increase the performance and broaden the applicability of ANNs.
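A minimal sketch of the mixed coding idea described in the abstract, assuming (purely for illustration) that one variable, such as brightness, sets the spike rate while the second variable, such as colour, sets the phase of each spike within its inter-spike interval; the function name, parameter ranges, and this particular mapping are assumptions, not the paper's implementation.

```python
import numpy as np

def encode_2d(rate_var, timing_var, duration=1.0, max_rate=100.0):
    """Illustrative 2D spike encoding (hypothetical, not the paper's exact scheme).

    rate_var in [0, 1]   -> spike rate (e.g. brightness)
    timing_var in [0, 1] -> phase of each spike within its inter-spike
                            interval (e.g. colour)
    Returns spike times in seconds over the given duration.
    """
    rate = max(rate_var, 1e-3) * max_rate      # spikes per second
    period = 1.0 / rate                        # inter-spike interval (s)
    phase = timing_var * period                # timing code within each interval
    return np.arange(phase, duration, period)  # spike train as an array of times

# Example: same rate-coded value, different timing-coded value
spikes_a = encode_2d(rate_var=0.8, timing_var=0.1)
spikes_b = encode_2d(rate_var=0.8, timing_var=0.6)
```

Under this assumed scheme, a downstream decoder could recover the two variables independently: the spike count over the window gives the rate-coded variable, while the offset of each spike within its interval gives the timing-coded one.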
