Publication: IEEE Transactions on Biomedical Circuits and Systems (TBioCAS)
Spiking Optical Flow for Event-Based Sensors Using IBM's TrueNorth Neurosynaptic System
Abstract
This paper describes a fully spike-based neural network for optical flow estimation from dynamic vision sensor data. A low-power embedded implementation of the method, which combines the Asynchronous Time-based Image Sensor with IBM's TrueNorth Neurosynaptic System, is presented. The sensor generates spikes with sub-millisecond resolution in response to scene illumination changes. These spikes are processed by a spiking neural network running on TrueNorth with a 1-ms resolution to accurately determine the order and time difference of spikes from neighbouring pixels, and therefore infer the velocity. The spiking neural network is a variant of the Barlow-Levick method for optical flow estimation. The system is evaluated on two recordings for which ground-truth motion is available, and achieves an average endpoint error of 11% at an estimated power budget of under 80 mW for the sensor and computation combined.
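To make the core idea concrete, the sketch below illustrates the Barlow-Levick principle the abstract describes: the order of spikes at two neighbouring pixels gives the motion direction, and their time difference gives the speed. This is not the authors' TrueNorth implementation; the pixel pitch, function name, and tick handling are illustrative assumptions (only the 1-ms tick resolution comes from the abstract).

```python
# Minimal sketch (assumptions, not the paper's implementation) of
# Barlow-Levick-style velocity estimation from event timestamps.

PIXEL_PITCH_UM = 18.0  # assumed pixel pitch in micrometres (hypothetical value)
TICK_MS = 1.0          # TrueNorth processes spikes at 1-ms resolution (from the abstract)

def velocity_1d(t_a_ms, t_b_ms, pitch_um=PIXEL_PITCH_UM):
    """Estimate 1-D velocity from spike times at adjacent pixels A and B.

    The spike order encodes direction (positive means A fired first,
    i.e. motion from A toward B); the time difference encodes speed
    as pitch / delta-t. Returns micrometres per millisecond, or None
    when the two spikes fall within one tick and no order is resolvable.
    """
    dt = t_b_ms - t_a_ms
    if abs(dt) < TICK_MS:   # unresolvable at the 1-ms tick resolution
        return None
    return pitch_um / dt    # sign carries the motion direction

# Example: pixel A fires at t = 10 ms, its right neighbour B at t = 13 ms,
# implying rightward motion at 18 um / 3 ms = 6 um/ms.
print(velocity_1d(10.0, 13.0))  # 6.0
```

In the paper's setting this comparison is carried out entirely with spiking neurons on TrueNorth rather than arithmetic on timestamps, but the quantity being computed is the same order-and-delay measurement shown here.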