Energy-aware algorithmic engineering
Abstract
In this work, we argue that energy management should be a guiding principle for the design and implementation of algorithms. Traditional complexity models for algorithms are simple and do not aid in the design of energy-efficient algorithms. We conducted a large number of experiments to understand the energy consumption of algorithms, studying popular vector operations, matrix operations, sorting, and graph algorithms. We observed that the energy consumption of a given algorithm depends on the memory parallelism the algorithm can exhibit for a given data layout in RAM, with variations of up to 100% for many popular algorithms. Our experiments validate the asymptotic energy complexity model presented in a companion paper [1] and bring out many practical insights. We show that reads can be more expensive in terms of energy than writes, and that different data types can lead to different energy consumption. Our most important result is a theoretical and experimental quantification of the impact of parallel data sequences on energy consumption. We also observe that high memory parallelism, with multiple concurrent access sequences, can increase energy consumption. We use the insights from our experiments to propose algorithmic engineering techniques for practical, energy-efficient software.
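As an illustrative sketch only (not taken from the paper), the following C program contrasts a traversal with high memory-level parallelism (independent sequential loads that the hardware can overlap) against one with low parallelism (a dependent pointer chase), which is the kind of access-pattern difference the abstract attributes energy variations to. The constants, the pseudo-random successor function, and the suggestion to read the RAPL counter at /sys/class/powercap/intel-rapl:0/energy_uj are assumptions for illustration; that counter is only available on certain Intel Linux systems.

```c
/* Sketch: two traversals of the same data with different memory parallelism.
 * To estimate energy, sample an external meter (e.g. the RAPL energy_uj
 * counter on supported Intel Linux systems) before and after each loop. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (1L << 24)   /* 16M longs, chosen to exceed a typical last-level cache */

static double seconds(void) {
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

int main(void) {
    long *a    = malloc(N * sizeof(long));
    long *next = malloc(N * sizeof(long));
    for (long i = 0; i < N; i++) {
        a[i] = i;
        /* Full-period LCG step: defines a permutation of 0..N-1, so the
         * dependent chase below touches every element exactly once. */
        next[i] = (i * 2654435761L + 1) % N;
    }

    /* High memory parallelism: independent loads; prefetcher and
     * out-of-order execution can keep many requests in flight. */
    double t0 = seconds();
    long sum1 = 0;
    for (long i = 0; i < N; i++) sum1 += a[i];
    double t1 = seconds();

    /* Low memory parallelism: each load address depends on the previous
     * load, so misses serialize and the memory system idles in between. */
    long sum2 = 0, j = 0;
    for (long i = 0; i < N; i++) { j = next[j]; sum2 += a[j]; }
    double t2 = seconds();

    printf("sequential scan : %.3f s (sum=%ld)\n", t1 - t0, sum1);
    printf("dependent chase : %.3f s (sum=%ld)\n", t2 - t1, sum2);

    free(a);
    free(next);
    return 0;
}
```

Comparing the meter readings for the two loops gives a simple, reproducible way to observe how data layout and access dependence, rather than operation count alone, drive energy cost.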