Lior Horesh, Andrew Conn, et al.
NAOIII 2014
A distributed randomized block coordinate descent method for minimizing a convex function of a huge number of variables is proposed. The complexity of the method is analyzed under the assumption that the smooth part of the objective function is partially block separable. The number of iterations required is bounded by a function of the target error and the degree of separability, which extends the results of Richtárik and Takáč (Parallel Coordinate Descent Methods for Big Data Optimization, Mathematical Programming, DOI:10.1007/s10107-015-0901-6) to a distributed environment. Several approaches to distributing and synchronizing the computation across a cluster of multi-core computers are described, and promising computational results are provided.
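To make the basic update concrete, the following is a minimal, serial sketch of a randomized block coordinate descent step on a smooth convex quadratic; it is an illustration only, not the paper's distributed, partially separable implementation, and the function name, block partition, and test problem are assumptions made for this example.

```python
import numpy as np

def randomized_block_coordinate_descent(A, b, n_blocks=4, iters=1000, seed=0):
    """Illustrative serial sketch: minimize f(x) = 0.5 x^T A x - b^T x
    by repeatedly sampling one block of coordinates and taking a
    block gradient step with step size 1 / L_S (block Lipschitz constant)."""
    rng = np.random.default_rng(seed)
    n = b.shape[0]
    x = np.zeros(n)
    blocks = np.array_split(np.arange(n), n_blocks)  # fixed block partition (assumed)
    for _ in range(iters):
        S = blocks[rng.integers(n_blocks)]        # sample one block uniformly at random
        grad_S = A[S] @ x - b[S]                  # partial gradient restricted to block S
        L_S = np.linalg.norm(A[np.ix_(S, S)], 2)  # Lipschitz constant of the block
        x[S] -= grad_S / L_S                      # block coordinate update
    return x

# Small synthetic example (placeholder data).
A = np.diag(np.arange(1.0, 9.0))
b = np.ones(8)
x_approx = randomized_block_coordinate_descent(A, b)
```

In the distributed setting described in the abstract, the coordinate blocks would be partitioned across the nodes of a cluster, with each multi-core node updating its own blocks in parallel; the sketch above only shows the single-block update rule.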
Lam Nguyen, Katya Scheinberg, et al.
Optimization Methods and Software
Jakub Marecek, Robert Shorten, et al.
ICBDSC 2016
Jakub Marecek, Peter Richtárik, et al.
EJOR