Publication
INFORMS 2021
Talk
Graph Topology Invariant Gradient and Sampling Complexity for Decentralized and Stochastic Optimization
Abstract
A fundamental problem in constrained decentralized multi-agent optimization is the trade-off between gradient/sampling complexity and communication complexity. In this paper we propose new algorithms whose gradient and sampling complexities are graph topology invariant, while their communication complexities remain optimal. All of the aforementioned gradient and sampling complexities match the lower complexity bounds for centralized convex smooth optimization and are independent of the network structure. To the best of our knowledge, such gradient and sampling complexities have not previously been obtained in the literature on decentralized optimization over a constrained feasible set.
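To make the trade-off above concrete, the following is a minimal sketch of a *standard* decentralized projected gradient method (gossip averaging plus a local gradient step and projection), not the algorithms proposed in the paper. The ring graph, the local objectives f_i(x) = 0.5*(x - a_i)^2, and the box constraint [0, 1] are illustrative assumptions; note how each iteration costs one gradient evaluation per agent and one round of neighbor communication, the two quantities whose complexities the abstract contrasts.

```python
import numpy as np

# Hypothetical toy setup (not from the paper): 5 agents on a ring graph,
# each holding a local quadratic objective f_i(x) = 0.5 * (x - a_i)^2,
# jointly minimizing (1/n) * sum_i f_i(x) subject to x in [0, 1].
n = 5
a = np.array([0.1, 0.3, 0.5, 0.7, 0.9])  # local targets; the global minimizer is mean(a)

# Doubly stochastic mixing matrix for the ring. Each multiplication by W is
# one communication round; its spectral gap depends on the graph topology,
# which is exactly why communication complexity is usually topology-dependent.
W = 0.5 * np.eye(n)
for i in range(n):
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

alpha = 0.01      # step size
x = a.copy()      # each agent starts from its own local target
for _ in range(2000):
    grad = x - a                                  # one local gradient per agent
    x = np.clip(W @ x - alpha * grad, 0.0, 1.0)   # gossip, descend, project onto [0, 1]

# All agents approach the constrained minimizer mean(a) = 0.5,
# up to an O(alpha) consensus error.
print(x)
```

In this vanilla scheme, gradient evaluations and communication rounds are locked together one-to-one, so slow mixing on a poorly connected graph inflates both counts; the paper's contribution is decoupling them so the gradient/sampling complexity no longer depends on the graph.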