signSGD via Zeroth-Order Oracle
Sijia Liu, Pin-Yu Chen, et al.
ICLR 2019
Distributed learning has become a critical enabler of the massively connected world that many people envision. This article discusses four key elements of scalable distributed processing and real-time intelligence: problems, data, communication, and computation. Our aim is to provide a unique perspective of how these elements should work together in an effective and coherent manner. In particular, we selectively review recent techniques developed for optimizing nonconvex models (i.e., problem classes) that process batch and streaming data (data types) across networks in a distributed manner (communication and computation paradigm). We describe the intuitions and connections behind a core set of popular distributed algorithms, emphasizing how to balance computation and communication costs. Practical issues and future research directions will also be discussed.
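The title above refers to sign-based SGD driven by a zeroth-order (gradient-free) oracle: the gradient is estimated purely from function evaluations along random directions, and only its sign is used for the update. The following is a minimal sketch of that idea, not the paper's exact algorithm; the function name, step sizes, and the forward-difference estimator with `q` random Gaussian directions are illustrative choices.

```python
import numpy as np

def zo_sign_sgd(f, x0, mu=1e-3, lr=1e-2, q=20, steps=200, seed=0):
    """Illustrative zeroth-order signSGD sketch: estimate the gradient
    from function values only, then take a sign-descent step.
    All hyperparameter defaults here are assumptions, not the paper's."""
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    d = x.size
    f0 = f(x)
    for _ in range(steps):
        g = np.zeros(d)
        for _ in range(q):
            u = rng.standard_normal(d)               # random probe direction
            g += (f(x + mu * u) - f0) / mu * u       # forward-difference estimate
        g /= q                                       # average over q probes
        x -= lr * np.sign(g)                         # sign-based update
        f0 = f(x)
    return x

# usage: minimize a simple separable quadratic with minimizer at all-ones
x_star = zo_sign_sgd(lambda x: np.sum((x - 1.0) ** 2), np.zeros(5))
```

Because each coordinate moves by exactly `lr` per step, the iterate converges to a neighborhood of the minimizer whose radius scales with the step size, which is the characteristic behavior of sign-based methods.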
Xiangyi Chen, Sijia Liu, et al.
ICLR 2019
Kaidi Xu, Hongge Chen, et al.
IJCAI 2019
Songtao Lu, Jason Lee, et al.
IEEE TSP