Compressed distributed gradient descent: Communication-efficient consensus over networks

Abstract

Network consensus optimization has received increasing attention in recent years and has found important applications in many scientific and engineering fields. To solve network consensus optimization problems, one of the most well-known approaches is the distributed gradient descent method (DGD). However, in networks with slow communication rates, DGD's performance is unsatisfactory for solving high-dimensional network consensus problems due to the communication bottleneck. This motivates us to design a communication-efficient DGD-type algorithm based on compressed information exchanges. Our contributions in this paper are three-fold: i) We develop a communication-efficient algorithm called amplified-differential compression DGD (ADC-DGD) and show that it converges under any unbiased compression operator; ii) We rigorously prove the convergence performance of ADC-DGD and show that it matches that of DGD without compression; iii) We reveal an interesting phase-transition phenomenon in the convergence speed of ADC-DGD. Collectively, our findings advance the state-of-the-art of network consensus optimization theory.
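For intuition, the minimal sketch below illustrates two ingredients the abstract refers to, using standard textbook forms rather than the paper's exact scheme: an unbiased compression operator (here, a stochastic quantizer satisfying E[Q(x)] = x) and a plain DGD-style consensus update in which nodes exchange compressed iterates. The mixing matrix W, step size alpha, and local quadratic objectives are illustrative assumptions, and the amplified-differential mechanism of ADC-DGD itself is not reproduced here.

```python
# Hypothetical sketch: an unbiased compressor and a DGD-style update with
# naively compressed exchanges (not the paper's ADC-DGD implementation).
import numpy as np

def stochastic_quantize(x, levels=4):
    """Stochastic quantization, a common unbiased compressor: E[Q(x)] = x."""
    norm = np.linalg.norm(x)
    if norm == 0.0:
        return x
    scaled = np.abs(x) / norm * levels           # map magnitudes into [0, levels]
    lower = np.floor(scaled)
    prob = scaled - lower                        # round up with this probability
    quantized = lower + (np.random.rand(*x.shape) < prob)
    return np.sign(x) * quantized * norm / levels

def dgd_step(X, W, grads, alpha, compress=stochastic_quantize):
    """One DGD-style iteration where neighbors exchange compressed iterates.

    X:     (n_nodes, dim) current local estimates
    W:     (n_nodes, n_nodes) doubly stochastic mixing matrix
    grads: (n_nodes, dim) local gradients evaluated at X
    """
    X_compressed = np.array([compress(x) for x in X])  # what each node transmits
    return W @ X_compressed - alpha * grads            # consensus step + local gradient step

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 4, 10
    W = np.full((n, n), 1.0 / n)                 # toy complete graph with uniform weights
    targets = rng.normal(size=(n, d))            # local objectives f_i(x) = ||x - t_i||^2 / 2
    X = np.zeros((n, d))
    for _ in range(200):
        grads = X - targets                      # gradient of each local quadratic
        X = dgd_step(X, W, grads, alpha=0.05)
    print("disagreement across nodes:", np.linalg.norm(X - X.mean(axis=0)))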

Publication
In IEEE Conference on Computer Communications (2019)
Xin Zhang
Research Scientist