Low Sample and Communication Complexities in Decentralized Learning: A Triple Hybrid Approach

Abstract

Decentralized optimization has received increasing attention in recent years due to its advantages in system robustness, data privacy, and implementation simplicity. In this paper, we propose a decentralized hybrid stochastic gradient descent (DHSGD) algorithm for efficiently solving nonconvex optimization problems in a decentralized fashion. We show that to reach an $\epsilon^2$-stationary solution, the total sample complexity of our algorithm is $O(\epsilon^{-3})$ and the communication complexity is $O(\epsilon^{-3})$, which improves upon the $O(\epsilon^{-4})$ sample and communication complexities of existing decentralized stochastic gradient algorithms. We conduct extensive experiments with a variety of learning models to verify our theoretical findings. The results show that our algorithm outperforms existing methods when training on large-scale datasets.
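The abstract describes DHSGD only at a high level; the full algorithm is given in the paper. As a rough, hypothetical sketch of what one decentralized hybrid gradient step could look like, the snippet below combines a STORM-style hybrid variance-reduced estimator with gossip averaging over a doubly stochastic mixing matrix. The toy least-squares loss, function names, and parameter values are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: one round of a decentralized hybrid SGD update.
# Assumes a gossip step over a doubly stochastic mixing matrix W and a
# STORM-style hybrid gradient estimator; all names here are illustrative.
import numpy as np


def local_grad(x, batch):
    """Stochastic gradient of a toy least-squares loss on one node's mini-batch."""
    A, b = batch
    return A.T @ (A @ x - b) / len(b)


def dhsgd_step(X, V, W, batches, eta=0.05, beta=0.1):
    """One communication round on all nodes.

    X       : (n_nodes, dim) local iterates
    V       : (n_nodes, dim) local hybrid gradient estimators
    W       : (n_nodes, n_nodes) doubly stochastic mixing matrix
    batches : one fresh mini-batch (A_i, b_i) per node, drawn this round
    """
    X_prev = X.copy()
    X_new = W @ X - eta * V                      # gossip average + local descent
    V_new = np.empty_like(V)
    for i in range(X.shape[0]):
        g_new = local_grad(X_new[i], batches[i])  # same samples at new iterate
        g_old = local_grad(X_prev[i], batches[i])  # ... and at previous iterate
        # Hybrid estimator: convex combination of a plain stochastic gradient
        # and a recursive (variance-reduced) correction term.
        V_new[i] = beta * g_new + (1 - beta) * (V[i] + g_new - g_old)
    return X_new, W @ V_new                      # gradient trackers are mixed too
```

The hybrid estimator reuses the same mini-batch at the current and previous iterates, which is what lets variance-reduced methods of this type avoid large batches while improving on the $O(\epsilon^{-4})$ complexity of plain decentralized SGD.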

Publication
In IEEE Conference on Computer Communications 2021
Xin Zhang