Private and communication-efficient edge learning: a sparse differential Gaussian-masking distributed SGD approach

Abstract

With the rise of machine learning (ML) and the proliferation of smart mobile devices, recent years have witnessed a surge of interest in performing ML in wireless edge networks. In this paper, we consider the problem of jointly improving data privacy and communication efficiency of distributed edge learning, both of which are critical performance metrics in wireless edge network computing. Toward this end, we propose a new decentralized stochastic gradient method with sparse differential Gaussian-masked stochastic gradients (SDM-DSGD) for non-convex distributed edge learning. Our main contributions are three-fold: i) We propose a generalized differential-coded DSGD update, which enables a much lower transmit probability for gradient sparsification, and provide its convergence rate; ii) We theoretically establish the privacy and communication-efficiency guarantees of our SDM-DSGD method, which outperforms all existing works; and iii) We reveal theoretical insights and offer practical design guidelines on the interactions between privacy preservation and communication efficiency, two conflicting performance goals. We conduct extensive experiments with a variety of learning models on the MNIST and CIFAR-10 datasets to verify our theoretical findings. Collectively, our results contribute to the theory and algorithm design for distributed edge learning.
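To make the abstract's key ingredients concrete, the following is a minimal sketch of one node's transmitted message combining the three mechanisms named above: differential coding of the gradient, Bernoulli sparsification with a low transmit probability, and Gaussian masking for privacy. All function and parameter names here are illustrative assumptions, not the paper's actual notation or implementation.

```python
import numpy as np

def sdm_dsgd_message(grad, prev_grad, transmit_prob, noise_std, rng):
    """Illustrative SDM-DSGD-style message (names are assumptions, not the paper's).

    Steps: (1) differential coding: send the change in the gradient rather than
    the gradient itself; (2) Gaussian masking: add noise for privacy; (3) sparse
    transmission: each coordinate is sent only with probability transmit_prob,
    rescaled so the message remains unbiased in expectation.
    """
    diff = grad - prev_grad                          # (1) differential coding
    noise = rng.normal(0.0, noise_std, diff.shape)   # (2) Gaussian mask
    keep = rng.random(diff.shape) < transmit_prob    # (3) Bernoulli sparsification
    # Unbiased scaling: E[message] = diff when noise has zero mean.
    return np.where(keep, (diff + noise) / transmit_prob, 0.0)

# Example: with transmit_prob=1 and no noise, the message is exactly the difference.
rng = np.random.default_rng(0)
g, g_prev = np.array([1.0, 2.0, 3.0]), np.array([0.5, 0.5, 0.5])
msg = sdm_dsgd_message(g, g_prev, transmit_prob=1.0, noise_std=0.0, rng=rng)
```

A smaller `transmit_prob` sends fewer coordinates per round (better communication efficiency), while a larger `noise_std` strengthens the privacy guarantee at some cost in convergence speed; the paper's analysis characterizes this trade-off.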

Publication
In Proceedings of the Twenty-First International Symposium on Theory, Algorithmic Foundations, and Protocol Design for Mobile Networks and Mobile Computing
Xin Zhang
Research Scientist