Academy & Industry Research Collaboration Center (AIRCC)

Volume 11, Number 13, August 2021

Divide-and-Conquer Federated Learning Under Data Heterogeneity

  Authors

Pravin Chandran, Raghavendra Bhat, Avinash Chakravarthy and Srikanth Chandar, Intel Technology India Pvt. Ltd, Bengaluru, India

  Abstract

Federated Learning allows training on data stored in distributed devices without the need to centralize training data, thereby maintaining data privacy. The ability to handle data heterogeneity (non-independent and identically distributed, or non-IID, data) is a key enabler for the wider deployment of Federated Learning. In this paper, we propose a novel Divide-and-Conquer training methodology that enables the use of the popular FedAvg aggregation algorithm by overcoming FedAvg's acknowledged limitations in non-IID environments. We propose a novel use of a Cosine-distance based Weight Divergence metric to determine the exact point at which a Deep Learning network can be divided into class-agnostic initial layers and class-specific deep layers for Divide-and-Conquer training. We show that the methodology achieves trained-model accuracy at par with (and in certain cases exceeding) that of state-of-the-art algorithms such as FedProx and FedMA. We also show that this methodology leads to compute and/or bandwidth optimizations under certain documented conditions.
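As a rough illustration of the idea described above, the sketch below computes a per-layer cosine-distance based weight divergence between a client model and the global model and returns the first layer index where the divergence crosses a threshold; layers before that index would be aggregated as class-agnostic, the rest treated as class-specific. The function names, the threshold value, and the use of a single client (rather than an aggregate over clients) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def cosine_distance(w_client, w_global):
    """Cosine distance between two weight tensors, flattened to vectors."""
    a, b = w_client.ravel(), w_global.ravel()
    denom = np.linalg.norm(a) * np.linalg.norm(b) + 1e-12  # guard against zero norm
    return 1.0 - float(np.dot(a, b) / denom)

def find_split_layer(client_weights, global_weights, threshold=0.3):
    """Hypothetical split-point selection (threshold is an assumed value).

    client_weights / global_weights: lists of per-layer weight arrays,
    ordered from the input layer to the output layer.
    Returns the index of the first layer whose weight divergence exceeds
    the threshold; earlier layers are treated as class-agnostic (FedAvg
    aggregated), later layers as class-specific.
    """
    for idx, (w_c, w_g) in enumerate(zip(client_weights, global_weights)):
        if cosine_distance(w_c, w_g) > threshold:
            return idx
    return len(client_weights)  # no layer diverged: aggregate the whole model

# Toy usage with random "layer" weights standing in for a real network.
rng = np.random.default_rng(0)
global_w = [rng.normal(size=(16, 16)) for _ in range(4)]
client_w = [w + rng.normal(scale=s, size=w.shape)
            for w, s in zip(global_w, (0.01, 0.05, 0.5, 1.0))]
print(find_split_layer(client_w, global_w))  # later, noisier layers diverge more
```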

  Keywords

Federated Learning, Divide and Conquer, Weight Divergence.