Article

Cross-Silo Federated Learning for Multi-Tier Networks with Vertical and Horizontal Data Partitioning

Publisher

Association for Computing Machinery (ACM)
DOI: 10.1145/3543433

Keywords

Coordinate descent; federated learning; machine learning; stochastic gradient descent

Funding

  1. Rensselaer-IBM AI Research Collaboration
  2. IBM AI Horizons Network
  3. National Science Foundation [CNS 1553340, CNS 1816307]

Abstract

We consider federated learning in tiered communication networks. Our network model consists of a set of silos, each holding a vertical partition of the data. Each silo contains a hub and a set of clients, with the silo's vertical data shard partitioned horizontally across its clients. We propose Tiered Decentralized Coordinate Descent (TDCD), a communication-efficient decentralized training algorithm for such two-tiered networks. The clients in each silo perform multiple local gradient steps before sharing updates with their hub to reduce communication overhead. Each hub adjusts its coordinates by averaging its clients' updates, and then hubs exchange intermediate updates with one another. We present a theoretical analysis of our algorithm and show the dependence of the convergence rate on the number of vertical partitions and the number of local updates. We further validate our approach empirically via simulation-based experiments using a variety of datasets and objectives.
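The tiered structure described in the abstract can be illustrated with a small sketch, assuming a least-squares objective. This is not the authors' implementation: the silo and client counts, the learning rate, and the number of local steps `Q` are all hypothetical, and hub-to-hub exchange is simplified to summing partial predictions once per communication round.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 2 silos (vertical feature splits), 2 clients per silo
# (horizontal sample splits). All values here are illustrative.
n, d = 200, 10
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

K = 2                                    # silos / vertical partitions
C = 2                                    # clients per silo
feat = np.array_split(np.arange(d), K)   # feature block held by each silo
rows = np.array_split(np.arange(n), C)   # sample shard held by each client
Q = 5                                    # local gradient steps per round
lr = 0.01

# Each hub maintains its own coordinate block of the model.
w = [np.zeros(len(f)) for f in feat]

for _ in range(100):
    # Hubs exchange intermediate updates: here, partial predictions X_k w_k.
    partial = sum(X[:, feat[k]] @ w[k] for k in range(K))
    new_w = []
    for k in range(K):
        # The other silos' contribution is held fixed (stale) during
        # the local steps, as in coordinate-descent-style training.
        others = partial - X[:, feat[k]] @ w[k]
        client_ws = []
        for c in range(C):
            wk = w[k].copy()
            Xc = X[np.ix_(rows[c], feat[k])]
            for _ in range(Q):
                resid = Xc @ wk + others[rows[c]] - y[rows[c]]
                wk -= lr * Xc.T @ resid / len(rows[c])
            client_ws.append(wk)
        # The hub averages its clients' updated coordinate blocks.
        new_w.append(np.mean(client_ws, axis=0))
    w = new_w

loss = np.mean((sum(X[:, feat[k]] @ w[k] for k in range(K)) - y) ** 2)
```

Performing `Q` local steps before each hub round is what reduces client-hub communication, at the cost of working with stale partial predictions from the other silos; the paper's analysis quantifies how the convergence rate depends on `Q` and on the number of vertical partitions `K`.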
