Article

Distributed Networked Real-Time Learning

Journal

IEEE Transactions on Control of Network Systems

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TCNS.2020.3029992

Keywords

Asynchronous computing; distributed computing; networks; nonconvex optimization; real-time machine learning

Funding

  1. National Science Foundation [ECCS-1933878]
  2. Air Force Office of Scientific Research [15RT0767]

This article addresses the problem of learning models over a distributed architecture of interconnected local nodes when streaming data cannot be transferred to a single location in a timely manner. It proposes a distributed scheme in which each local node implements stochastic gradient updates based on its local data stream, with a network regularization penalty used to maintain cohesion in the ensemble of models. The results show that the ensemble average approximates a stationary point, characterize how far individual models deviate from the ensemble average, and indicate that the proposed approach is more robust than federated learning to heterogeneity in the data streams.

Many machine learning algorithms have been developed under the assumption that datasets are already available in batch form. Yet, in many application domains, data are only available sequentially over time via compute nodes in different geographic locations. In this article, we consider the problem of learning a model when streaming data cannot be transferred to a single location in a timely fashion. In such cases, a distributed architecture for learning which relies on a network of interconnected local nodes is required. We propose a distributed scheme in which every local node implements stochastic gradient updates based upon a local data stream. To ensure robust estimation, a network regularization penalty is used to maintain a measure of cohesion in the ensemble of models. We show that the ensemble average approximates a stationary point and characterize the degree to which individual models differ from the ensemble average. We compare the results with federated learning to conclude that the proposed approach is more robust to heterogeneity in data streams (data rates and estimation quality). We illustrate the results with an application to image classification with a deep learning model based upon convolutional neural networks.
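For intuition only, below is a minimal sketch of the kind of update the abstract describes: each node takes a stochastic gradient step on a sample from its own data stream, plus a network regularization term that pulls its model toward its neighbors' models, and the ensemble average is formed across nodes. This is not the authors' algorithm or code; the quadratic penalty form, the toy least-squares model, the synchronous update loop, and all names (distributed_step, step_size, penalty, neighbors) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_gradient(theta, x, y):
    """Stochastic gradient of a least-squares loss on one sample (placeholder model)."""
    return (x @ theta - y) * x

def distributed_step(thetas, neighbors, streams, step_size=0.05, penalty=0.1):
    """One synchronous round of network-regularized stochastic gradient updates."""
    new_thetas = []
    for i, theta_i in enumerate(thetas):
        x, y = next(streams[i])                  # draw one sample from node i's stream
        grad = local_gradient(theta_i, x, y)     # local stochastic gradient
        # Network regularization (assumed quadratic): pull toward neighboring models.
        cohesion = sum(theta_i - thetas[j] for j in neighbors[i])
        new_thetas.append(theta_i - step_size * (grad + penalty * cohesion))
    return new_thetas

# Toy setup: 3 nodes on a line graph, each with its own (heterogeneous) data stream.
d = 5
true_theta = rng.normal(size=d)

def make_stream(noise):
    while True:
        x = rng.normal(size=d)
        yield x, x @ true_theta + noise * rng.normal()

neighbors = {0: [1], 1: [0, 2], 2: [1]}
streams = [make_stream(0.1), make_stream(0.5), make_stream(1.0)]  # different noise levels
thetas = [np.zeros(d) for _ in range(3)]

for _ in range(2000):
    thetas = distributed_step(thetas, neighbors, streams)

ensemble_average = np.mean(thetas, axis=0)
print("distance of ensemble average to true model:",
      np.linalg.norm(ensemble_average - true_theta))
```

In the paper's setting the nodes operate asynchronously and at heterogeneous data rates, which the synchronous loop above deliberately ignores; the sketch is only meant to show how a local stochastic gradient and a cohesion penalty combine in each update and how the ensemble average is formed.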
