4.7 Article

Multi-task peer-to-peer learning using an encoder-only transformer model

Related references

Note: Only some of the references are listed.
Article Computer Science, Artificial Intelligence

Peer-to-peer deep learning with non-IID data

Robert Sajina et al.

Summary: This paper proposes a decentralized variant of the P2P gossip averaging method that adapts Batch Normalization (BN) to peer-to-peer architectures. By mitigating the effects of non-IID data, the approach improves the learning performance and accuracy of models in decentralized systems (a minimal sketch of this scheme appears after this entry).

EXPERT SYSTEMS WITH APPLICATIONS (2023)
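
The following is a minimal sketch of the gossip-averaging idea described above, not the paper's actual implementation: each peer periodically averages its weights with one randomly chosen neighbour, while Batch Normalization parameters stay local. The dict-of-arrays model format, the "bn" naming convention, and the fully connected peer topology are illustrative assumptions.

```python
# Hedged sketch of P2P gossip averaging with peer-local BN (illustrative,
# not the authors' code). Each model is assumed to be a dict of NumPy arrays
# whose BN entries contain "bn" in their name.
import random
import numpy as np

def gossip_round(peers, is_bn=lambda name: "bn" in name):
    """One gossip round: every peer pairwise-averages with a random neighbour."""
    for i, model in enumerate(peers):
        neighbour = peers[random.choice([k for k in range(len(peers)) if k != i])]
        for name, value in model.items():
            if is_bn(name):
                continue  # BN parameters and statistics stay peer-local
            avg = (value + neighbour[name]) / 2.0  # pairwise gossip average
            model[name] = avg
            neighbour[name] = avg.copy()

# Usage: three peers sharing a toy model with one BN layer.
shapes = {"fc1.weight": (4, 8), "bn1.gamma": (4,), "fc2.weight": (2, 4)}
peers = [{n: np.random.randn(*s) for n, s in shapes.items()} for _ in range(3)]
gossip_round(peers)
```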

Article Computer Science, Theory & Methods

Multi-Task Federated Learning for Personalised Deep Neural Networks in Edge Computing

Jed Mills et al.

Summary: Federated Learning (FL) is an emerging approach that allows collaborative training of Deep Neural Networks (DNNs) on mobile devices without data leaving the devices. The authors propose a Multi-Task Federated Learning (MTFL) algorithm that introduces non-federated Batch-Normalization (BN) layers to improve individual user model accuracy (UA) and convergence speed. Experiments on MNIST and CIFAR10 show a significant reduction in the number of rounds required to reach the target UA (a sketch of the non-federated-BN aggregation idea appears after this entry).

IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS (2022)
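
As a hedged illustration of the MTFL idea (not the paper's reference code), the sketch below performs FedAvg-style weighted averaging on the server while leaving every Batch-Normalization layer out of the aggregation, so those layers remain private to each client. The plain-dict model format and the "bn" substring naming are assumptions.

```python
# Illustrative FedAvg with non-federated BN layers (assumed naming: BN
# parameters contain "bn"); not the MTFL reference implementation.
import numpy as np

def fedavg_except_bn(client_models, client_sizes):
    """Weighted-average the shared (non-BN) parameters and push them back."""
    total = float(sum(client_sizes))
    shared = [n for n in client_models[0] if "bn" not in n]
    global_update = {
        name: sum(m[name] * (s / total) for m, s in zip(client_models, client_sizes))
        for name in shared
    }
    for model in client_models:  # clients keep their own BN layers untouched
        for name, value in global_update.items():
            model[name] = value.copy()
    return global_update

# Usage: two clients holding 60 and 40 samples respectively.
shapes = {"conv1.weight": (8, 3, 3, 3), "bn1.weight": (8,), "fc.weight": (10, 128)}
clients = [{n: np.random.randn(*s) for n, s in shapes.items()} for _ in range(2)]
fedavg_except_bn(clients, [60, 40])
```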

Article Computer Science, Theory & Methods

A state-of-the-art survey on solving non-IID data in Federated Learning

Xiaodong Ma et al.

Summary: This article analyzes the problems that non-IID data causes in federated learning and lays out the resulting challenges. By classifying and reviewing existing methods, it finds that non-IID data not only degrades the performance of FL models but also discourages users from participating actively. Compared with data-sharing approaches, improving the federated learning algorithm itself is the more common way to address the problem (one common way to simulate non-IID client data is sketched after this entry).

FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE (2022)
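
For context, the sketch below shows one common way to simulate the non-IID label skew that the survey discusses: a Dirichlet split of class labels across clients. The client count, alpha value, and synthetic labels are arbitrary illustrative choices, not taken from the article.

```python
# Dirichlet label-skew partition, a standard way to create non-IID client
# data for FL experiments (alpha and client count are illustrative).
import numpy as np

def dirichlet_partition(labels, num_clients=10, alpha=0.5, seed=0):
    rng = np.random.default_rng(seed)
    clients = [[] for _ in range(num_clients)]
    for cls in np.unique(labels):
        idx = np.flatnonzero(labels == cls)
        rng.shuffle(idx)
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, part in zip(clients, np.split(idx, cuts)):
            client.extend(part.tolist())  # smaller alpha -> stronger skew
    return clients

# Usage: 1000 samples over 10 classes split across 10 clients.
labels = np.random.default_rng(1).integers(0, 10, size=1000)
partitions = dirichlet_partition(labels)
```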

Article Computer Science, Hardware & Architecture

Toward Resource-Efficient Federated Learning in Mobile Edge Computing

Rong Yu et al.

Summary: This article surveys the application of federated learning in mobile edge computing and the associated resource-optimization methods. It groups resource-efficient techniques into two main categories: black-box methods, which include training tricks, client selection, data compensation, and hierarchical aggregation; and white-box methods, which include model compression, knowledge distillation, feature fusion, and asynchronous updates (one white-box technique is sketched after this entry).

IEEE NETWORK (2021)
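
As a hedged example of one white-box category named above (model compression), the sketch below applies top-k sparsification to a client's weight update before upload; the keep ratio and dict-of-arrays format are illustrative assumptions, not details from the article.

```python
# Top-k sparsification of a model update, a simple instance of the
# "model compression" white-box category (keep ratio is illustrative).
import numpy as np

def sparsify_update(update, keep_ratio=0.01):
    """Zero out all but the largest-magnitude entries of the update."""
    flat = np.concatenate([v.ravel() for v in update.values()])
    k = max(1, int(keep_ratio * flat.size))
    threshold = np.partition(np.abs(flat), -k)[-k]  # k-th largest magnitude
    # In a real system only the surviving indices and values are transmitted.
    return {name: np.where(np.abs(v) >= threshold, v, 0.0)
            for name, v in update.items()}

# Usage: compress a random update for a small two-layer model.
update = {"fc1.weight": np.random.randn(64, 128), "fc2.weight": np.random.randn(10, 64)}
sparse = sparsify_update(update, keep_ratio=0.05)
```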

Article Computer Science, Artificial Intelligence

Personalized Multitask Learning for Predicting Tomorrow's Mood, Stress, and Health

Sara Taylor et al.

IEEE TRANSACTIONS ON AFFECTIVE COMPUTING (2020)

Article Computer Science, Artificial Intelligence

Distributed optimization for deep learning with gossip exchange

Michael Blot et al.

NEUROCOMPUTING (2019)