TY - GEN
T1 - Continual Local Updates for Federated Learning with Enhanced Robustness to Link Noise
AU - Lari, Ehsan
AU - Gogineni, Vinay Chakravarthi
AU - Arablouei, Reza
AU - Werner, Stefan
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023
Y1 - 2023
N2 - Communication errors caused by noisy links can degrade the accuracy of federated learning (FL) algorithms. To address this issue, we introduce an FL algorithm that is robust to communication errors while simultaneously reducing the communication load on clients. To formulate the proposed algorithm, we consider a weighted least-squares regression problem as a motivating example. We recast this problem as a distributed optimization problem over a federated network that employs random scheduling to enhance communication efficiency, and we solve the reformulated problem via the alternating direction method of multipliers. Unlike conventional FL approaches that employ random scheduling, the proposed algorithm allows clients to continually update their local model estimates even when the server does not select them to participate in FL. This more frequent and ongoing client involvement improves performance and enhances robustness to communication errors compared with performing local updates only when the respective clients are selected by the server. We demonstrate the effectiveness and performance gains of the proposed algorithm through simulations.
AB - Communication errors caused by noisy links can degrade the accuracy of federated learning (FL) algorithms. To address this issue, we introduce an FL algorithm that is robust to communication errors while simultaneously reducing the communication load on clients. To formulate the proposed algorithm, we consider a weighted least-squares regression problem as a motivating example. We recast this problem as a distributed optimization problem over a federated network that employs random scheduling to enhance communication efficiency, and we solve the reformulated problem via the alternating direction method of multipliers. Unlike conventional FL approaches that employ random scheduling, the proposed algorithm allows clients to continually update their local model estimates even when the server does not select them to participate in FL. This more frequent and ongoing client involvement improves performance and enhances robustness to communication errors compared with performing local updates only when the respective clients are selected by the server. We demonstrate the effectiveness and performance gains of the proposed algorithm through simulations.
U2 - 10.1109/APSIPAASC58517.2023.10317446
DO - 10.1109/APSIPAASC58517.2023.10317446
M3 - Conference contribution
AN - SCOPUS:85180012380
T3 - Asia Pacific Signal and Information Processing Association Annual Summit and Conference Proceedings
SP - 1199
EP - 1203
BT - 2023 Asia Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC)
PB - IEEE
T2 - 2023 Asia Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2023
Y2 - 31 October 2023 through 3 November 2023
ER -