
Event Details

Central Acceleration for Federated Optimization with Stochastic and Deterministic Client Selection

Presenter: Lei Zhao
Supervisor:

Date: Fri, August 25, 2023
Time: 10:00:00
Place: ECS 467

ABSTRACT

Federated learning (FL) is a promising technique for training shared, optimized models on distributed private datasets. However, FL algorithms often face challenges in convergence rate and communication cost. To address these challenges, we propose two methods: adaptive central acceleration with stochastic client selection, and Nesterov-accelerated federated optimization with deterministic client selection. Both methods accelerate training while preserving data privacy, since the central update relies solely on global information. The stochastic client selection method works well when client availability is uncertain, while the deterministic selection method suits settings with reliable, continuous client access. Experimental results demonstrate the effectiveness of the proposed methods: they not only converge faster than the baselines but also significantly reduce communication costs, and they lower the overall computing resources required for local training. In particular, the proposed solution achieves up to three times faster convergence than the benchmark and comparable accuracy while using only 1% and 5% of the clients. By balancing convergence speed, model accuracy, data privacy, and resource efficiency, our proposed methods contribute to the advancement of FL techniques.
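As a rough illustration of the idea of a centrally accelerated update, the sketch below applies a server-side Nesterov-style momentum step to the average of the updates returned by a randomly selected subset of clients. This is a minimal sketch, not the presenter's actual algorithm: the local least-squares objective, the function names, and all hyperparameters (client fraction, momentum coefficient, learning rate) are assumptions made for illustration.

```python
import numpy as np

def local_update(w, data, lr=0.1, steps=5):
    """One client's local SGD on an illustrative least-squares objective."""
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def federated_nesterov(clients, w0, rounds=100, fraction=0.3, beta=0.9, seed=0):
    """Server averages client deltas, then applies a Nesterov-style
    momentum extrapolation centrally (only global information is used)."""
    rng = np.random.default_rng(seed)
    w, momentum = w0.copy(), np.zeros_like(w0)
    k = max(1, int(fraction * len(clients)))
    for _ in range(rounds):
        # Stochastic client selection: sample a small subset each round
        selected = rng.choice(len(clients), size=k, replace=False)
        # Average of client updates relative to the current global model
        delta = np.mean([local_update(w, clients[i]) - w for i in selected],
                        axis=0)
        prev = momentum
        momentum = beta * momentum + delta
        # Look-ahead step: extrapolate along the momentum direction
        w = w + momentum + beta * (momentum - prev)
    return w
```

A deterministic variant would replace the random `rng.choice` sampling with a fixed, repeatable selection rule, which is appropriate when client availability is reliable.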