Publication details

Category Text publication
Reference type Journals
DOI 10.1109/TAI.2025.3620774
Titel (primär) Asynchronous Federated Learning with non-convex client objective functions and heterogeneous dataset
Autor Forootani, A.; Iervolino, R.
Source IEEE Transactions on Artificial Intelligence
Year of publication 2025
Department BIOENERGIE
Language English
Topic T5 Future Landscapes
Keywords Federated Learning; Stochastic Gradient Descent; Client Drifts; Asynchronous Federated Learning
Abstract Federated Learning (FL) is a distributed machine learning paradigm that enables model training across decentralized devices holding local data, thereby preserving data privacy and reducing the need for centralization. Despite its advantages, traditional FL faces challenges such as communication overhead, system heterogeneity, and straggler effects. Asynchronous Federated Learning has emerged as a promising solution, allowing clients to send updates independently, which mitigates synchronization issues and enhances scalability. This paper extends the Asynchronous Federated Learning framework to scenarios involving clients with non-convex objective functions and heterogeneous datasets, which are prevalent in modern machine learning models such as deep neural networks. We provide a rigorous convergence analysis for this setting, deriving bounds on the expected gradient norm and examining the impacts of staleness, variance, and heterogeneity. To address the challenges posed by asynchronous updates, we introduce a staleness-aware aggregation mechanism that penalizes outdated updates, ensuring that fresher updates have a greater influence on the global model. Additionally, we propose a dynamic learning rate schedule that adapts to client staleness and heterogeneity, improving stability and convergence.
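The staleness-aware aggregation and the staleness-adaptive learning rate described above can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the exponential penalty `staleness_weight`, the decay rates `alpha` and `beta`, and the plain-list model representation are all assumptions chosen for brevity.

```python
import math

def staleness_weight(tau, alpha=0.5):
    # Hypothetical exponential penalty: the weight of an update decays with
    # its staleness tau (rounds elapsed since the client's model snapshot).
    return math.exp(-alpha * tau)

def aggregate(global_model, client_updates):
    # client_updates: list of (update_vector, staleness) pairs.
    # Fresher updates (small tau) receive a larger share of the aggregate,
    # so outdated updates are penalized rather than discarded.
    weights = [staleness_weight(tau) for _, tau in client_updates]
    total = sum(weights)
    new_model = list(global_model)
    for (update, _), w in zip(client_updates, weights):
        for i, u in enumerate(update):
            new_model[i] += (w / total) * u
    return new_model

def adaptive_lr(base_lr, tau, beta=0.1):
    # Hypothetical staleness-adaptive step size: shrink the learning rate
    # for stale clients to keep the global iterates stable.
    return base_lr / (1.0 + beta * tau)
```

With equal staleness the scheme reduces to plain averaging of updates; as a client's staleness grows, both its aggregation weight and its effective step size shrink.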

Our approach effectively manages heterogeneous environments, accommodating differences in client computational capabilities, data distributions, and communication delays, making it suitable for real-world Federated Learning applications. We also analyze the effects of client selection methods, specifically choosing clients with or without replacement, on variance and model convergence, providing insights for more effective sampling strategies. The practical implementation of our methods using PyTorch and Python's asyncio library demonstrates their applicability in real-world asynchronous and heterogeneous FL scenarios. Empirical experiments validate the proposed methods, showing improved performance and scalability in handling asynchronous updates and non-convex client objective functions with associated heterogeneous datasets.
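The asyncio-based setup mentioned above can be illustrated with a toy server loop that applies each client update as soon as it arrives, rather than waiting at a synchronization barrier. Everything here is a simplified stand-in: the dummy update values, the per-client delays modeling heterogeneous compute, and the version counter used to measure staleness are assumptions, not the paper's implementation.

```python
import asyncio
import random

async def client_update(client_id, server_version, delay):
    # Simulate a client with heterogeneous compute/communication delay.
    await asyncio.sleep(delay)
    # Return a dummy update tagged with the model version it was based on.
    return client_id, server_version, random.random()

async def server(num_clients=4):
    version = 0
    # All clients start from the same snapshot (version 0).
    tasks = {asyncio.ensure_future(client_update(c, version, 0.01 * (c + 1)))
             for c in range(num_clients)}
    results = []
    # Apply updates as they complete: no barrier, so fast clients never
    # wait for stragglers.
    while tasks:
        done, tasks = await asyncio.wait(tasks,
                                         return_when=asyncio.FIRST_COMPLETED)
        for t in done:
            cid, based_on, _update = t.result()
            staleness = version - based_on  # rounds elapsed since snapshot
            version += 1                    # global model advances per update
            results.append((cid, staleness))
    return results

results = asyncio.run(server())
```

Each later-arriving update sees a larger staleness value, which is exactly the quantity the staleness-aware aggregation penalizes.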



Permanent UFZ link https://www.ufz.de/index.php?en=20939&ufzPublicationIdentifier=31461
Forootani, A., Iervolino, R. (2025):
Asynchronous Federated Learning with non-convex client objective functions and heterogeneous dataset
IEEE Transactions on Artificial Intelligence, DOI: 10.1109/TAI.2025.3620774