2022
Conference article  Open Access

Decentralized federated learning and network topologies: an empirical study on convergence

Kavalionak H., Carlini E., Dazzi P., Ferrucci L., Mordacchini M., Coppola M.

Keywords: Federated Learning, Distributed Systems, Peer-to-peer

Federated Learning is a well-known learning paradigm that allows the distributed training of machine learning models. Federated Learning keeps data on the source devices and communicates only the model coefficients to a centralized server. This paper studies the decentralized flavor of Federated Learning: a peer-to-peer network replaces the centralized server, and nodes exchange model coefficients directly. In particular, we look for empirical evidence of the effect of different network topologies and communication parameters on the convergence of distributed model training. Our observations suggest that small-world networks converge faster for small numbers of nodes, while xx are more suitable for larger setups.
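The peer-to-peer coefficient exchange described in the abstract can be approximated by gossip averaging over a small-world graph. Below is a minimal sketch, not the paper's exact protocol: the topology construction (ring lattice plus random shortcuts), the synchronous averaging rule, and all parameter values are illustrative assumptions.

```python
import random

def ring_lattice(n, k):
    """Ring where each node links to its k nearest neighbors on each side."""
    return {i: {(i + d) % n for d in range(-k, k + 1) if d != 0}
            for i in range(n)}

def add_shortcuts(graph, m, rng):
    """Add m random long-range edges, turning the ring into a small-world-like graph."""
    n = len(graph)
    for _ in range(m):
        a, b = rng.sample(range(n), 2)
        graph[a].add(b)
        graph[b].add(a)
    return graph

def gossip_round(graph, coeffs):
    """One synchronous round: each node averages its coefficient with its
    neighbors' coefficients -- a stand-in for exchanging model coefficients
    directly between peers."""
    return {i: (coeffs[i] + sum(coeffs[j] for j in nbrs)) / (1 + len(nbrs))
            for i, nbrs in graph.items()}

rng = random.Random(42)
n_nodes = 20
graph = add_shortcuts(ring_lattice(n_nodes, k=2), m=5, rng=rng)

# Each node starts with a different scalar "coefficient" (a proxy for a
# locally trained model); convergence = the values reaching consensus.
coeffs = {i: float(i) for i in range(n_nodes)}
initial_spread = max(coeffs.values()) - min(coeffs.values())

for _ in range(50):
    coeffs = gossip_round(graph, coeffs)

spread = max(coeffs.values()) - min(coeffs.values())
print(f"spread after 50 rounds: {spread:.4f} (initially {initial_spread})")
```

Tracking how fast the spread shrinks under different topologies (pure ring, small-world, fully connected) is one simple way to reproduce the kind of topology-versus-convergence comparison the paper investigates.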

Source: SEBD 2022 - 30th Italian Symposium on Advanced Database Systems, pp. 317–324, Tirrenia, Pisa, Italy, 19-22/06/2022

BibTeX entry
@inproceedings{oai:it.cnr:prodotti:471869,
	title = {Decentralized federated learning and network topologies: an empirical study on convergence},
	author = {Kavalionak H. and Carlini E. and Dazzi P. and Ferrucci L. and Mordacchini M. and Coppola M.},
	booktitle = {SEBD 2022 - 30th Italian Symposium on Advanced Database Systems, pp. 317–324, Tirrenia, Pisa, Italy, 19-22/06/2022},
	year = {2022}
}

Published version (Open Access) also available from ceur-ws.org.

Project: TEACHING - A computing toolkit for building efficient autonomous applications leveraging humanistic intelligence
