1 code implementation • 4 Jan 2023 • Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal
Federated learning (FL) is an effective solution to train machine learning models on the increasing amount of data generated by IoT devices and smartphones while keeping such data localized.
1 code implementation • 21 Nov 2022 • Yann Fraboni, Martin Van Waerebeke, Kevin Scaman, Richard Vidal, Laetitia Kameni, Marco Lorenzi
Machine Unlearning (MU) is an increasingly important topic in machine learning safety, aiming to remove the contribution of a given data point from a training procedure.
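As a rough illustration of what removing a data point's contribution means, the toy sketch below contrasts a model fit on the full dataset with the "exact unlearning" baseline of retraining from scratch without that point. This is only a reference point under a simple ridge-regression assumption, not the method proposed in the paper.

```python
# Toy reference for machine unlearning: the "exact" way to remove a point's
# contribution is to retrain without it; practical MU methods try to
# approximate this retrained model at lower cost. (Illustrative only.)
import numpy as np

def ridge_fit(X, y, lam=1e-2):
    """Closed-form ridge regression: (X^T X + lam I)^{-1} X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.normal(size=100)

w_full = ridge_fit(X, y)                          # trained on all points
w_unlearned = ridge_fit(np.delete(X, 0, axis=0),  # gold standard: retrain
                        np.delete(y, 0))          # without the forgotten point
print(np.linalg.norm(w_full - w_unlearned))       # influence of the removed point
```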
no code implementations • 21 Jun 2022 • Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi
We show that our general framework applies to existing optimization schemes including centralized learning, FedAvg, asynchronous FedAvg, and FedBuff.
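For context on the schemes mentioned above, here is a minimal, self-contained sketch of one synchronous FedAvg round: each client runs local SGD from the current global model, and the server averages the resulting models weighted by local data size. Function names such as `local_sgd` and `fedavg_round` are illustrative, not taken from the paper's code.

```python
# Minimal synchronous FedAvg round on a toy linear-regression task.
import numpy as np

def local_sgd(model, data, lr=0.1, epochs=1):
    """One client's local update on its own data."""
    X, y = data
    w = model.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

def fedavg_round(global_model, clients):
    """Average local models, weighting each client by its number of samples."""
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    local_models = [local_sgd(global_model, c) for c in clients]
    weights = sizes / sizes.sum()
    return sum(w * m for w, m in zip(weights, local_models))

# toy usage: two clients with heterogeneous data
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 3)), rng.normal(size=20)) for _ in range(2)]
model = np.zeros(3)
for _ in range(10):
    model = fedavg_round(model, clients)
print(model)
```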
2 code implementations • 17 Nov 2021 • Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal
Federated learning allows clients to collaboratively learn statistical models while keeping their data local.
4 code implementations • NeurIPS 2021 • Othmane Marfoq, Giovanni Neglia, Aurélien Bellet, Laetitia Kameni, Richard Vidal
The increasing size of data generated by smartphones and IoT devices motivated the development of Federated Learning (FL), a framework for on-device collaborative training of machine learning models.
1 code implementation • 26 Jul 2021 • Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi
In this work, we provide a general theoretical framework to quantify the impact of a client sampling scheme and of client heterogeneity on federated optimization.
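To make the notion of a client sampling scheme concrete, the snippet below shows one common unbiased construction: client i participates with probability p_i and its update is rescaled by 1/p_i, so the aggregate matches full participation in expectation. This is an illustration under that assumption, not the exact framework analyzed in the paper.

```python
# Unbiased client sampling for federated aggregation (illustrative sketch).
import numpy as np

def sampled_aggregate(updates, weights, probs, rng):
    """Estimate sum_i weights[i] * updates[i] from a random subset of clients.

    Client i is included with probability probs[i]; its term is rescaled by
    1/probs[i], so E[agg] = sum_i weights[i] * updates[i] (unbiased).
    """
    agg = np.zeros_like(updates[0])
    for u, w, p in zip(updates, weights, probs):
        if rng.random() < p:
            agg += (w / p) * u
    return agg

# toy check of unbiasedness over many rounds
rng = np.random.default_rng(0)
updates = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([3.0, 3.0])]
weights = np.array([0.5, 0.3, 0.2])
probs = np.array([0.9, 0.5, 0.2])
est = np.mean([sampled_aggregate(updates, weights, probs, rng)
               for _ in range(5000)], axis=0)
print(est, "vs", sum(w * u for w, u in zip(weights, updates)))
```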
1 code implementation • 12 May 2021 • Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi
This work addresses the problem of optimizing communications between the server and clients in federated learning (FL).