Search Results for author: Laetitia Kameni

Found 7 papers, 6 papers with code

Federated Learning for Data Streams

1 code implementation • 4 Jan 2023 • Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal

Federated learning (FL) is an effective solution to train machine learning models on the increasing amount of data generated by IoT devices and smartphones while keeping such data localized.

Federated Learning

SIFU: Sequential Informed Federated Unlearning for Efficient and Provable Client Unlearning in Federated Optimization

1 code implementation • 21 Nov 2022 • Yann Fraboni, Martin Van Waerebeke, Kevin Scaman, Richard Vidal, Laetitia Kameni, Marco Lorenzi

Machine Unlearning (MU) is an increasingly important topic in machine learning safety, aiming at removing the contribution of a given data point from a training procedure.

Machine Unlearning

A General Theory for Federated Optimization with Asynchronous and Heterogeneous Clients Updates

no code implementations • 21 Jun 2022 • Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi

We show that our general framework applies to existing optimization schemes including centralized learning, FedAvg, asynchronous FedAvg, and FedBuff.

Federated Learning
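For context on the optimization schemes named in the abstract above, here is a minimal, hypothetical sketch of a synchronous FedAvg round (not the authors' code or framework): each client runs local SGD on its own data, and the server averages the returned weights in proportion to client dataset sizes. Function names and the toy linear model are illustrative assumptions.

```python
# Hypothetical FedAvg-style sketch (illustrative only, not the paper's implementation).
import numpy as np

def local_update(weights, data, labels, lr=0.1, epochs=1):
    """One client's local training pass: plain SGD on a linear model with squared loss."""
    w = weights.copy()
    for _ in range(epochs):
        for x, y in zip(data, labels):
            grad = (w @ x - y) * x   # gradient of 0.5 * (w.x - y)^2
            w -= lr * grad
    return w

def fedavg_round(global_w, client_datasets):
    """One synchronous FedAvg round: broadcast, local training, size-weighted average."""
    updates, sizes = [], []
    for data, labels in client_datasets:
        updates.append(local_update(global_w, data, labels))
        sizes.append(len(data))
    sizes = np.array(sizes, dtype=float)
    return np.average(updates, axis=0, weights=sizes / sizes.sum())

# Toy usage: 3 clients, 2-dimensional linear model.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 2)), rng.normal(size=20)) for _ in range(3)]
w = np.zeros(2)
for _ in range(5):
    w = fedavg_round(w, clients)
```

Asynchronous variants such as asynchronous FedAvg or FedBuff differ in when and how these client updates are aggregated, which is the setting the paper's framework covers.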

Personalized Federated Learning through Local Memorization

2 code implementations • 17 Nov 2021 • Othmane Marfoq, Giovanni Neglia, Laetitia Kameni, Richard Vidal

Federated learning allows clients to collaboratively learn statistical models while keeping their data local.

Binary Classification • Fairness • +3

Federated Multi-Task Learning under a Mixture of Distributions

4 code implementations • NeurIPS 2021 • Othmane Marfoq, Giovanni Neglia, Aurélien Bellet, Laetitia Kameni, Richard Vidal

The increasing size of data generated by smartphones and IoT devices motivated the development of Federated Learning (FL), a framework for on-device collaborative training of machine learning models.

Fairness • Multi-Task Learning • +1

A General Theory for Client Sampling in Federated Learning

1 code implementation • 26 Jul 2021 • Yann Fraboni, Richard Vidal, Laetitia Kameni, Marco Lorenzi

In this work, we provide a general theoretical framework to quantify the impact of a client sampling scheme and of clients' heterogeneity on federated optimization.

Federated Learning
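To illustrate what "client sampling" means in this setting, below is a small, hypothetical sketch (not the paper's framework): the server samples m of the n clients uniformly without replacement and rescales their contributions so the aggregated update is unbiased with respect to full participation. All names and the rescaling choice are assumptions for illustration.

```python
# Hypothetical client-sampling sketch for a FedAvg-style round (illustrative only).
import numpy as np

def sampled_round(global_w, client_updates, importances, m, rng):
    """client_updates: list of callables mapping current weights to a client's new weights.
    importances: relative client data sizes p_i, summing to 1."""
    n = len(client_updates)
    chosen = rng.choice(n, size=m, replace=False)   # uniform sampling without replacement
    agg = np.zeros_like(global_w)
    for i in chosen:
        delta = client_updates[i](global_w) - global_w
        # Each client is included with probability m/n, so rescaling by n/m keeps
        # the expected aggregate equal to the full-participation weighted sum.
        agg += (n / m) * importances[i] * delta
    return global_w + agg
```

Different sampling schemes (uniform, importance-based, clustered) change the variance of this aggregate, which is the kind of effect the paper's framework quantifies.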
