1 code implementation • 16 Feb 2024 • Louis Grenioux, Maxence Noble, Marylou Gabrié, Alain Oliviero Durmus
Building upon score-based learning, stochastic localization techniques have recently attracted renewed interest.
1 code implementation • NeurIPS 2023 • Maxence Noble, Valentin De Bortoli, Arnaud Doucet, Alain Durmus
In this paper, we consider an entropic version of mOT with a tree-structured quadratic cost, i.e., a function that can be written as a sum of pairwise cost functions between the nodes of a tree.
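To make the cost structure concrete, a tree-structured quadratic cost as described above can be written as follows, where the tree and index notation are illustrative, not taken from the paper:

```latex
% Tree-structured quadratic cost: a sum of pairwise quadratic costs
% over the edges E of a tree T = (V, E) whose nodes index the marginals.
c(x_1, \dots, x_K) = \sum_{(i,j) \in E} \| x_i - x_j \|^2
```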
no code implementations • 13 Apr 2023 • Giacomo Greco, Maxence Noble, Giovanni Conforti, Alain Durmus
Our approach is novel in that it is purely probabilistic and relies on coupling by reflection techniques for controlled diffusions on the torus.
1 code implementation • NeurIPS 2023 • Maxence Noble, Valentin De Bortoli, Alain Durmus
In this paper, we propose Barrier Hamiltonian Monte Carlo (BHMC), a version of the HMC algorithm which aims at sampling from a Gibbs distribution $\pi$ on a manifold $\mathrm{M}$, endowed with a Hessian metric $\mathfrak{g}$ derived from a self-concordant barrier.
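As a minimal illustration of a Hessian metric derived from a self-concordant barrier (not the paper's implementation), the standard log-barrier on the positive orthant yields a diagonal metric; the function names below are hypothetical:

```python
import numpy as np

def log_barrier(x):
    """Log-barrier phi(x) = -sum_i log(x_i), a classic
    self-concordant barrier for the positive orthant."""
    return -np.sum(np.log(x))

def hessian_metric(x):
    """Hessian metric g(x) = Hess phi(x) = diag(1 / x_i^2),
    which blows up near the boundary and keeps samples inside."""
    return np.diag(1.0 / x**2)

x = np.array([0.5, 2.0])
G = hessian_metric(x)
# G = diag([4.0, 0.25]): directions toward the nearby boundary
# (small x_i) are heavily penalized by the metric.
```

In a barrier-based HMC scheme, this metric replaces the Euclidean one in the kinetic-energy term so that Hamiltonian trajectories stay in the interior of the domain.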
1 code implementation • 17 Nov 2021 • Maxence Noble, Aurélien Bellet, Aymeric Dieuleveut
Federated Learning (FL) is a paradigm for large-scale distributed learning which faces two key challenges: (i) efficient training from highly heterogeneous user data, and (ii) protecting the privacy of participating users.