Search Results for author: Iosif Lytras

Found 4 papers, 0 papers with code

Tamed Langevin sampling under weaker conditions

no code implementations • 27 May 2024 • Iosif Lytras, Panayotis Mertikopoulos

Motivated by applications to deep learning, which often fail standard Lipschitz smoothness requirements, we examine the problem of sampling from distributions that are not log-concave and are only weakly dissipative, with log-gradients allowed to grow superlinearly at infinity.
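For context on the taming idea these papers build on, here is a minimal sketch of a tamed unadjusted Langevin step, assuming a generic gradient-norm taming $g/(1+\gamma\|g\|)$; the function names, step size, and example potential are illustrative choices, not the exact scheme or conditions analysed in the paper.

```python
import numpy as np

def tamed_ula_step(x, grad_U, step, rng):
    """One tamed unadjusted Langevin step (illustrative sketch).

    The raw drift grad_U(x) may grow superlinearly, so it is rescaled
    ("tamed") to keep a single update bounded:
        drift = grad_U(x) / (1 + step * ||grad_U(x)||).
    """
    g = grad_U(x)
    tamed_drift = g / (1.0 + step * np.linalg.norm(g))
    noise = rng.standard_normal(x.shape)
    return x - step * tamed_drift + np.sqrt(2.0 * step) * noise

# Example: a heavy-tailed potential U(x) = ||x||^4 / 4, whose gradient
# ||x||^2 * x grows superlinearly, so vanilla ULA can diverge.
rng = np.random.default_rng(0)
grad_U = lambda x: (x @ x) * x
x = np.zeros(2)
for _ in range(10_000):
    x = tamed_ula_step(x, grad_U, step=1e-2, rng=rng)
```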

Taming under isoperimetry

no code implementations • 15 Nov 2023 • Iosif Lytras, Sotirios Sabanis

In this article we propose a novel tamed Langevin-based scheme, $\mathbf{sTULA}$, to sample from distributions with superlinearly growing log-gradients that also satisfy a log-Sobolev inequality.

Kinetic Langevin MCMC Sampling Without Gradient Lipschitz Continuity -- the Strongly Convex Case

no code implementations • 19 Jan 2023 • Tim Johnston, Iosif Lytras, Sotirios Sabanis

In this article we consider sampling from log-concave distributions in the Hamiltonian setting, without assuming that the gradient of the objective is globally Lipschitz.

Taming neural networks with TUSLA: Non-convex learning via adaptive stochastic gradient Langevin algorithms

no code implementations • 25 Jun 2020 • Attila Lovas, Iosif Lytras, Miklós Rásonyi, Sotirios Sabanis

We offer a new learning algorithm based on an appropriately constructed variant of the popular stochastic gradient Langevin dynamics (SGLD), which is called the tamed unadjusted stochastic Langevin algorithm (TUSLA).
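A minimal sketch of a tamed stochastic gradient Langevin update follows, to show how taming carries over from sampling to SGLD-style learning; the taming function here is a simple gradient-norm rescaling chosen for illustration, and the exact taming function, hyperparameters, and names differ from those in the TUSLA paper.

```python
import numpy as np

def tamed_sgld_step(theta, stoch_grad, lam, beta, rng):
    """One tamed stochastic gradient Langevin step (illustrative sketch).

    The minibatch gradient is divided by (1 + lam * ||grad||), a
    hypothetical taming choice, so a single step stays bounded even
    when the loss gradient grows superlinearly in theta.
    """
    g = stoch_grad(theta)                           # minibatch gradient estimate
    tamed = g / (1.0 + lam * np.linalg.norm(g))     # tamed drift
    noise = rng.standard_normal(theta.shape)
    return theta - lam * tamed + np.sqrt(2.0 * lam / beta) * noise
```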
