18 Aug 2023 • Cheik Traoré, Vassilis Apidopoulos, Saverio Salzo, Silvia Villa
Stochastic proximal point algorithms have been studied as an alternative to stochastic gradient algorithms because they are more stable with respect to the choice of stepsize, but a properly variance-reduced version has been missing.
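As a generic illustration of the stepsize robustness mentioned above (not the paper's variance-reduced method), here is a minimal sketch of a stochastic proximal point iteration on a least-squares problem, where each single-sample proximal step has a closed form; the problem setup and all parameter values are assumptions for the example.

```python
import numpy as np

# Hypothetical example: stochastic proximal point for least squares.
# For f_i(x) = 0.5 * (a_i @ x - b_i)^2, the proximal step
#   x+ = argmin_z f_i(z) + (1 / (2 * gamma)) * ||z - x||^2
# has the closed form used in the loop below.
rng = np.random.default_rng(0)
n, d = 200, 5
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true                      # consistent linear system

x = np.zeros(d)
gamma = 1.0                         # the prox step stays stable even for large gamma
for _ in range(20000):
    i = rng.integers(n)
    a = A[i]
    # closed-form prox of the sampled quadratic f_i
    x = x - gamma * (a @ x - b[i]) / (1.0 + gamma * (a @ a)) * a
```

Note the contrast with stochastic gradient descent, which can diverge for a fixed stepsize this large: the denominator `1 + gamma * ||a_i||^2` automatically damps the update.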
24 Nov 2020 • Cheik Traoré, Edouard Pauwels
We prove that the iterates produced by either the scalar-stepsize or the coordinatewise variant of the AdaGrad algorithm form convergent sequences when applied to convex objective functions with Lipschitz-continuous gradients.
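To make the coordinatewise variant concrete, here is a minimal sketch of coordinatewise AdaGrad run on a smooth convex quadratic; the objective, stepsize, and iteration count are assumptions for illustration, not taken from the paper.

```python
import numpy as np

# Hypothetical example: coordinatewise AdaGrad on f(x) = 0.5 * x @ H @ x,
# a convex objective with Lipschitz gradient. Each coordinate gets its own
# adaptive stepsize eta / sqrt(G_j), where G_j accumulates squared gradients.
H = np.diag([1.0, 10.0])            # ill-conditioned convex quadratic
x = np.array([5.0, 5.0])
eta, eps = 1.0, 1e-8
G = np.zeros_like(x)                # per-coordinate sum of squared gradients

for _ in range(5000):
    g = H @ x                       # gradient of f at x
    G += g * g
    x -= eta * g / np.sqrt(G + eps)
```

The adaptive per-coordinate scaling compensates for the different curvatures along the two axes, and the iterates themselves approach the minimizer `x* = 0`, in line with the convergence-of-iterates result stated above.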