Minimizing $f$-Divergences by Interpolating Velocity Fields

24 May 2023  ·  Song Liu, Jiahao Yu, Jack Simons, Mingxuan Yi, Mark Beaumont

Many machine learning problems can be formulated as approximating a target distribution with a particle distribution by minimizing a statistical discrepancy. Wasserstein Gradient Flow can be employed to move particles along a path that minimizes the $f$-divergence between the *target* and *particle* distributions. To perform such movements, we need to calculate the corresponding velocity fields, which involve a density ratio function between these two distributions. Previous works first estimated the density ratio function and then differentiated the estimated ratio; this approach may suffer from overfitting, leading to a less accurate estimate. Inspired by non-parametric curve fitting, we instead estimate these velocity fields directly using interpolation. We prove that our method is asymptotically consistent under mild conditions. We validate its effectiveness with novel applications to domain adaptation and missing data imputation.
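
As a rough illustration of the setting described above (not the paper's interpolation-based estimator), the sketch below moves particles along a plug-in estimate of the KL velocity field $v(x) = \nabla \log p(x) - \nabla \log q(x)$, where both log-density gradients are approximated with Gaussian kernel density estimates fitted to samples. The abstract notes that such estimate-then-differentiate approaches can overfit, which is precisely what motivates the paper's direct interpolation of the velocity field. All function names, bandwidths, and step sizes here are assumptions chosen for illustration.

```python
# Hypothetical sketch: explicit-Euler particle updates along a plug-in KL velocity field.
# This is NOT the paper's method; it only illustrates the kind of velocity field
# (depending on a density ratio between target and particle distributions) that
# Wasserstein gradient flows of f-divergences require.
import numpy as np

def gaussian_kde_log_density_grad(x, samples, bandwidth):
    """Gradient of the log of a Gaussian KDE fitted to `samples`, evaluated at `x`."""
    diff = x[:, None, :] - samples[None, :, :]            # (n_query, n_samples, dim)
    sq_dist = np.sum(diff ** 2, axis=-1)                   # (n_query, n_samples)
    weights = np.exp(-sq_dist / (2.0 * bandwidth ** 2))    # unnormalised kernel weights
    weights /= weights.sum(axis=1, keepdims=True) + 1e-12  # normalise per query point
    # grad log q_hat(x) = -(1/h^2) * sum_j w_j (x - x_j)
    return -(weights[..., None] * diff).sum(axis=1) / bandwidth ** 2

def wgf_step(particles, target_samples, step_size=0.05, bandwidth=0.5):
    """One Euler step along the plug-in KL velocity field grad log p - grad log q."""
    grad_log_p = gaussian_kde_log_density_grad(particles, target_samples, bandwidth)
    grad_log_q = gaussian_kde_log_density_grad(particles, particles, bandwidth)
    return particles + step_size * (grad_log_p - grad_log_q)

# Toy usage: transport 2-D Gaussian particles toward a shifted target distribution.
rng = np.random.default_rng(0)
particles = rng.normal(loc=0.0, scale=1.0, size=(200, 2))
target_samples = rng.normal(loc=3.0, scale=1.0, size=(200, 2))
for _ in range(100):
    particles = wgf_step(particles, target_samples)
```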
