1 code implementation • NeurIPS 2017 • Yunus Saatci, Andrew G. Wilson
Generative adversarial networks (GANs) can implicitly learn rich distributions over images, audio, and other data that are hard to model with an explicit likelihood.
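As a rough illustration of what "implicitly learn" means here, below is a minimal GAN sketch: a generator is trained purely through a discriminator's feedback, so no explicit likelihood is ever written down. This is not the paper's Bayesian GAN; the toy 1-D Gaussian target, network sizes, and training settings are illustrative assumptions.

```python
# Minimal GAN sketch (illustrative, not the Bayesian GAN of the paper).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical toy target distribution: 1-D Gaussian with mean 3, std 0.5.
def sample_real(n):
    return 3.0 + 0.5 * torch.randn(n, 1)

G = nn.Sequential(nn.Linear(4, 32), nn.ReLU(), nn.Linear(32, 1))   # generator
D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1))   # discriminator (logits)

opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(2000):
    real = sample_real(64)
    fake = G(torch.randn(64, 4))

    # Discriminator step: distinguish real samples from generated ones.
    d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: produce samples the discriminator labels as real.
    g_loss = bce(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(1000, 4)).mean().item())  # should drift toward ~3.0
```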
no code implementations • NeurIPS 2017 • Phillip A. Jang, Andrew Loeb, Matthew Davidow, Andrew G. Wilson
We propose a distribution over kernels formed by modelling a spectral mixture density with a Lévy process.
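For context, a spectral mixture kernel models the kernel's spectral density as a Gaussian mixture; the Lévy process in this work acts as a prior over those mixture components. Below is a sketch of evaluating a fixed 1-D spectral mixture kernel; the particular weights, spectral means, and variances are made up for illustration.

```python
import numpy as np

def spectral_mixture_kernel(tau, weights, means, variances):
    """1-D spectral mixture kernel:
    k(tau) = sum_q w_q * exp(-2 pi^2 tau^2 v_q) * cos(2 pi tau mu_q).

    Each (w_q, mu_q, v_q) triple is one Gaussian component of the spectral
    density; in the paper the set of components is given a Levy process prior
    rather than being fixed as it is here.
    """
    tau = np.asarray(tau)[..., None]   # broadcast over mixture components
    return np.sum(weights * np.exp(-2 * np.pi**2 * tau**2 * variances)
                  * np.cos(2 * np.pi * tau * means), axis=-1)

# Illustrative components: one slow and one fast oscillation.
w  = np.array([1.0, 0.5])
mu = np.array([0.1, 1.0])    # spectral means (frequencies)
v  = np.array([0.01, 0.05])  # spectral variances

taus = np.linspace(0.0, 5.0, 6)
print(spectral_mixture_kernel(taus, w, mu, v))
```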
no code implementations • 29 Jun 2015 • Junier Oliva, Avinava Dubey, Andrew G. Wilson, Barnabas Poczos, Jeff Schneider, Eric P. Xing
In this paper we introduce Bayesian nonparametric kernel-learning (BaNK), a generic, data-driven framework for scalable learning of kernels.
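BaNK works in the random feature representation of shift-invariant kernels and places a nonparametric prior over the random feature frequencies, so the kernel itself is inferred from data. The sketch below shows only the fixed-frequency random Fourier feature baseline it generalizes; the lengthscale, feature count, and data sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rff_features(X, W, b):
    """Random Fourier features: z(x) = sqrt(2/D) cos(W x + b), so z(x)^T z(y) ~ k(x, y)."""
    D = W.shape[0]
    return np.sqrt(2.0 / D) * np.cos(X @ W.T + b)

# Frequencies drawn from a fixed Gaussian correspond to an RBF kernel with
# lengthscale `ell`; BaNK instead infers the frequency distribution, which is
# what makes the kernel learnable.
d, D, ell = 2, 500, 1.0                     # illustrative sizes
W = rng.normal(scale=1.0 / ell, size=(D, d))
b = rng.uniform(0.0, 2 * np.pi, size=D)

X = rng.normal(size=(5, d))
Z = rff_features(X, W, b)
approx = Z @ Z.T                            # approximate kernel matrix
exact = np.exp(-0.5 * ((X[:, None, :] - X[None, :, :])**2).sum(-1) / ell**2)
print(np.abs(approx - exact).max())         # small for large D
```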
no code implementations • NeurIPS 2014 • Andrew G. Wilson, Elad Gilboa, Arye Nehorai, John P. Cunningham
This difficulty is compounded by the fact that Gaussian processes are typically only tractable for small datasets, and scaling an expressive kernel learning approach poses different challenges than scaling a standard Gaussian process model.
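To make the scaling issue concrete, here is a short sketch of exact GP regression via a Cholesky factorization of the n-by-n kernel matrix; its O(n^3) time and O(n^2) memory cost is why naive Gaussian processes are limited to small datasets. The RBF kernel and noise level are illustrative assumptions, not the kernel-learning approach of the paper.

```python
import numpy as np

def rbf(X1, X2, ell=1.0, sf=1.0):
    """Squared-exponential kernel; any positive-definite kernel could be used here."""
    d2 = ((X1[:, None, :] - X2[None, :, :])**2).sum(-1)
    return sf**2 * np.exp(-0.5 * d2 / ell**2)

def gp_predict(X, y, Xs, noise=0.1):
    """Exact GP regression mean/variance (Rasmussen & Williams, Alg. 2.1).

    The Cholesky of the n x n matrix K is the O(n^3) bottleneck that makes
    exact GPs intractable beyond a few thousand points.
    """
    K = rbf(X, X) + noise**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = rbf(X, Xs)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(rbf(Xs, Xs)) - (v**2).sum(0)
    return mean, var

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
Xs = np.linspace(-3, 3, 5)[:, None]
print(gp_predict(X, y, Xs))
```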
no code implementations • NeurIPS 2010 • Andrew G. Wilson, Zoubin Ghahramani
We define a copula process which describes the dependencies between arbitrarily many random variables independently of their marginal distributions.
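A small illustration of the copula idea behind this paper: mapping each variable through its empirical CDF and then the standard normal quantile function separates the dependence structure from the marginals, so dependence can be modeled in a Gaussian space however skewed or heavy-tailed the marginals are. The log-normal and signed-square marginals below are made up for illustration; a copula process extends this to arbitrarily many variables.

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(0)
n = 2000

# Made-up example: two dependent variables with very non-Gaussian marginals.
latent = rng.multivariate_normal([0, 0], [[1.0, 0.8], [0.8, 1.0]], size=n)
x = np.exp(latent[:, 0])                      # log-normal marginal
y = latent[:, 1]**2 * np.sign(latent[:, 1])   # skewed marginal

def to_gaussian_margin(v):
    """Empirical CDF followed by the standard normal quantile function."""
    u = rankdata(v) / (len(v) + 1)            # empirical CDF values in (0, 1)
    return norm.ppf(u)

zx, zy = to_gaussian_margin(x), to_gaussian_margin(y)

# The dependence (the copula) is recovered in the Gaussian space even though
# the raw marginals are far from Gaussian.
print(np.corrcoef(x, y)[0, 1], np.corrcoef(zx, zy)[0, 1])
```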