no code implementations • 25 Jun 2022 • Syrine Belakaria, Janardhan Rao Doppa, Nicolo Fusi, Rishit Sheth
The growing size of deep neural networks (DNNs) and datasets motivates the need for efficient solutions to simultaneous model selection and training.
1 code implementation • 7 Apr 2020 • Yadi Wei, Rishit Sheth, Roni Khardon
The application of DLM in non-conjugate cases is more complex because the logarithm of expectation in the log-loss DLM objective is often intractable and simple sampling leads to biased estimates of gradients.
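The bias mentioned here is a direct consequence of Jensen's inequality: averaging per-sample log values estimates E[log f], which is strictly below the target log E[f] whenever f varies. A toy numpy check (generic illustration, not the paper's DLM objective or estimator):

```python
import numpy as np

rng = np.random.default_rng(2)

# Target: log E[f(z)] under z ~ N(0, 1), with a positive integrand f.
z = rng.normal(size=10_000)
f = np.exp(-z**2)

log_of_mean = np.log(f.mean())   # Monte Carlo estimate of log E[f]
mean_of_log = np.log(f).mean()   # naive plug-in: estimates E[log f] instead

# Jensen's inequality: E[log f] <= log E[f], so the naive estimate
# is biased downward whenever f is non-constant.
print(log_of_mean, mean_of_log)
```

For this integrand the gap is large (E[log f] = E[-z^2] = -1, while log E[f] ≈ -0.55), which is why simple sampling inside the logarithm cannot be used as-is.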
no code implementations • 20 Mar 2020 • Diana Cai, Rishit Sheth, Lester Mackey, Nicolo Fusi
Meta-learning leverages related source tasks to learn an initialization that can be quickly fine-tuned to a target task with limited labeled examples.
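The learn-an-initialization idea can be sketched with a minimal MAML-style loop on scalar toy tasks (a generic illustration of meta-learning an initialization, not this paper's method): each task is fitting a target c with loss (theta - c)^2, and the outer loop differentiates through one inner adaptation step.

```python
import numpy as np

rng = np.random.default_rng(1)

# Each "task" is fitting a scalar target c with loss (theta - c)^2.
tasks = rng.uniform(-5, 5, size=20)

theta = 0.0              # meta-learned initialization
alpha, beta = 0.1, 0.05  # inner / outer step sizes

for _ in range(200):
    meta_grad = 0.0
    for c in tasks:
        # One inner gradient step from the shared initialization.
        adapted = theta - alpha * 2 * (theta - c)
        # Gradient of the post-adaptation loss w.r.t. the initialization,
        # chaining through the inner step: d(adapted)/d(theta) = 1 - 2*alpha.
        meta_grad += 2 * (adapted - c) * (1 - 2 * alpha)
    theta -= beta * meta_grad / len(tasks)

print(theta)  # settles near the mean of the task targets
```

The meta-objective here is minimized at the task-target mean, so the learned initialization is the point from which one gradient step reaches any task fastest on average.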
no code implementations • AABI Symposium 2019 • Rishit Sheth, Roni Khardon
Our criterion can be used to derive new sparse Gaussian process algorithms that have error guarantees applicable to various likelihoods.
no code implementations • 27 Aug 2019 • Rishit Sheth, Nicolo Fusi
In this paper we introduce Feature Gradients, a gradient-based search algorithm for feature selection.
no code implementations • NeurIPS 2017 • Rishit Sheth, Roni Khardon
The paper furthers such analysis by providing bounds on the excess risk of variational inference algorithms and related regularized loss minimization algorithms for a large class of latent variable models with Gaussian latent variables.
1 code implementation • NeurIPS 2018 • Nicolo Fusi, Rishit Sheth, Huseyn Melih Elibol
Automating the selection and tuning of machine learning pipelines, which consist of data pre-processing methods and machine learning models, has long been one of the goals of the machine learning community.
no code implementations • 12 Dec 2016 • Rishit Sheth, Roni Khardon
The stochastic variational inference (SVI) paradigm, which combines variational inference, natural gradients, and stochastic updates, was recently proposed for large-scale data analysis in conjugate Bayesian models and demonstrated to be effective in several problems.
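In conjugate exponential-family models, the SVI update has a simple closed form: sample a data point, form an intermediate global natural parameter as if that point were replicated over the whole dataset, and blend it in with a Robbins-Monro step size. A toy sketch for inferring a Gaussian mean (standard SVI recipe on a hypothetical example, not code from this paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model: x_i ~ N(mu, 1) with prior mu ~ N(0, 1).
N = 1000
data = rng.normal(2.0, 1.0, size=N)

# Natural parameters (eta1, eta2) of q(mu) = N(m, s):
# eta1 = m / s, eta2 = -1 / (2 s).  Prior N(0, 1): (0, -0.5).
prior = np.array([0.0, -0.5])
lam = prior.copy()

for t in range(1, 5001):
    i = rng.integers(N)                              # one sampled data point
    lam_hat = prior + N * np.array([data[i], -0.5])  # intermediate estimate
    rho = (t + 10.0) ** -0.7                         # decaying step size
    lam = (1 - rho) * lam + rho * lam_hat            # natural-gradient step

m = lam[0] / (-2 * lam[1])   # recover the mean of q(mu)
exact = data.sum() / (1 + N)  # exact conjugate posterior mean
print(m, exact)
```

Because the model is conjugate, the stochastic updates converge to the exact posterior natural parameters in expectation; the non-conjugate setting studied in the papers above is precisely where this simple recipe stops applying directly.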