1 code implementation • 4 Apr 2023 • Qi Chen, Mario Marchand
We further provide algorithm-dependent generalization bounds for these two settings, where the generalization is characterized by the mutual information between the parameters and the data.
no code implementations • 18 Oct 2022 • Jean-Samuel Leboeuf, Frédéric LeBlanc, Mario Marchand
Furthermore, we show that the VC dimension of a binary tree structure with $L_T$ leaves on examples of $\ell$ real-valued features is in $O(L_T \log(L_T\ell))$.
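As a rough numerical illustration of how the stated $O(L_T \log(L_T\ell))$ bound scales (the hidden constant is unknown, so only the order of growth is shown; the function name is ours):

```python
import math

def vc_bound_order(n_leaves, n_features):
    """Order of growth of the stated VC-dimension bound for a binary
    tree structure: O(L_T log(L_T * ell)). The constant factor hidden
    by the O(.) notation is not given here."""
    return n_leaves * math.log(n_leaves * n_features)

# Doubling the number of leaves grows the bound only slightly
# faster than linearly, because of the logarithmic factor.
ratio = vc_bound_order(16, 10) / vc_bound_order(8, 10)
```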
1 code implementation • 29 Oct 2021 • Jean-Samuel Leboeuf, Frédéric LeBlanc, Mario Marchand
We significantly improve the generalization bounds for VC classes by using two main ideas.
1 code implementation • 26 Oct 2021 • Gabriel Laberge, Yann Pequignot, Alexandre Mathieu, Foutse Khomh, Mario Marchand
In this work, instead of aiming to reduce the under-specification of model explanations, we fully embrace it and extract logical statements about feature attributions that are consistent across all models with good empirical performance (i.e., all models in the Rashomon Set).
1 code implementation • NeurIPS 2021 • Qi Chen, Changjian Shui, Mario Marchand
We derive a novel information-theoretic analysis of the generalization property of meta-learning algorithms.
1 code implementation • NeurIPS 2020 • Jean-Samuel Leboeuf, Frédéric LeBlanc, Mario Marchand
We introduce the notion of partitioning function, and we relate it to the growth function and to the VC dimension.
no code implementations • 25 Sep 2019 • Prudencio Tossou, Basile Dura, Daniel Cohen, Mario Marchand, François Laviolette, Alexandre Lacoste
Due to the significant costs of data generation, many prediction tasks within drug discovery are by nature few-shot regression (FSR) problems, including accurate modelling of biological assays.
no code implementations • 28 May 2019 • Prudencio Tossou, Basile Dura, Francois Laviolette, Mario Marchand, Alexandre Lacoste
Deep kernel learning provides an elegant and principled framework for combining the structural properties of deep learning algorithms with the flexibility of kernel methods.
1 code implementation • 3 Dec 2016 • Alexandre Drouin, Frédéric Raymond, Gaël Letarte St-Pierre, Mario Marchand, Jacques Corbeil, François Laviolette
Antimicrobial resistance is an important public health concern that has implications in the practice of medicine worldwide.
no code implementations • 8 Jun 2015 • Louis Fortier-Dubois, François Laviolette, Mario Marchand, Louis-Emile Robitaille, Jean-Francis Roy
We first present a general risk bound for ensembles that depends on the $L_p$ norm of the weighted combination of voters, which can be selected from a continuous set.
35 code implementations • 28 May 2015 • Yaroslav Ganin, Evgeniya Ustinova, Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, Mario Marchand, Victor Lempitsky
Our approach is directly inspired by the theory on domain adaptation suggesting that, for effective domain transfer to be achieved, predictions must be made based on features that cannot discriminate between the training (source) and test (target) domains.
Ranked #2 on Domain Adaptation on Synth Digits-to-SVHN
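The core trick behind this domain-adversarial approach is a gradient reversal layer: the domain classifier's gradient is sign-flipped before reaching the feature extractor, so the features are pushed toward being non-discriminative between source and target domains. A minimal numeric sketch of that idea (toy gradients and names are ours, not from the paper's code):

```python
import numpy as np

def gradient_reversal(grad, lam=1.0):
    """Acts as the identity in the forward pass; in the backward pass
    it negates (and scales by lam) the gradient flowing from the
    domain classifier to the feature extractor."""
    return -lam * grad

# Hypothetical gradients w.r.t. the shared features, as produced by
# backprop through the label-prediction head and the domain head:
task_grad_wrt_features = np.array([0.2, -0.5])
domain_grad_wrt_features = np.array([0.3, 0.1])

# The feature extractor descends the task loss while *ascending* the
# domain loss, which encourages domain-invariant features.
feature_grad = task_grad_wrt_features + gradient_reversal(domain_grad_wrt_features)
```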
no code implementations • 22 May 2015 • Alexandre Drouin, Sébastien Giguère, Maxime Déraspe, François Laviolette, Mario Marchand, Jacques Corbeil
The Set Covering Machine (SCM) is a greedy learning algorithm that produces sparse classifiers.
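The greedy flavor of the SCM can be sketched as a set-cover step: repeatedly pick the boolean feature that rejects the most not-yet-covered negative examples. This is an illustrative simplification (the actual SCM uses data-dependent rays/balls and a usefulness score with a penalty on misclassified positives, neither of which is modeled here):

```python
import numpy as np

def greedy_conjunction(features_on_neg, max_rules=2):
    """Greedily choose boolean features whose conjunction rejects
    (covers) as many negative examples as possible.

    features_on_neg: (n_features, n_neg) boolean array; entry [i, j]
    is True when feature i outputs 0 on negative example j, i.e. it
    covers that example."""
    uncovered = np.ones(features_on_neg.shape[1], dtype=bool)
    chosen = []
    for _ in range(max_rules):
        # Number of still-uncovered negatives each feature would cover.
        gains = (features_on_neg & uncovered).sum(axis=1)
        best = int(np.argmax(gains))
        if gains[best] == 0:
            break  # no feature covers anything new
        chosen.append(best)
        uncovered &= ~features_on_neg[best]
    return chosen

# Example: 3 candidate features, 4 negative examples.
feats = np.array([[1, 1, 0, 0],
                  [0, 0, 1, 1],
                  [1, 0, 0, 0]], dtype=bool)
chosen = greedy_conjunction(feats)
```

Stopping after a few rules is what makes the resulting classifier sparse.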
no code implementations • 28 Mar 2015 • Pascal Germain, Alexandre Lacasse, François Laviolette, Mario Marchand, Jean-Francis Roy
We propose an extensive analysis of the behavior of majority votes in binary classification.
1 code implementation • 15 Dec 2014 • Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, Mario Marchand
We propose a training objective that implements this idea in the context of a neural network, whose hidden layer is trained to be predictive of the classification task, but uninformative as to the domain of the input.
no code implementations • 3 Dec 2014 • Sébastien Giguère, Amélie Rolland, François Laviolette, Mario Marchand
This work uses a recent result on combinatorial optimization of linear predictors based on string kernels to develop a low-complexity upper bound on the pre-image that is valid for many string kernels.
no code implementations • 2 Dec 2014 • Alexandre Drouin, Sébastien Giguère, Vladana Sagatovich, Maxime Déraspe, François Laviolette, Mario Marchand, Jacques Corbeil
The increased affordability of whole genome sequencing has motivated its use for phenotypic studies.
no code implementations • NeurIPS 2014 • Mario Marchand, Hongyu Su, Emilie Morvant, Juho Rousu, John S. Shawe-Taylor
We show that the usual score function for conditional Markov networks can be written as the expectation over the scores of their spanning trees.
no code implementations • 4 Feb 2014 • Alexandre Lacoste, Hugo Larochelle, François Laviolette, Mario Marchand
One of the most tedious tasks in the application of machine learning is model selection, i.e., hyperparameter selection.
no code implementations • 30 Jun 2011 • Zakria Hussain, John Shawe-Taylor, Mario Marchand
In this paper, we correct an upper bound, presented in [hs-11], on the generalisation error of classifiers learned through multiple kernel learning.
no code implementations • NeurIPS 2009 • Pascal Germain, Alexandre Lacasse, Mario Marchand, Sara Shanian, François Laviolette
We show that standard $\ell_p$-regularized objective functions currently used, such as ridge regression and $\ell_p$-regularized boosting, are obtained from a relaxation of the KL divergence between the quasi-uniform posterior and the uniform prior.