1 code implementation • 7 Dec 2021 • Marouane Yassine, David Beauchemin, François Laviolette, Luc Lamontagne
While these models yield notable results, previous work on neural networks has only focused on parsing addresses from a single source country.
no code implementations • 28 Oct 2021 • Louis Fortier-Dubois, Gaël Letarte, Benjamin Leblanc, François Laviolette, Pascal Germain
Considering a probability distribution over parameters is known to be an efficient strategy for learning a neural network with non-differentiable activation functions.
1 code implementation • 26 Jul 2021 • Florian Tambon, Gabriel Laberge, Le An, Amin Nikanjam, Paulina Stevia Nouwou Mindom, Yann Pequignot, Foutse Khomh, Giulio Antoniol, Ettore Merlo, François Laviolette
Method: We conduct a Systematic Literature Review (SLR) of research papers published between 2015 and 2020, covering topics related to the certification of ML systems.
no code implementations • 24 Oct 2020 • Yann Pequignot, Mathieu Alain, Patrick Dallaire, Alireza Yeganehparast, Pascal Germain, Josée Desharnais, François Laviolette
Focusing on regression tasks, we choose a simple yet insightful model for this OOD distribution and conduct an empirical evaluation of the ability of various methods to discriminate OOD samples from the data.
3 code implementations • 29 Jun 2020 • Marouane Yassine, David Beauchemin, François Laviolette, Luc Lamontagne
We propose an approach in which we employ subword embeddings and a Recurrent Neural Network architecture to build a single model capable of learning to parse addresses from multiple countries at the same time while taking into account the difference in languages and address formatting systems.
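The idea above can be sketched minimally: hashed character-trigram (subword) features feed a simple recurrent layer that emits one tag per token. This is an illustrative toy with randomly initialized weights and an assumed tag set, not the authors' trained model:

```python
import math
import random
import zlib

random.seed(0)
# Illustrative shared tag set (an assumption, not the paper's exact scheme).
TAGS = ["StreetNumber", "StreetName", "Municipality", "Province", "PostalCode"]
DIM, HID = 16, 8

def _mat(rows, cols):
    return [[random.gauss(0.0, 0.1) for _ in range(cols)] for _ in range(rows)]

def _matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

W_xh, W_hh, W_hy = _mat(HID, DIM), _mat(HID, HID), _mat(len(TAGS), HID)

def subword_embedding(token):
    # Bag of hashed character trigrams: unseen words and spellings from
    # other languages and address formats still get a representation.
    vec = [0.0] * DIM
    padded = f"<{token}>"
    for i in range(len(padded) - 2):
        vec[zlib.crc32(padded[i:i + 3].encode()) % DIM] += 1.0
    return vec

def parse(tokens):
    # Elman-style recurrence over the token sequence; one tag per token.
    h = [0.0] * HID
    tagged = []
    for tok in tokens:
        pre = [a + b for a, b in zip(_matvec(W_xh, subword_embedding(tok)),
                                     _matvec(W_hh, h))]
        h = [math.tanh(p) for p in pre]
        scores = _matvec(W_hy, h)
        tagged.append((tok, TAGS[scores.index(max(scores))]))
    return tagged
```

With trained weights, `parse("350 rue des Lilas Ouest Quebec".split())` would assign one address-component tag per token; the subword hashing is what lets a single model cover multiple countries' vocabularies.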
no code implementations • 21 Dec 2019 • Ulysse Côté-Allard, Gabriel Gagnon-Turcotte, Angkoon Phinyomark, Kyrre Glette, Erik Scheme, François Laviolette, Benoit Gosselin
Surface electromyography (sEMG) provides an intuitive and non-invasive interface from which to control machines.
1 code implementation • 16 Dec 2019 • Ulysse Côté-Allard, Gabriel Gagnon-Turcotte, Angkoon Phinyomark, Kyrre Glette, Erik Scheme, François Laviolette, Benoit Gosselin
The ability of the dynamic dataset to serve as a benchmark is leveraged to evaluate the impact of different recalibration techniques for long-term (across-day) gesture recognition, including a novel algorithm, named TADANN.
1 code implementation • 30 Nov 2019 • Ulysse Côté-Allard, Evan Campbell, Angkoon Phinyomark, François Laviolette, Benoit Gosselin, Erik Scheme
Using ADANN-generated features, the main contribution of this work is to provide the first topological data analysis of EMG-based gesture recognition for the characterisation of the information encoded within a deep network, using handcrafted features as landmarks.
no code implementations • 25 Sep 2019 • Prudencio Tossou, Basile Dura, Daniel Cohen, Mario Marchand, François Laviolette, Alexandre Lacoste
Due to the significant costs of data generation, many prediction tasks within drug discovery are by nature few-shot regression (FSR) problems, including accurate modelling of biological assays.
1 code implementation • NeurIPS 2019 • Gaël Letarte, Pascal Germain, Benjamin Guedj, François Laviolette
We present a comprehensive study of multilayer neural networks with binary activation, relying on the PAC-Bayesian theory.
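A forward pass through such a network is easy to sketch; the difficulty the PAC-Bayesian treatment addresses is that the sign activation has zero gradient almost everywhere, so a distribution over weights is used instead of a point estimate. A minimal sketch of the forward pass only (an assumption for illustration, not the paper's code):

```python
def sign(v):
    # Binary activation: +1 or -1, non-differentiable at 0.
    return 1.0 if v >= 0.0 else -1.0

def binary_net_forward(x, W1, w2):
    # Hidden layer: each unit outputs the sign of a linear combination.
    hidden = [sign(sum(w * xi for w, xi in zip(row, x))) for row in W1]
    # Output: sign of a weighted vote over the binary hidden units.
    return sign(sum(w * h for w, h in zip(w2, hidden)))
```

Because each unit emits only ±1, the network is a (deep) majority vote, which is what makes PAC-Bayesian majority-vote bounds applicable.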
4 code implementations • 10 Jan 2018 • Ulysse Côté-Allard, Cheikh Latyr Fall, Alexandre Drouin, Alexandre Campeau-Lecours, Clément Gosselin, Kyrre Glette, François Laviolette, Benoit Gosselin
Consequently, this paper proposes applying transfer learning on aggregated data from multiple users, while leveraging the capacity of deep learning algorithms to learn discriminant features from large datasets.
1 code implementation • NeurIPS 2017 • Alexandre Drouin, Toby Dylan Hocking, François Laviolette
Learning a regression function using censored or interval-valued output data is an important problem in fields such as genomics and medicine.
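A margin-based loss for interval-valued targets can be sketched as follows. This is a hedged illustration in the spirit of max-margin interval regression (the exact loss and margin handling are assumptions, not the paper's formulation): a prediction is penalized only when it leaves the target interval by more than a margin, and censored targets simply drop one side.

```python
def interval_hinge_loss(y, lo, hi, margin=1.0):
    # Squared hinge on each side of the target interval [lo, hi].
    # Left-censored (lo = -inf) or right-censored (hi = +inf) outputs,
    # as arise in survival-type data, zero out the corresponding term.
    below = max(0.0, lo - y + margin) if lo != float("-inf") else 0.0
    above = max(0.0, y - hi + margin) if hi != float("inf") else 0.0
    return below ** 2 + above ** 2
```

For example, a prediction of 5 for the interval [3, 7] incurs no loss, while a prediction of 2 is penalized for violating the lower limit plus margin.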
no code implementations • 17 Jul 2017 • Pascal Germain, Amaury Habrard, François Laviolette, Emilie Morvant
Firstly, we propose an improvement of our previous approach (Germain et al., 2013), which relies on a novel distribution pseudodistance based on a disagreement averaging, allowing us to derive a new, tighter domain adaptation bound for the target risk.
1 code implementation • 3 Dec 2016 • Alexandre Drouin, Frédéric Raymond, Gaël Letarte St-Pierre, Mario Marchand, Jacques Corbeil, François Laviolette
Antimicrobial resistance is an important public health concern that has implications in the practice of medicine worldwide.
1 code implementation • 15 Jun 2015 • Pascal Germain, Amaury Habrard, François Laviolette, Emilie Morvant
We study the issue of PAC-Bayesian domain adaptation: We want to learn, from a source domain, a majority vote model dedicated to a target one.
no code implementations • 8 Jun 2015 • Louis Fortier-Dubois, François Laviolette, Mario Marchand, Louis-Emile Robitaille, Jean-Francis Roy
We first present a general risk bound for ensembles that depends on the Lp norm of the weighted combination of voters which can be selected from a continuous set.
35 code implementations • 28 May 2015 • Yaroslav Ganin, Evgeniya Ustinova, Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, Mario Marchand, Victor Lempitsky
Our approach is directly inspired by the theory on domain adaptation suggesting that, for effective domain transfer to be achieved, predictions must be made based on features that cannot discriminate between the training (source) and test (target) domains.
Ranked #2 on Domain Adaptation on Synth Digits-to-SVHN
no code implementations • 22 May 2015 • Alexandre Drouin, Sébastien Giguère, Maxime Déraspe, François Laviolette, Mario Marchand, Jacques Corbeil
The Set Covering Machine (SCM) is a greedy learning algorithm that produces sparse classifiers.
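The greedy step behind the SCM can be sketched as below. This is a simplified illustration (assumed, not the reference implementation): it builds a conjunction by repeatedly picking the boolean rule that rejects the most remaining negative examples, omitting the penalty term the full algorithm uses to trade off positives lost.

```python
def greedy_scm(rules, X, y, max_rules=3):
    # rules: boolean functions of one example; the learned classifier is
    # their conjunction (predict positive only when every rule fires).
    chosen = []
    negatives = [i for i, label in enumerate(y) if label == 0]
    while negatives and len(chosen) < max_rules:
        # Greedy utility: number of remaining negatives a rule rejects.
        best = max(rules, key=lambda r: sum(1 for i in negatives if not r(X[i])))
        chosen.append(best)
        # Keep only the negatives the chosen rule fails to reject.
        negatives = [i for i in negatives if best(X[i])]
    return chosen

def scm_predict(chosen, x):
    return all(r(x) for r in chosen)
```

Sparsity comes from stopping as soon as the negatives are covered, so the final conjunction typically uses very few rules.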
no code implementations • 28 Mar 2015 • Pascal Germain, Alexandre Lacasse, François Laviolette, Mario Marchand, Jean-Francis Roy
We propose an extensive analysis of the behavior of majority votes in binary classification.
no code implementations • 24 Mar 2015 • Pascal Germain, Amaury Habrard, François Laviolette, Emilie Morvant
In this paper, we provide two main contributions in PAC-Bayesian theory for domain adaptation where the objective is to learn, from a source distribution, a well-performing majority vote on a different target distribution.
1 code implementation • 15 Dec 2014 • Hana Ajakan, Pascal Germain, Hugo Larochelle, François Laviolette, Mario Marchand
We propose a training objective that implements this idea in the context of a neural network, whose hidden layer is trained to be predictive of the classification task, but uninformative as to the domain of the input.
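The mechanism behind this objective is often implemented as gradient reversal: the forward pass through the domain branch is the identity, but the gradient flowing back from the domain classifier is negated, so the hidden layer descends the task loss while ascending the domain loss. A toy scalar sketch of the update (an illustration of the idea, not the paper's code):

```python
def reversed_domain_grad(domain_grad, lam=1.0):
    # Gradient reversal: negate (and scale by lam) the gradient coming
    # from the domain classifier before it reaches the feature weights,
    # pushing the features toward domain invariance.
    return -lam * domain_grad

def feature_update(w, task_grad, domain_grad, lr=0.1, lam=1.0):
    # Feature weights descend the task loss and ascend the domain loss.
    return w - lr * (task_grad + reversed_domain_grad(domain_grad, lam))
```

With `task_grad=0.5` and `domain_grad=0.2`, the effective gradient on the feature weight is `0.5 - 0.2 = 0.3`: the domain classifier's signal partially opposes, rather than reinforces, the task update.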
no code implementations • 3 Dec 2014 • Sébastien Giguère, Amélie Rolland, François Laviolette, Mario Marchand
This work uses a recent result on combinatorial optimization of linear predictors based on string kernels to develop a low-complexity upper bound on the pre-image, valid for many string kernels.
no code implementations • 2 Dec 2014 • Alexandre Drouin, Sébastien Giguère, Vladana Sagatovich, Maxime Déraspe, François Laviolette, Mario Marchand, Jacques Corbeil
The increased affordability of whole genome sequencing has motivated its use for phenotypic studies.
no code implementations • 6 Aug 2014 • François Laviolette, Emilie Morvant, Liva Ralaivola, Jean-Francis Roy
This paper generalizes an important result from the PAC-Bayesian literature for binary classification to the case of ensemble methods for structured outputs.
no code implementations • 4 Feb 2014 • Alexandre Lacoste, Hugo Larochelle, François Laviolette, Mario Marchand
One of the most tedious tasks in the application of machine learning is model selection, i.e., hyperparameter selection.
no code implementations • NeurIPS 2011 • Yevgeny Seldin, Peter Auer, John S. Shawe-Taylor, Ronald Ortner, François Laviolette
The scaling of our regret bound with the number of states (contexts) $N$ goes as $\sqrt{N I_{\rho_t}(S;A)}$, where $I_{\rho_t}(S;A)$ is the mutual information between states and actions (the side information) used by the algorithm at round $t$.
no code implementations • NeurIPS 2009 • Pascal Germain, Alexandre Lacasse, Mario Marchand, Sara Shanian, François Laviolette
We show that standard $\ell_p$-regularized objective functions currently used, such as ridge regression and $\ell_p$-regularized boosting, are obtained from a relaxation of the KL divergence between the quasi-uniform posterior and the uniform prior.
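Ridge regression, the $\ell_2$ case named in the entry, has a simple closed form in which the regularizer plays the role the relaxed KL term plays in the bound. A one-feature sketch (standard textbook formula, not the paper's derivation):

```python
def ridge_1d(xs, ys, alpha):
    # Minimizes sum_i (y_i - w * x_i)^2 + alpha * w^2, whose closed
    # form is w = (sum_i x_i * y_i) / (sum_i x_i^2 + alpha).
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + alpha)
```

With `alpha = 0` this recovers ordinary least squares; increasing `alpha` shrinks the weight toward zero, mirroring how a stronger prior term pulls the posterior toward the prior.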
no code implementations • NeurIPS 2008 • Massih Amini, Nicolas Usunier, François Laviolette
In this case, we propose a second bound on the joint probability that the voted classifier makes an error on an example whose margin exceeds a fixed threshold.