no code implementations • 7 Dec 2023 • Peter Bjørn Jørgensen, Jonas Busk, Ole Winther, Mikkel N. Schmidt
In machine-learned energy potentials for atomic systems, forces are commonly obtained as the negative derivative of the energy function with respect to atomic positions.
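A minimal sketch of this standard construction using PyTorch automatic differentiation, with a toy harmonic pair energy standing in for the learned model (function names are illustrative, not the paper's code):

```python
import torch

def forces_from_energy(energy_fn, positions):
    """Return forces F = -dE/dR for a differentiable energy of atomic positions."""
    positions = positions.clone().requires_grad_(True)
    energy = energy_fn(positions)                    # scalar total energy
    (grad,) = torch.autograd.grad(energy, positions)
    return -grad                                     # force = negative gradient

def toy_energy(r):
    # Harmonic pair potential with rest length 1.0 between all atom pairs.
    e = r.new_zeros(())
    n = r.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            d = torch.linalg.norm(r[i] - r[j])
            e = e + (d - 1.0) ** 2
    return e

pos = torch.rand(4, 3)                               # 4 atoms in 3D
print(forces_from_energy(toy_energy, pos))           # shape (4, 3)
```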
1 code implementation • 13 Jul 2023 • Thea Brüsch, Mikkel N. Schmidt, Tommy S. Alstrøm
However, for multivariate time series data, the set of input channels often varies between applications, and most existing work does not allow for transfer between datasets with different sets of input channels.
1 code implementation • 23 Jun 2023 • Bo Li, Yasin Esfandiari, Mikkel N. Schmidt, Tommy S. Alstrøm, Sebastian U. Stich
In this paper, we establish a precise and quantifiable correspondence between data heterogeneity and parameters in the convergence rate when a fraction of data is shuffled across clients.
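A hypothetical sketch of such a shuffling setup (the function and the uniform redistribution are assumptions for illustration, not the paper's exact protocol): a fraction s of each client's data is pooled, shuffled globally, and redistributed, so s interpolates between fully heterogeneous (s = 0) and i.i.d. (s = 1) partitions.

```python
import numpy as np

def shuffle_fraction_across_clients(client_data, s, rng=np.random.default_rng(0)):
    pool, kept = [], []
    for x in client_data:
        x = rng.permutation(x)
        cut = int(s * len(x))
        pool.append(x[:cut])          # fraction s goes to the global pool
        kept.append(x[cut:])          # fraction 1 - s stays local
    pool = rng.permutation(np.concatenate(pool))
    splits = np.array_split(pool, len(client_data))
    return [np.concatenate([k, p]) for k, p in zip(kept, splits)]

clients = [np.full(100, label) for label in range(4)]   # 4 fully skewed clients
print([np.unique(c).tolist() for c in shuffle_fraction_across_clients(clients, 0.5)])
```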
no code implementations • 10 May 2023 • Jonas Busk, Mikkel N. Schmidt, Ole Winther, Tejs Vegge, Peter Bjørn Jørgensen
The proposed method considers both epistemic and aleatoric uncertainty, and the total uncertainties are recalibrated post hoc using a nonlinear scaling function to achieve good calibration on previously unseen data, without loss of predictive accuracy.
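A hedged sketch of post-hoc variance recalibration: fit a simple nonlinear scaling sigma' = exp(a) * sigma**b on held-out data by maximizing Gaussian log-likelihood. The paper's exact scaling function may differ; this only illustrates the recalibration idea.

```python
import numpy as np
from scipy.optimize import minimize

def recalibrate(mu, sigma, y):
    """Return a function mapping raw sigmas (> 0) to recalibrated sigmas."""
    def nll(params):
        a, b = params
        s = np.exp(a) * sigma ** b
        return np.mean(0.5 * np.log(2 * np.pi * s**2) + (y - mu) ** 2 / (2 * s**2))
    a, b = minimize(nll, x0=[0.0, 1.0]).x
    return lambda sig: np.exp(a) * sig ** b

rng = np.random.default_rng(0)
mu = rng.normal(size=500)
sigma = np.full(500, 0.5)                      # model is overconfident...
y = mu + rng.normal(scale=1.0, size=500)       # ...true noise scale is 1.0
scale = recalibrate(mu, sigma, y)
print(scale(np.array([0.5])))                  # ~1.0 after recalibration
```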
2 code implementations • CVPR 2023 • Bo Li, Mikkel N. Schmidt, Tommy S. Alstrøm, Sebastian U. Stich
In this paper, we first revisit the widely used FedAvg algorithm in the deep neural network setting to understand how data heterogeneity influences gradient updates across the network layers.
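For reference, one round of the standard FedAvg algorithm (the generic scheme, not the paper's code): each client runs local SGD from the global weights, and the server averages the resulting models weighted by local dataset size. Assumes all state entries are floating-point parameters (no batch-norm counters).

```python
import copy
import torch

def fedavg_round(global_model, client_loaders, local_steps=1, lr=0.1):
    states, sizes = [], []
    for loader in client_loaders:
        model = copy.deepcopy(global_model)           # start from global weights
        opt = torch.optim.SGD(model.parameters(), lr=lr)
        for _ in range(local_steps):
            for x, y in loader:
                opt.zero_grad()
                torch.nn.functional.cross_entropy(model(x), y).backward()
                opt.step()
        states.append(model.state_dict())
        sizes.append(len(loader.dataset))
    total = sum(sizes)
    avg = {k: sum(s[k] * (n / total) for s, n in zip(states, sizes))
           for k in states[0]}                        # size-weighted average
    global_model.load_state_dict(avg)
    return global_model
```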
no code implementations • 25 Feb 2022 • Bo Li, Mikkel N. Schmidt, Tommy S. Alstrøm
We propose a new machine learning technique for Raman spectrum matching, based on contrastive representation learning, that requires no preprocessing and works with as little as a single reference spectrum from each class.
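A sketch of how matching with contrastive embeddings typically works at inference time (the encoder and its training are not shown; names are illustrative, not the paper's API): spectra are mapped to an embedding space, and a query is assigned the class of its most similar reference spectrum by cosine similarity.

```python
import torch
import torch.nn.functional as F

def match(encoder, query, references, labels):
    """query: (L,) spectrum; references: (C, L), one reference per class."""
    with torch.no_grad():
        q = F.normalize(encoder(query.unsqueeze(0)), dim=-1)   # (1, D)
        r = F.normalize(encoder(references), dim=-1)           # (C, D)
    sims = (q @ r.T).squeeze(0)                                # cosine similarities
    return labels[sims.argmax().item()]
```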
no code implementations • 13 Jul 2021 • Jonas Busk, Peter Bjørn Jørgensen, Arghya Bhowmik, Mikkel N. Schmidt, Ole Winther, Tejs Vegge
In this work we extend a message passing neural network designed specifically for predicting properties of molecules and materials with a calibrated probabilistic predictive distribution.
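One common way to attach a probabilistic predictive distribution to a regression network is a Gaussian output head trained with the negative log-likelihood; a sketch of that idea (an assumed form for illustration, not necessarily the paper's exact architecture):

```python
import torch
import torch.nn as nn

class GaussianHead(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.mu = nn.Linear(dim, 1)
        self.log_var = nn.Linear(dim, 1)   # predict log-variance for stability

    def forward(self, h):                  # h: pooled graph embedding (B, dim)
        return self.mu(h), self.log_var(h)

def gaussian_nll(mu, log_var, y):
    # Gaussian negative log-likelihood, up to an additive constant.
    return (0.5 * (log_var + (y - mu) ** 2 / log_var.exp())).mean()
```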
2 code implementations • 15 May 2019 • Peter Bjørn Jørgensen, Estefanía Garijo del Río, Mikkel N. Schmidt, Karsten Wedel Jacobsen
The possibilities for prediction in a realistic computational screening setting are investigated on a dataset of 5976 ABSe$_3$ selenides with very limited overlap with the OQMD training set.
1 code implementation • 21 Jun 2018 • Philip J. H. Jørgensen, Søren F. V. Nielsen, Jesper L. Hinrich, Mikkel N. Schmidt, Kristoffer H. Madsen, Morten Mørup
PARAFAC2 is a multimodal factor analysis model suitable for analyzing multi-way data when one of the modes has incomparable observation units, for example because of differences in signal sampling or batch sizes.
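For reference, the PARAFAC2 structure models each slab as X_k ≈ A diag(d_k) B_k^T with B_k = P_k F, where the P_k are orthonormal so the cross-product B_k^T B_k = F^T F is shared across slabs. A minimal sketch of reconstructing the slabs from given factors (fitting is not shown):

```python
import numpy as np

def parafac2_reconstruct(A, D, P, F):
    """A: (I, R), D: (K, R), P: list of (J_k, R) orthonormal, F: (R, R)."""
    return [A @ np.diag(D[k]) @ (P[k] @ F).T for k in range(len(P))]
```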
5 code implementations • 8 Jun 2018 • Peter Bjørn Jørgensen, Karsten Wedel Jacobsen, Mikkel N. Schmidt
Neural message passing on molecular graphs is one of the most promising methods for predicting formation energy and other properties of molecules and materials.
Ranked #3 on Formation Energy on Materials Project
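A minimal sketch of one neural message-passing step on a molecular graph (the generic scheme, not this paper's exact update): each atom aggregates messages computed from neighbour states and edge features, then updates its own state.

```python
import torch
import torch.nn as nn

class MessagePassingStep(nn.Module):
    def __init__(self, dim, edge_dim):
        super().__init__()
        self.msg = nn.Sequential(nn.Linear(dim + edge_dim, dim), nn.SiLU())
        self.upd = nn.GRUCell(dim, dim)

    def forward(self, h, edge_index, edge_attr):
        src, dst = edge_index                             # (E,), (E,)
        m = self.msg(torch.cat([h[src], edge_attr], dim=-1))
        agg = torch.zeros_like(h).index_add_(0, dst, m)   # sum messages per atom
        return self.upd(agg, h)                           # updated atom states

h = torch.randn(5, 32)                                    # 5 atoms, 32-dim states
edge_index = torch.tensor([[0, 1, 2], [1, 2, 3]])         # source/target rows
edge_attr = torch.randn(3, 8)
step = MessagePassingStep(32, 8)
print(step(h, edge_index, edge_attr).shape)               # torch.Size([5, 32])
```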
no code implementations • 14 Dec 2016 • Jesper L. Hinrich, Søren F. V. Nielsen, Nicolai A. B. Riis, Casper T. Eriksen, Jacob Frøsig, Marco D. F. Kristensen, Mikkel N. Schmidt, Kristoffer H. Madsen, Morten Mørup
Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation.
no code implementations • NeurIPS 2016 • Tue Herlau, Mikkel N. Schmidt, Morten Mørup
Statistical methods for network data often parameterize the edge probability by attributing latent traits, such as block structure, to the vertices, and assume exchangeability in the sense of the Aldous-Hoover representation theorem.
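An illustrative sketch of that latent-trait parameterization in its simplest form, the stochastic block model: each vertex i has a block assignment z_i, and edges are drawn independently with probability eta[z_i, z_j].

```python
import numpy as np

def sample_sbm(z, eta, rng=np.random.default_rng(0)):
    p = eta[np.ix_(z, z)]                 # (N, N) edge probabilities
    A = rng.random(p.shape) < p
    A = np.triu(A, 1)                     # undirected, no self-loops
    return (A | A.T).astype(int)

z = np.array([0, 0, 1, 1])                # two blocks of two vertices
eta = np.array([[0.9, 0.1],
                [0.1, 0.8]])              # within- vs between-block probabilities
print(sample_sbm(z, eta))
```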
1 code implementation • 4 Jan 2016 • Søren F. V. Nielsen, Kristoffer H. Madsen, Rasmus Røge, Mikkel N. Schmidt, Morten Mørup
We further investigate what drives the dynamic states by applying the model to the entire dataset collated across subjects and task/rest conditions.
no code implementations • 12 Aug 2015 • Tue Herlau, Morten Mørup, Mikkel N. Schmidt
Dropout has recently emerged as a powerful and simple method for training neural networks that prevents co-adaptation by stochastically omitting neurons.
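The standard (inverted) dropout operation, for reference: during training each unit is zeroed independently with probability p and the survivors are rescaled so the expected activation is unchanged; at test time the layer is an identity.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=np.random.default_rng(0)):
    if not training or p == 0.0:
        return x                           # identity at test time
    mask = rng.random(x.shape) >= p        # keep each unit with prob. 1 - p
    return x * mask / (1.0 - p)            # rescale to preserve expectation
```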
no code implementations • 10 Jul 2015 • Tue Herlau, Mikkel N. Schmidt, Morten Mørup
Recently, Caron and Fox (2014) proposed the use of a different notion of exchangeability due to Kallenberg (2009) and obtained a network model which admits power-law behaviour while retaining desirable statistical properties; however, this model does not capture latent vertex traits such as block structure.
no code implementations • 31 May 2014 • Tue Herlau, Morten Mørup, Yee Whye Teh, Mikkel N. Schmidt
Bayesian mixture models are widely applied for unsupervised learning and exploratory data analysis.
no code implementations • 20 Dec 2013 • Mikkel N. Schmidt, Morten Mørup
Modeling structure in complex networks using Bayesian non-parametrics makes it possible to specify flexible model structures and infer the adequate model complexity from the observed data.
no code implementations • 11 Nov 2013 • Tue Herlau, Mikkel N. Schmidt, Morten Mørup
On synthetic data, we demonstrate that including the degree correction yields better performance, both in recovering the true group structure and in predicting missing links, when degree heterogeneity is present, whereas performance is on par for data with no degree heterogeneity within clusters.
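A sketch of the degree-corrected parameterization in question (in the spirit of the Karrer-Newman form on which degree-corrected blockmodels are commonly based; variable names are illustrative): the expected number of edges between i and j is theta_i * theta_j * omega[z_i, z_j], so heterogeneous degrees within a block are absorbed by the per-vertex parameters theta rather than forcing extra blocks.

```python
import numpy as np

def expected_edges(theta, z, omega):
    """theta: (N,) per-vertex degree parameters; z: (N,) block labels;
    omega: (K, K) block-pair intensities. Returns (N, N) expected edge counts."""
    return np.outer(theta, theta) * omega[np.ix_(z, z)]
```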
no code implementations • 5 Nov 2013 • Mikkel N. Schmidt, Tue Herlau, Morten Mørup
Analyzing and understanding the structure of complex relational data is important in many applications, including analysis of the connectivity in the human brain.