no code implementations • 5 Sep 2020 • Drew Schmidt, Bronson Messer, M. Todd Young, Michael Matheson
The use of AI and ML for scientific applications is currently an exciting and dynamic field.
1 code implementation • 15 Oct 2019 • Theodore Papamarkou, Jacob Hinkle, M. Todd Young, David Womble
This paper shows that a non-converged Markov chain, generated via MCMC sampling from the parameter space of a neural network, can, via Bayesian marginalization, yield a valuable posterior predictive distribution of the network's output.
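The marginalization step can be illustrated with a minimal sketch: given draws of network parameters from a (possibly non-converged) chain, the posterior predictive is approximated by averaging the network's output over those draws. The one-parameter "network" and the Gaussian stand-in for the MCMC chain below are purely illustrative, not the paper's actual sampler or model.

```python
import numpy as np

rng = np.random.default_rng(0)

def net(x, w):
    # Toy one-parameter "network": a scaled identity map.
    # A real model would be a full neural network with weight vector w.
    return w * x

# Hypothetical parameter draws standing in for a (non-converged) MCMC chain.
chain = rng.normal(loc=2.0, scale=0.5, size=1000)

x = 3.0
# Bayesian marginalization: average the network output over the chain
# to approximate the posterior predictive distribution at input x.
predictive_samples = np.array([net(x, w) for w in chain])
predictive_mean = predictive_samples.mean()
predictive_std = predictive_samples.std()
```

Even when the chain has not converged to the true posterior, the resulting predictive distribution can still carry useful uncertainty information, which is the paper's central observation.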
no code implementations • 24 Sep 2019 • Nouamane Laanait, Joshua Romero, Junqi Yin, M. Todd Young, Sean Treichler, Vitalii Starchenko, Albina Borisevich, Alex Sergeev, Michael Matheson
We introduce novel communication strategies for synchronous distributed Deep Learning, consisting of decentralized gradient reduction orchestration and computational graph-aware grouping of gradient tensors.
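As a rough intuition for gradient-tensor grouping, a simplified sketch is greedy size-based bucketing: small gradient tensors are packed into buckets so each bucket can be reduced with one collective call instead of many. This is only a stand-in for the paper's graph-aware strategy (which uses the computational graph, not just tensor sizes); the function name and threshold below are illustrative.

```python
import numpy as np

def bucket_gradients(grads, bucket_bytes):
    """Greedily pack gradient arrays into buckets of at most bucket_bytes,
    so each bucket can be reduced in a single collective operation."""
    buckets, current, size = [], [], 0
    for g in grads:
        if current and size + g.nbytes > bucket_bytes:
            buckets.append(current)       # close the full bucket
            current, size = [], 0
        current.append(g)
        size += g.nbytes
    if current:
        buckets.append(current)
    return buckets

# Four float32 gradient tensors of varying sizes (in elements).
grads = [np.zeros(n, dtype=np.float32) for n in (256, 1024, 64, 4096)]
buckets = bucket_gradients(grads, bucket_bytes=8192)
```

With an 8 KiB threshold, the three small tensors share one bucket and the large one gets its own, reducing four reduction calls to two.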