High-dimensional Bayesian Optimization of Hyperparameters for an Attention-based Network to Predict Materials Property: a Case Study on CrabNet using Ax and SAASBO

19 Mar 2022 · Sterling G. Baird, Marianne Liu, Taylor D. Sparks

Expensive-to-train deep learning models can benefit from optimization of the hyperparameters that determine the model architecture. We optimize 23 hyperparameters of a materials informatics model, the Compositionally-Restricted Attention-Based Network (CrabNet), over 100 adaptive design iterations using two models from the Adaptive Experimentation (Ax) Platform. These include a recently developed Bayesian optimization (BO) algorithm, sparse axis-aligned subspace Bayesian optimization (SAASBO), which has shown strong performance on high-dimensional optimization tasks. Using SAASBO to optimize CrabNet hyperparameters, we demonstrate a new state of the art on the experimental band gap regression task within the materials informatics benchmarking platform Matbench (a ~4.5% decrease in mean absolute error (MAE) relative to the incumbent). We describe characteristics of the adaptive design scheme, as well as feature importances, for each of the Ax models. SAASBO has great potential both to improve existing surrogate models, as shown in this work, and, in future work, to efficiently discover new, high-performing materials in high-dimensional materials science search spaces.
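To make the setup concrete, below is a minimal sketch (not the authors' actual code) of driving a hyperparameter search with SAASBO through Ax's Service API. It assumes an Ax version (~0.2.x) in which the SAAS model is exposed in the registry as `Models.FULLYBAYESIAN`; the parameter list, trial counts, and the `train_and_score` stand-in are illustrative placeholders rather than CrabNet's real 23-dimensional search space.

```python
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.service.ax_client import AxClient

# Quasi-random Sobol warm-up, then the SAAS fully Bayesian GP for all
# remaining trials (assumed Ax ~0.2.x registry name: Models.FULLYBAYESIAN).
gen_strategy = GenerationStrategy(
    steps=[
        GenerationStep(model=Models.SOBOL, num_trials=10),
        GenerationStep(
            model=Models.FULLYBAYESIAN,
            num_trials=-1,  # -1 means "use for all remaining trials"
            model_kwargs={"num_samples": 256, "warmup_steps": 512},  # NUTS settings
        ),
    ]
)

ax_client = AxClient(generation_strategy=gen_strategy)
ax_client.create_experiment(
    name="crabnet_hyperparameter_search",
    parameters=[
        # Illustrative placeholders -- not CrabNet's actual 23 hyperparameters.
        {"name": "lr", "type": "range", "bounds": [1e-5, 1e-2], "log_scale": True},
        {"name": "heads", "type": "range", "bounds": [1, 8], "value_type": "int"},
        {"name": "d_model", "type": "range", "bounds": [128, 1024], "value_type": "int"},
    ],
    objective_name="mae",
    minimize=True,  # minimize validation mean absolute error
)

def train_and_score(params: dict) -> float:
    """Hypothetical stand-in: train CrabNet with `params`, return validation MAE.

    A dummy expression is used here so the sketch runs end to end.
    """
    return 100.0 * params["lr"] + 1.0 / params["d_model"]

# 100 adaptive design iterations, as in the paper.
for _ in range(100):
    params, trial_index = ax_client.get_next_trial()
    ax_client.complete_trial(trial_index=trial_index, raw_data=train_and_score(params))

print(ax_client.get_best_parameters())
```

The appeal of the SAAS prior in this setting is that it shrinks most kernel length scales toward zero, so the surrogate effectively models only a sparse subset of the 23 dimensions, which is what makes BO tractable at this dimensionality.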
