1 code implementation • 16 May 2024 • Ben Tu, Nikolas Kantas, Robert M. Lee, Behrang Shafei
Robustification refers to the strategy used to marginalise over the uncertainty in the problem.
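A common way to marginalise over uncertainty is to replace the noisy objective with its Monte Carlo expectation over the uncertain parameter. A minimal sketch, where the quadratic objective and the Gaussian uncertainty are illustrative choices, not taken from the paper:

```python
import random

def robust_objective(f, x, noise_samples):
    """Approximate the robustified objective E_w[f(x, w)] by averaging
    f over sampled realisations of the uncertain parameter w."""
    return sum(f(x, w) for w in noise_samples) / len(noise_samples)

# Toy objective whose optimum shifts with the uncertain parameter w.
f = lambda x, w: (x - w) ** 2

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(10_000)]

# For w ~ N(0, 1), E[(x - w)^2] = x^2 + 1, so the robust value at
# x = 0 should be close to 1.
robust_value = robust_objective(f, 0.0, samples)
```

Marginalising in this way turns a stochastic objective into a deterministic one that any standard optimiser can handle, at the cost of one inner Monte Carlo loop per evaluation.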
1 code implementation • 2 May 2024 • Ben Tu, Nikolas Kantas, Robert M. Lee, Behrang Shafei
As a motivating example, we investigate how these statistics can be used within a design of experiments setting, where the goal is to both infer and use the Pareto front surface distribution in order to make effective decisions.
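For a finite sample of objective vectors, the empirical Pareto front is just the non-dominated subset, which a simple quadratic-time filter recovers. A sketch assuming every objective is minimised (the point set is illustrative):

```python
def pareto_front(points):
    """Return the non-dominated subset of a list of objective tuples,
    assuming all objectives are minimised.

    A point p is dominated if some other point q is no worse in every
    objective and differs from p. O(n^2) comparisons; fine for samples.
    """
    front = []
    for p in points:
        dominated = any(
            q != p and all(q[i] <= p[i] for i in range(len(p)))
            for q in points
        )
        if not dominated:
            front.append(p)
    return front

# Five bi-objective vectors; (3, 4) and (5, 5) are dominated.
front = pareto_front([(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)])
```

Repeating this filter over posterior samples of the objectives gives an empirical distribution over Pareto front surfaces, which is the kind of object the statistics above summarise.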
1 code implementation • 19 May 2023 • Ben Tu, Nikolas Kantas, Robert M. Lee, Behrang Shafei
As part of our work, we show that these utilities are monotone and submodular set functions, which can be optimised effectively using greedy optimisation algorithms.
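Monotone submodular set functions admit the classic greedy guarantee: repeatedly adding the element with the largest marginal gain achieves at least a (1 − 1/e) fraction of the optimal value. A toy sketch using set coverage, a standard monotone submodular utility (the sets here are illustrative, not the utilities from the paper):

```python
def greedy_max_coverage(candidate_sets, k):
    """Greedily select k sets to maximise the size of their union.

    Coverage is monotone and submodular, so this greedy loop attains
    at least (1 - 1/e) of the optimal coverage (Nemhauser et al.).
    """
    chosen, covered = [], set()
    for _ in range(k):
        # Marginal gain of a set is the number of new elements it adds.
        best = max(candidate_sets, key=lambda s: len(s - covered))
        chosen.append(best)
        covered |= best
    return chosen, covered

# Picking 2 of these 3 sets: {1,2,3} first, then {4,5,6} (gain 3 > 1).
chosen, covered = greedy_max_coverage([{1, 2, 3}, {3, 4}, {4, 5, 6}], 2)
```

The same loop applies to any monotone submodular utility; only the marginal-gain computation in the `key` changes.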
1 code implementation • 2 Jul 2022 • Alexander Thebelt, Calvin Tsay, Robert M. Lee, Nathan Sudermann-Merx, David Walz, Behrang Shafei, Ruth Misener
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search, as they achieve good predictive performance with little or no manual tuning, naturally handle discrete feature spaces, and are relatively insensitive to outliers in the training data.
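To illustrate how a tree ensemble can serve as a black-box optimisation surrogate, the sketch below boosts depth-one regression stumps on a toy one-dimensional objective and then minimises the fitted ensemble over the evaluated points. This is a self-contained toy stand-in, not the method or tooling of the paper:

```python
def fit_stump(xs, ys):
    """Fit a depth-one regression tree (a stump): pick the threshold
    minimising squared error, predicting the mean on each side."""
    best = None
    for t in sorted(set(xs))[1:]:  # candidate split thresholds
        left = [y for x, y in zip(xs, ys) if x < t]
        right = [y for x, y in zip(xs, ys) if x >= t]
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        err = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or err < best[0]:
            best = (err, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x < t else mr

def boost(xs, ys, rounds=100, lr=0.3):
    """Gradient boosting under squared loss: each round fits a stump
    to the current residuals and adds it with shrinkage lr."""
    preds = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        resid = [y - p for y, p in zip(ys, preds)]
        s = fit_stump(xs, resid)
        stumps.append(s)
        preds = [p + lr * s(x) for p, x in zip(preds, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

# Toy black-box objective (illustrative): minimum at x = 2.
f = lambda x: (x - 2.0) ** 2
xs = [i * 0.25 for i in range(21)]            # evaluated design points
surrogate = boost(xs, [f(x) for x in xs])
best_x = min(xs, key=surrogate)               # optimise the surrogate
```

Because stumps handle discrete thresholds natively and ignore the scale of outlying responses, this kind of ensemble needs no tuning of kernels or length scales, which is the property the sentence above highlights.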
1 code implementation • 4 Nov 2021 • Alexander Thebelt, Calvin Tsay, Robert M. Lee, Nathan Sudermann-Merx, David Walz, Tom Tranter, Ruth Misener
Energy systems optimization problems are complex due to strongly non-linear system behavior and multiple competing objectives, e.g. economic gain vs. environmental impact.
1 code implementation • 10 Mar 2020 • Alexander Thebelt, Jan Kronqvist, Miten Mistry, Robert M. Lee, Nathan Sudermann-Merx, Ruth Misener
Gradient boosted trees and other regression tree models perform well in a wide range of real-world, industrial applications.
1 code implementation • 2 Mar 2018 • Miten Mistry, Dimitrios Letsios, Gerhard Krennrich, Robert M. Lee, Ruth Misener
Decision trees usefully represent sparse, high-dimensional and noisy data.