1 code implementation • 7 Feb 2024 • Tim Dernedde, Daniela Thyssens, Sören Dittrich, Maximilian Stubbemann, Lars Schmidt-Thieme
Our approach, Moco, learns a graph neural network that updates the solution construction procedure based on features extracted from the current search state.
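The idea of a graph neural network scoring the next construction step from search-state features can be illustrated with a minimal sketch. This is a hypothetical toy, not Moco's actual architecture: the single mean-aggregation message-passing layer, the weight matrices, and the softmax over node scores are all placeholder assumptions.

```python
import numpy as np

def message_pass(adj, node_feats, w_self, w_neigh):
    """One toy message-passing layer: average neighbor features and
    combine them with each node's own features (placeholder design)."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    neigh = (adj @ node_feats) / deg
    return np.tanh(node_feats @ w_self + neigh @ w_neigh)

def construction_scores(adj, state_feats, w_self, w_neigh, w_out):
    """Map search-state node features to a probability over nodes,
    which could bias the next greedy construction step."""
    h = message_pass(adj, state_feats, w_self, w_neigh)
    logits = h @ w_out
    e = np.exp(logits - logits.max())  # numerically stable softmax
    return e / e.sum()
```

In a full method the weights would be trained and the state features would encode the partial solution; here they are arbitrary inputs chosen only to show the data flow.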
no code implementations • 6 Oct 2023 • Daniela Thyssens, Tim Dernedde, Jonas K. Falkner, Lars Schmidt-Thieme
Neural Combinatorial Optimization has been actively researched over the last eight years.
1 code implementation • 5 Jun 2023 • Christian Löwens, Daniela Thyssens, Emma Andersson, Christina Jenkins, Lars Schmidt-Thieme
Common approaches to SR extraction are evaluated either purely unsupervised or on small private datasets, since popular public datasets are unlabeled.
1 code implementation • 27 Jun 2022 • Jonas K. Falkner, Daniela Thyssens, Ahmad Bdeir, Lars Schmidt-Thieme
Combinatorial optimization problems arise in many practical contexts such as logistics and production, but they are typically NP-hard, so finding exact solutions becomes intractable at realistic problem sizes.
1 code implementation • 2 May 2022 • Jonas K. Falkner, Daniela Thyssens, Lars Schmidt-Thieme
The neural repair operator is combined with a local search routine, heuristic destruction operators, and a selection procedure applied to a small population, yielding a sophisticated solution approach.
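The destroy-and-repair loop described above can be sketched as a simple large-neighborhood search on a toy tour. Everything here is a placeholder: the greedy cheapest-insertion repair stands in for the learned neural repair operator, the random removal stands in for the heuristic destruction operators, and the population/selection machinery is omitted.

```python
import random

def tour_length(tour, dist):
    """Total length of a closed tour under distance matrix `dist`."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def destroy(tour, k, rng):
    """Destruction operator: remove k random nodes from the tour."""
    removed = rng.sample(tour, k)
    partial = [n for n in tour if n not in removed]
    return partial, removed

def greedy_repair(partial, removed, dist):
    """Stand-in for a learned repair operator: reinsert each removed
    node at the position causing the smallest length increase."""
    tour = list(partial)
    for n in removed:
        best_pos, best_cost = 0, float("inf")
        for i in range(len(tour) + 1):
            cand = tour[:i] + [n] + tour[i:]
            c = tour_length(cand, dist)
            if c < best_cost:
                best_pos, best_cost = i, c
        tour.insert(best_pos, n)
    return tour

def lns(dist, iters=200, k=2, seed=0):
    """Destroy-and-repair loop, accepting non-worsening candidates."""
    rng = random.Random(seed)
    best = list(range(len(dist)))
    for _ in range(iters):
        partial, removed = destroy(best, k, rng)
        cand = greedy_repair(partial, removed, dist)
        if tour_length(cand, dist) <= tour_length(best, dist):
            best = cand
    return best
```

A population-based variant would keep several such tours and select among them each round; the single-solution loop above only shows the core destroy/repair mechanics.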
no code implementations • 5 Jan 2022 • Daniela Thyssens, Jonas Falkner, Lars Schmidt-Thieme
Learning to solve combinatorial optimization problems, such as the vehicle routing problem, offers great computational advantages over classical operations research solvers and heuristics.
1 code implementation • 6 Jan 2021 • Shereen Elsayed, Daniela Thyssens, Ahmed Rashed, Hadi Samer Jomaa, Lars Schmidt-Thieme
In this paper, we report the results of prominent deep learning models with respect to a well-known machine learning baseline, a Gradient Boosting Regression Tree (GBRT) model.
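To make the GBRT baseline concrete, here is a minimal from-scratch sketch of gradient boosting with depth-1 regression trees (stumps) under squared loss. This is an illustration of the general technique only; the paper's baseline would use a full GBRT library, and the stump learner, learning rate, and round count here are arbitrary assumptions.

```python
import numpy as np

def fit_stump(x, residual):
    """Best single-feature threshold split minimizing squared error
    on the current residuals (a depth-1 regression tree)."""
    best = None
    for j in range(x.shape[1]):
        for t in np.unique(x[:, j]):
            left = x[:, j] <= t
            if left.all() or (~left).all():
                continue
            lv, rv = residual[left].mean(), residual[~left].mean()
            err = (((residual[left] - lv) ** 2).sum()
                   + ((residual[~left] - rv) ** 2).sum())
            if best is None or err < best[0]:
                best = (err, j, t, lv, rv)
    return best[1:]

def gbrt_fit(x, y, n_rounds=50, lr=0.1):
    """Boost stumps on squared loss: each round fits the residuals
    and adds a shrunken correction to the running prediction."""
    pred = np.full(len(y), y.mean())
    model = [y.mean()]
    for _ in range(n_rounds):
        j, t, lv, rv = fit_stump(x, y - pred)
        pred = pred + lr * np.where(x[:, j] <= t, lv, rv)
        model.append((j, t, lv, rv))
    return model

def gbrt_predict(model, x, lr=0.1):
    """Sum the base prediction and all shrunken stump corrections."""
    pred = np.full(len(x), model[0])
    for j, t, lv, rv in model[1:]:
        pred = pred + lr * np.where(x[:, j] <= t, lv, rv)
    return pred
```

The shrinkage factor `lr` trades off fitting speed against overfitting, which is the same knob a library GBRT exposes as its learning rate.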