no code implementations • 14 Feb 2024 • Simon Geisler, Tom Wollschläger, M. H. I. Abdalla, Johannes Gasteiger, Stephan Günnemann
Current LLM alignment methods are readily broken through specifically crafted adversarial prompts.
no code implementations • 15 Dec 2023 • Sebastian Farquhar, Vikrant Varma, Zachary Kenton, Johannes Gasteiger, Vladimir Mikulik, Rohin Shah
We show that existing unsupervised methods on large language model (LLM) activations do not discover knowledge -- instead they seem to discover whatever feature of the activations is most prominent.
no code implementations • NeurIPS 2023 • Filip Ekström Kelvinius, Dimitar Georgiev, Artur Petrov Toshev, Johannes Gasteiger
In this paper, we explore the utility of knowledge distillation (KD) for accelerating molecular GNNs.
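In its classic softened-logit form, knowledge distillation trains the student against the teacher's temperature-smoothed output distribution. The sketch below is a generic illustration of that loss, not the paper's setup (which targets molecular GNNs and also studies feature-based variants); all names are ours.

```python
import numpy as np

def softmax(x, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = x / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Hinton-style soft-target loss: KL(teacher || student) on
    temperature-softened distributions, scaled by T^2 so gradient
    magnitudes stay comparable across temperatures."""
    p_t = softmax(teacher_logits, T)
    log_p_s = np.log(softmax(student_logits, T))
    return float((T * T) * np.sum(p_t * (np.log(p_t) - log_p_s), axis=-1).mean())
```

A higher temperature exposes the teacher's relative probabilities over non-top classes, which is the "dark knowledge" the student is meant to absorb.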
1 code implementation • 8 Mar 2023 • Arthur Kosmala, Johannes Gasteiger, Nicholas Gao, Stephan Günnemann
Neural architectures that learn potential energy surfaces from molecular data have improved rapidly in recent years.
no code implementations • 6 Feb 2023 • Jan Schuchardt, Aleksandar Bojchevski, Johannes Gasteiger, Stephan Günnemann
In tasks like node classification, image segmentation, and named-entity recognition, we have a classifier that simultaneously outputs multiple predictions (a vector of labels) based on a single input, i.e., a single graph, image, or document, respectively.
no code implementations • 18 Dec 2022 • Johannes Gasteiger, Chendi Qian, Stephan Günnemann
Using graph neural networks for large graphs is challenging since there is no clear way of constructing mini-batches.
no code implementations • 6 Apr 2022 • Johannes Gasteiger, Muhammed Shuaibi, Anuroop Sriram, Stephan Günnemann, Zachary Ulissi, C. Lawrence Zitnick, Abhishek Das
This work investigates this question by first developing the GemNet-OC model based on the large Open Catalyst 2020 (OC20) dataset.
Ranked #1 on Initial Structure to Relaxed Energy (IS2RE) on OC20
no code implementations • NeurIPS 2021 • Johannes Gasteiger, Chandan Yeshwanth, Stephan Günnemann
We furthermore set the state of the art on ZINC and coordinate-free QM9 by incorporating synthetic coordinates in the SMP and DimeNet++ models.
no code implementations • 14 Jul 2021 • Johannes Gasteiger, Marten Lienen, Stephan Günnemann
The current best practice for computing optimal transport (OT) is via entropy regularization and Sinkhorn iterations.
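The entropy-regularized OT computation referenced above can be sketched in a few lines: a Gibbs kernel of the cost matrix, then alternating scaling updates that match the two marginals. This is a minimal toy illustration of the standard Sinkhorn scheme, not the paper's implementation; all names and parameter values are ours.

```python
import numpy as np

def sinkhorn(a, b, C, eps=0.1, n_iters=1000):
    """Entropy-regularized OT via Sinkhorn iterations.

    a, b: source/target marginals (each summing to 1); C: cost matrix.
    Returns the transport plan P whose row sums approach a and
    column sums approach b as the iterations converge."""
    K = np.exp(-C / eps)      # Gibbs kernel of the cost
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)     # scale columns toward marginal b
        u = a / (K @ v)       # scale rows toward marginal a
    return u[:, None] * K * v[None, :]

# Toy problem: transport between a 3-point and a 4-point distribution.
rng = np.random.default_rng(0)
a = np.ones(3) / 3
b = np.ones(4) / 4
C = rng.random((3, 4))
P = sinkhorn(a, b, C, eps=0.5)
```

Smaller `eps` gives a sharper (closer to unregularized) plan but slower, less stable convergence, which is exactly the trade-off that motivates work on alternatives.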
4 code implementations • NeurIPS 2021 • Johannes Gasteiger, Florian Becker, Stephan Günnemann
Effectively predicting molecular interactions has the potential to accelerate molecular dynamics by multiple orders of magnitude and thus revolutionize chemical simulations.
6 code implementations • 28 Nov 2020 • Johannes Gasteiger, Shankari Giri, Johannes T. Margraf, Stephan Günnemann
Many important tasks in chemistry revolve around molecules during reactions.
Ranked #5 on Drug Discovery on QM9
1 code implementation • ICML 2020 • Aleksandar Bojchevski, Johannes Gasteiger, Stephan Günnemann
Existing techniques for certifying the robustness of models for discrete data either work only for a small class of models or are general at the expense of efficiency or tightness.
2 code implementations • 3 Jul 2020 • Aleksandar Bojchevski, Johannes Gasteiger, Bryan Perozzi, Amol Kapoor, Martin Blais, Benedek Rózemberczki, Michal Lukasik, Stephan Günnemann
Graph neural networks (GNNs) have emerged as a powerful approach for solving many network mining tasks.
4 code implementations • ICLR 2020 • Johannes Gasteiger, Janek Groß, Stephan Günnemann
Each message is associated with a direction in coordinate space.
Ranked #7 on Drug Discovery on QM9
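The directional ingredient behind this line of work, interatomic distances plus angles between the directions of adjacent edges, can be illustrated schematically. This is a sketch of the geometric features only, not the DimeNet architecture; the function name and edge convention are our assumptions.

```python
import numpy as np

def edge_directions_and_angles(pos, edges):
    """Geometric inputs for directional message passing (sketch).

    Each message travels along a directed edge j->i; for every incoming
    edge k->j we compute the angle between the two edge directions.
    Distances and angles are invariant to global rotation/translation."""
    dirs, dists = {}, {}
    for j, i in edges:
        v = pos[i] - pos[j]
        dists[(j, i)] = np.linalg.norm(v)
        dirs[(j, i)] = v / dists[(j, i)]
    angles = {}
    for j, i in edges:
        for k, j2 in edges:
            if j2 == j and k != i:   # edge k->j feeds the message j->i
                cos_a = np.clip(dirs[(k, j)] @ dirs[(j, i)], -1.0, 1.0)
                angles[(k, j, i)] = np.arccos(cos_a)
    return dists, angles
```

Because only distances and angles enter the model, the resulting predictions are rotationally invariant by construction rather than by data augmentation.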
3 code implementations • NeurIPS 2019 • Johannes Gasteiger, Stefan Weißenberger, Stephan Günnemann
In this work, we remove the restriction of using only the direct neighbors by introducing a powerful, yet spatially localized graph convolution: Graph diffusion convolution (GDC).
Ranked #3 on Node Classification on AMZ Comp
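The diffusion-then-sparsify idea behind GDC can be sketched with a personalized-PageRank kernel: replace the one-hop adjacency with a dense diffusion matrix, then threshold small entries so the convolution stays spatially localized. A toy dense version follows; `gdc_ppr` and its defaults are our assumptions, not the released code.

```python
import numpy as np

def gdc_ppr(A, alpha=0.15, eps=1e-2):
    """Graph diffusion convolution matrix via a PPR kernel (sketch).

    A: symmetric adjacency matrix. Builds the column-stochastic
    random-walk transition matrix T = A D^-1, computes the closed-form
    PPR diffusion S = alpha * (I - (1 - alpha) * T)^-1, and zeroes
    entries below eps to recover a sparse, localized operator."""
    n = A.shape[0]
    d = A.sum(axis=1)
    T = A / d                    # divides column j by degree d[j]
    S = alpha * np.linalg.inv(np.eye(n) - (1 - alpha) * T)
    S[S < eps] = 0.0             # threshold sparsification
    return S
```

Before thresholding each column of `S` sums to one, so the diffusion acts as a smoothed, multi-hop replacement for the adjacency that any message-passing GNN can use as a drop-in.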
5 code implementations • ICLR 2019 • Johannes Gasteiger, Aleksandar Bojchevski, Stephan Günnemann
We utilize this propagation procedure to construct a simple model, personalized propagation of neural predictions (PPNP), and its fast approximation, APPNP.
Ranked #1 on Node Classification on MS ACADEMIC
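The PPNP propagation named above, and APPNP's power-iteration approximation of it, admit a compact dense sketch. The real models pair this propagation with a neural network producing the per-node predictions `H`; that part, and the sparse implementation, are omitted here.

```python
import numpy as np

def normalize(A):
    """Symmetrically normalized adjacency with self-loops."""
    n = A.shape[0]
    A_tilde = A + np.eye(n)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_tilde.sum(axis=1)))
    return d_inv_sqrt @ A_tilde @ d_inv_sqrt

def ppnp_propagate(A, H, alpha=0.1):
    """Exact PPNP: Z = alpha * (I - (1 - alpha) * A_hat)^-1 @ H."""
    n = A.shape[0]
    A_hat = normalize(A)
    return alpha * np.linalg.solve(np.eye(n) - (1 - alpha) * A_hat, H)

def appnp_propagate(A, H, alpha=0.1, K=10):
    """APPNP: K power-iteration steps converging to the PPNP solve."""
    A_hat = normalize(A)
    Z = H
    for _ in range(K):
        Z = (1 - alpha) * A_hat @ Z + alpha * H
    return Z
```

The teleport probability `alpha` controls how much each node's final output stays anchored to its own prediction versus information diffused from its extended neighborhood.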