1 code implementation • 6 Feb 2024 • David Peer, Philemon Schöpf, Volckmar Nebendahl, Alexander Rietzler, Sebastian Stabinger
However, evaluating GLLMs presents a challenge, as the binary true/false evaluation used for discriminative models is not applicable to the predictions made by GLLMs.
1 code implementation • 9 Nov 2022 • Antonio Rodríguez-Sánchez, Simon Haller-Seeber, David Peer, Chris Engelhardt, Jakob Mittelberger, Matteo Saveriano
In the experimental evaluation, we show that our algorithm outperforms current affordance detection methods when grasping previously unseen objects, thanks to our Capsule Network enforcing a parts-to-whole representation.
2 code implementations • 1 Aug 2022 • David Peer, Bart Keulen, Sebastian Stabinger, Justus Piater, Antonio Rodríguez-Sánchez
We show empirically that we can therefore train a "vanilla" fully connected network and convolutional neural network -- no skip connections, batch normalization, dropout, or any other architectural tweak -- with 500 layers by simply adding the batch-entropy regularization term to the loss function.
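The idea above can be sketched in a few lines. This is a minimal illustration, not the paper's exact estimator: it assumes a Gaussian differential-entropy proxy for the entropy of each layer's activations across the batch, and the names `batch_entropy`, `regularized_loss`, and the weight `alpha` are chosen here for illustration.

```python
import numpy as np

def batch_entropy(activations, eps=1e-8):
    """Differentiable proxy for a layer's batch entropy: Gaussian
    differential entropy of each unit's activation distribution over
    the batch, averaged over units. activations: (batch, units).
    The Gaussian assumption is this sketch's simplification."""
    var = activations.var(axis=0) + eps        # per-unit variance over the batch
    h = 0.5 * np.log(2 * np.pi * np.e * var)   # Gaussian entropy per unit
    return h.mean()

def regularized_loss(task_loss, layer_activations, alpha=0.1):
    """Add a term that penalizes layers whose activations collapse
    across the batch (low batch entropy hinders trainability)."""
    reg = sum(-batch_entropy(a) for a in layer_activations)
    return task_loss + alpha * reg / len(layer_activations)
```

Because the regularizer only adds a term to the loss, it leaves the architecture untouched, which is what allows a plain 500-layer network to train.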
1 code implementation • 26 Jan 2022 • Josef Gugglberger, David Peer, Antonio Rodríguez-Sánchez
MoCapsNets are inspired by Momentum ResNets, a type of network that uses reversible residual building blocks.
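The reversible building block can be sketched as follows. This follows the momentum residual update from Momentum ResNets (Sander et al.); the residual function `f`, the momentum value `GAMMA`, and the function names are illustrative stand-ins, not MoCapsNet's actual layers.

```python
import numpy as np

GAMMA = 0.9  # momentum term; value chosen for illustration

def f(x):
    # Toy residual function standing in for a learned sub-network.
    return np.tanh(x)

def momentum_forward(x, v):
    """One momentum residual step: velocity is updated first, then
    added to the state. The update is exactly invertible."""
    v_new = GAMMA * v + (1 - GAMMA) * f(x)
    x_new = x + v_new
    return x_new, v_new

def momentum_inverse(x_new, v_new):
    """Reconstruct (x, v) from the block's outputs, so intermediate
    activations never need to be stored for backpropagation."""
    x = x_new - v_new
    v = (v_new - (1 - GAMMA) * f(x)) / GAMMA
    return x, v
```

Invertibility is the point: activations can be recomputed from the outputs during the backward pass, trading a little compute for memory.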
1 code implementation • 31 May 2021 • David Peer, Sebastian Stabinger, Stefan Engl, Antonio Rodriguez-Sanchez
Knowledge distillation maintains high performance and reaches high compression rates; nevertheless, the size of the student model is fixed after pre-training and cannot be changed individually for a given downstream task and use case to reach a desired performance/speedup ratio.
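For context, the standard distillation objective being referred to can be sketched as below. This is the generic Hinton-style formulation (hard-label cross-entropy plus KL divergence to the teacher's temperature-softened predictions), not this paper's specific training setup; the temperature `T` and mixing weight `alpha` are illustrative defaults.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled, numerically stable softmax.
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Cross-entropy on the hard labels plus KL divergence between the
    teacher's and student's softened output distributions. The T**2
    factor keeps gradient magnitudes comparable across temperatures."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1).mean()
    ce = -np.log(softmax(student_logits)[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * ce + (1 - alpha) * (T ** 2) * kl
```

The limitation noted above follows directly: the student's architecture enters this loss only through its logits, so its size must be fixed before distillation begins.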
1 code implementation • 15 Apr 2021 • Josef Gugglberger, David Peer, Antonio Rodriguez-Sanchez
Capsule networks are a type of neural network that has recently gained popularity.
1 code implementation • 7 Mar 2021 • David Peer, Sebastian Stabinger, Antonio Rodriguez-Sanchez
In the worst-case scenario, we prove that such a layer could lead to a network that cannot be trained at all.
no code implementations • 23 Feb 2021 • Sebastian Stabinger, David Peer, Antonio Rodríguez-Sánchez
Convolutional neural networks have established themselves in recent years as the state-of-the-art method for image classification, and for many datasets they even surpass humans in categorizing images.
1 code implementation • 5 Nov 2020 • David Peer, Sebastian Stabinger, Antonio Rodriguez-Sanchez
In this paper, we introduce a novel theory and metric to identify layers that decrease the test accuracy of trained models; this identification can be made as early as the beginning of training.
no code implementations • 21 May 2019 • David Peer, Sebastian Stabinger, Antonio Rodriguez-Sanchez
A recently proposed method in deep learning groups multiple neurons to capsules such that each capsule represents an object or part of an object.
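The grouping idea can be made concrete with a small sketch. This follows the original capsule formulation (Sabour et al.): neurons are reshaped into vectors, and the squash nonlinearity maps each vector's length into [0, 1) so it can be read as the probability that the represented entity is present; the shapes and names here are illustrative.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Capsule nonlinearity: preserves a capsule's direction (the
    entity's pose) while squashing its length into [0, 1)."""
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1 + sq_norm)) * s / np.sqrt(sq_norm + eps)

# Group a layer of 32 neurons into 4 capsules of 8 neurons each.
neurons = np.random.default_rng(0).normal(size=32)
capsules = squash(neurons.reshape(4, 8))
lengths = np.linalg.norm(capsules, axis=-1)  # per-capsule presence, in [0, 1)
```

Each capsule's direction encodes the instantiation parameters of its object or part, while its length encodes the probability that the entity exists in the input.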
1 code implementation • 23 Dec 2018 • David Peer, Sebastian Stabinger, Antonio Rodriguez-Sanchez
In this paper we introduce a new inductive bias for capsule networks and call networks that use this prior $\gamma$-capsule networks.