Search Results for author: Ulf Krumnack

Found 10 papers, 2 papers with code

Learning Disentangled Audio Representations through Controlled Synthesis

no code implementations • 16 Feb 2024 • Yusuf Brima, Ulf Krumnack, Simone Pika, Gunther Heidemann

This paper tackles the scarcity of benchmarking data in disentangled auditory representation learning.

Benchmarking, Disentanglement

Show Me How It's Done: The Role of Explanations in Fine-Tuning Language Models

no code implementations • 12 Feb 2024 • Mohamad Ballout, Ulf Krumnack, Gunther Heidemann, Kai-Uwe Kühnberger

Our research demonstrates the significant benefits of fine-tuning language models with explanations to enhance their performance.

Learning Disentangled Speech Representations

no code implementations • 4 Nov 2023 • Yusuf Brima, Ulf Krumnack, Simone Pika, Gunther Heidemann

This benchmark dataset and framework address the gap in the rigorous evaluation of state-of-the-art disentangled speech representation learning methods.

Disentanglement

Understanding Self-Supervised Learning of Speech Representation via Invariance and Redundancy Reduction

no code implementations • 7 Sep 2023 • Yusuf Brima, Ulf Krumnack, Simone Pika, Gunther Heidemann

This study provides an empirical analysis of Barlow Twins (BT), an SSL technique inspired by theories of redundancy reduction in human perception.

Keyword Spotting, Self-Supervised Learning +1
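The Barlow Twins objective analyzed in this entry pushes the cross-correlation between embeddings of two augmented views toward the identity matrix. A minimal sketch of that loss, assuming PyTorch embeddings of shape (N, D) and an illustrative off-diagonal weight `lambd`:

```python
import torch

def barlow_twins_loss(z1: torch.Tensor, z2: torch.Tensor, lambd: float = 5e-3) -> torch.Tensor:
    """Redundancy-reduction loss over two (N, D) batches of view embeddings."""
    n, _ = z1.shape
    # Standardize each embedding dimension across the batch.
    z1 = (z1 - z1.mean(0)) / (z1.std(0) + 1e-6)
    z2 = (z2 - z2.mean(0)) / (z2.std(0) + 1e-6)
    # Empirical cross-correlation matrix between the two views (D x D).
    c = (z1.T @ z2) / n
    # Invariance term: diagonal entries should be 1 (views agree per dimension).
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    # Redundancy-reduction term: off-diagonal entries should be 0 (decorrelated dims).
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + lambd * off_diag
```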

Opening the Black Box: Analyzing Attention Weights and Hidden States in Pre-trained Language Models for Non-language Tasks

1 code implementation • 21 Jun 2023 • Mohamad Ballout, Ulf Krumnack, Gunther Heidemann, Kai-Uwe Kühnberger

Investigating deep learning language models has always been a significant research area due to the "black box" nature of most advanced models.

Language Modelling, ListOps

Go with the Flow: the distribution of information processing in multi-path networks

no code implementations • 29 Sep 2021 • Mats Leon Richter, Krupal Shah, Anna Wiedenroth, Saketh Bachu, Ulf Krumnack

The architecture of a convolutional neural network (CNN) has a great impact on the model's predictive performance and efficiency.

Should You Go Deeper? Optimizing Convolutional Neural Network Architectures without Training by Receptive Field Analysis

1 code implementation • 23 Jun 2021 • Mats L. Richter, Julius Schöning, Anna Wiedenroth, Ulf Krumnack

When optimizing convolutional neural networks (CNNs) for a specific image-based task, specialists commonly overshoot the number of convolutional layers in their designs.
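The receptive field analysis used here tracks how far each layer can "see" into the input; layers whose receptive field already covers the whole image are natural candidates for removal. A minimal sketch of that bookkeeping for a sequential CNN (the layer specs and the coverage heuristic are illustrative, not the paper's exact criterion):

```python
# Track the receptive field of each layer in a sequential CNN.
# Each layer is (name, kernel_size, stride); the specs below are illustrative.
def receptive_fields(layers, input_size):
    rf, jump = 1, 1  # receptive field size and step between output positions
    for name, k, s in layers:
        rf += (k - 1) * jump  # a kernel widens the field by (k - 1) input steps
        jump *= s             # striding scales the step for all later layers
        note = "  <- already covers the whole input" if rf >= input_size else ""
        print(f"{name}: receptive field {rf}px{note}")

receptive_fields(
    [("conv1", 3, 1), ("pool1", 2, 2), ("conv2", 3, 1),
     ("pool2", 2, 2), ("conv3", 3, 1), ("conv4", 3, 1)],
    input_size=24,
)
```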

Exploring the Properties and Evolution of Neural Network Eigenspaces during Training

no code implementations • 17 Jun 2021 • Mats L. Richter, Leila Malihi, Anne-Kathrin Patricia Windler, Ulf Krumnack

In this work we explore the information processing inside neural networks using logistic regression probes and the saturation metric.

Regression
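A logistic regression probe trains a linear classifier on frozen intermediate activations; its accuracy indicates how linearly decodable the labels are at that depth. A minimal sketch, assuming a PyTorch model, a labeled data loader, and scikit-learn (the helper name is hypothetical):

```python
import torch
from sklearn.linear_model import LogisticRegression

@torch.no_grad()
def probe_layer(model, layer, loader):
    """Fit a linear probe on one layer's activations and return its accuracy."""
    feats, labels = [], []
    # A forward hook captures this layer's output for every batch.
    handle = layer.register_forward_hook(
        lambda mod, inp, out: feats.append(out.flatten(1).cpu()))
    for x, y in loader:
        model(x)            # forward pass only; the hook stores activations
        labels.append(y)
    handle.remove()
    X = torch.cat(feats).numpy()
    y = torch.cat(labels).numpy()
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    return clf.score(X, y)  # in practice, score on a held-out split instead
```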

Size Matters

no code implementations • 2 Feb 2021 • Mats L. Richter, Wolf Byttner, Ulf Krumnack, Ludwig Schallner, Justin Shenk

Fully convolutional neural networks can process input of arbitrary size by applying a combination of downsampling and pooling.
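Because every stage is convolutional or pooling, ending in a global pool, such a network is agnostic to the spatial size of its input. A minimal PyTorch sketch illustrating the property described above (the architecture is illustrative, not the paper's):

```python
import torch
import torch.nn as nn

# A fully convolutional classifier: conv/pool layers plus a global average
# pool, so the same weights accept inputs of arbitrary spatial size.
net = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),                  # downsampling
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),          # collapses any remaining H x W to 1 x 1
    nn.Flatten(),
    nn.Linear(32, 10),                # size-independent after global pooling
)

for size in (32, 64, 113):            # arbitrary, even odd, input sizes
    out = net(torch.randn(1, 3, size, size))
    print(size, tuple(out.shape))     # -> (1, 10) every time
```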
