Search Results for author: Dmitry Kobak

Found 13 papers, 13 papers with code

Learning representations of learning representations

2 code implementations • 12 Apr 2024 • Rita González-Márquez, Dmitry Kobak

The ICLR conference is unique among the top machine learning conferences in that all submitted papers are openly available.

Sentence

Self-supervised Visualisation of Medical Image Datasets

1 code implementation • 22 Feb 2024 • Ifeoma Veronica Nwabufo, Jan Niklas Böhm, Philipp Berens, Dmitry Kobak

Self-supervised learning methods based on data augmentations, such as SimCLR, BYOL, or DINO, allow obtaining semantically meaningful representations of image datasets and are widely used prior to supervised fine-tuning.

Contrastive Learning • Self-Supervised Learning
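
For context on the augmentation-based setup mentioned above, here is a minimal, hypothetical sketch of SimCLR-style positive-pair generation; the specific augmentations, crop size, and demo image are assumptions, not the paper's pipeline.

```python
# Hypothetical sketch: two random augmentations of the same image form a positive pair.
import numpy as np
from PIL import Image
from torchvision import transforms

augment = transforms.Compose([
    transforms.RandomResizedCrop(96, scale=(0.2, 1.0)),
    transforms.RandomHorizontalFlip(),
    transforms.RandomApply([transforms.ColorJitter(0.4, 0.4, 0.4, 0.1)], p=0.8),
    transforms.RandomGrayscale(p=0.2),
    transforms.ToTensor(),
])

class TwoViews:
    """Return two independently augmented views of the same input image."""
    def __init__(self, transform):
        self.transform = transform
    def __call__(self, img):
        return self.transform(img), self.transform(img)

# Demo on a random "image"; in practice this wraps the dataset's transform.
img = Image.fromarray(np.random.randint(0, 255, (128, 128, 3), dtype=np.uint8))
view1, view2 = TwoViews(augment)(img)
print(view1.shape, view2.shape)   # each view: tensor of shape [3, 96, 96]
```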

Persistent Homology for High-dimensional Data Based on Spectral Methods

1 code implementation • 6 Nov 2023 • Sebastian Damrich, Philipp Berens, Dmitry Kobak

As a remedy, we find that spectral distances on the $k$-nearest-neighbor graph of the data, such as diffusion distance and effective resistance, make it possible to detect the correct topology even in the presence of high-dimensional noise.
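
As a rough illustration of one such spectral distance (not the paper's implementation), the sketch below computes effective resistance on a symmetrized, unweighted $k$NN graph via the pseudoinverse of the graph Laplacian; the data and the value of $k$ are illustrative.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 50))                       # toy high-dimensional data

W = kneighbors_graph(X, n_neighbors=15, mode="connectivity")
W = ((W + W.T) > 0).astype(float).toarray()          # symmetrize, keep it unweighted

L = np.diag(W.sum(axis=1)) - W                       # graph Laplacian
Lp = np.linalg.pinv(L)                               # pseudoinverse (assumes one connected component)

d = np.diag(Lp)
R = d[:, None] + d[None, :] - 2 * Lp                 # effective resistance R_ij
print(R.shape)                                       # (300, 300) spectral distance matrix
```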

Unsupervised visualization of image datasets using contrastive learning

1 code implementation • 18 Oct 2022 • Jan Niklas Böhm, Philipp Berens, Dmitry Kobak

This problem can be circumvented by self-supervised approaches based on contrastive learning, such as SimCLR, which rely on data augmentation to generate implicit neighbors; however, these methods do not produce two-dimensional embeddings suitable for visualization.

Contrastive Learning • Data Augmentation
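
The following is a minimal sketch of the generic InfoNCE (SimCLR) loss over a batch of augmented pairs; the paper adapts contrastive learning so that the network outputs two-dimensional coordinates directly, which this generic version does not reproduce.

```python
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temperature=0.5):
    """z1, z2: (batch, dim) embeddings of two augmented views of the same images."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2B, dim), unit norm
    sim = z @ z.T / temperature                           # scaled cosine similarities
    sim.fill_diagonal_(float("-inf"))                     # exclude self-similarity
    B = z1.shape[0]
    targets = (torch.arange(2 * B) + B) % (2 * B)         # positive of i is its other view
    return F.cross_entropy(sim, targets)

# Toy usage with hypothetical 2D outputs (visualization-oriented setups use 2D heads).
z1 = torch.randn(8, 2)
z2 = z1 + 0.1 * torch.randn(8, 2)
print(info_nce(z1, z2).item())
```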

From $t$-SNE to UMAP with contrastive learning

2 code implementations • 3 Jun 2022 • Sebastian Damrich, Jan Niklas Böhm, Fred A. Hamprecht, Dmitry Kobak

We exploit this new conceptual connection to propose and implement a generalization of negative sampling, allowing us to interpolate between (and even extrapolate beyond) $t$-SNE and UMAP and their respective embeddings.

Contrastive Learning • Representation Learning
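
As a rough sketch of the negative-sampling mechanism this work generalizes, the snippet below applies one UMAP-style attractive update along a $k$NN edge and a few repulsive updates against random negatives under a Cauchy similarity; the paper's exact parametrization and its $t$-SNE/UMAP interpolation are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
Y = rng.normal(size=(100, 2))     # current 2D embedding (toy)
i, j = 0, 1                       # a kNN edge (i, j), assumed given
m, lr = 5, 0.1                    # number of negatives, learning rate

def q(diff):
    """Cauchy similarity q = 1 / (1 + d^2)."""
    return 1.0 / (1.0 + np.sum(diff ** 2))

# Attraction: gradient-descent step on -log q_ij with respect to y_i.
diff = Y[i] - Y[j]
Y[i] -= lr * 2 * q(diff) * diff

# Repulsion: gradient-descent steps on -log(1 - q_ik) for m random negatives k.
for k in rng.choice(len(Y), size=m, replace=False):
    diff = Y[i] - Y[k]
    qik = q(diff)
    Y[i] += lr * 2 * qik ** 2 / (1 - qik + 1e-12) * diff
```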

Wasserstein t-SNE

2 code implementations • 16 May 2022 • Fynn Bachmann, Philipp Hennig, Dmitry Kobak

We use t-SNE to construct 2D embeddings of the units, based on the matrix of pairwise Wasserstein distances between them.
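A minimal sketch of the overall recipe, assuming hypothetical units each represented by a set of 1D samples (the paper handles multivariate units): compute pairwise Wasserstein distances with SciPy, then run t-SNE on the precomputed distance matrix.

```python
import numpy as np
from scipy.stats import wasserstein_distance
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# 50 hypothetical units, each a sample of 200 one-dimensional observations.
units = [rng.normal(loc=rng.uniform(-3, 3), size=200) for _ in range(50)]

n = len(units)
D = np.zeros((n, n))
for a in range(n):
    for b in range(a + 1, n):
        D[a, b] = D[b, a] = wasserstein_distance(units[a], units[b])

emb = TSNE(metric="precomputed", init="random", perplexity=15,
           random_state=0).fit_transform(D)
print(emb.shape)  # (50, 2) embedding of the units
```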

Tracking excess mortality across countries during the COVID-19 pandemic with the World Mortality Dataset

1 code implementation • eLife 2021 • Ariel Karlinsky, Dmitry Kobak

Comparing the impact of the COVID-19 pandemic between countries or across time is difficult because the reported numbers of cases and deaths can be strongly affected by testing capacity and reporting policy.

Scaling Down Deep Learning with MNIST-1D

1 code implementation • 29 Nov 2020 • Sam Greydanus, Dmitry Kobak

Although deep learning models have taken on commercial and political relevance, key aspects of their training and operation remain poorly understood.

Attraction-Repulsion Spectrum in Neighbor Embeddings

1 code implementation • 17 Jul 2020 • Jan Niklas Böhm, Philipp Berens, Dmitry Kobak

Neighbor embeddings are a family of methods for visualizing complex high-dimensional datasets using $k$NN graphs.
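A sketch of traversing the attraction-repulsion spectrum by scaling attractive forces via exaggeration; this assumes the openTSNE library, whose TSNE class, as far as I recall, accepts an exaggeration factor applied during the main optimization phase. The values below are illustrative.

```python
import numpy as np
from openTSNE import TSNE

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))   # toy data

for rho in [1, 4, 12]:            # rho = 1 ~ standard t-SNE; larger rho -> more attraction
    emb = TSNE(exaggeration=rho, random_state=0).fit(X)
    print(rho, emb.shape)
```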

Sparse bottleneck neural networks for exploratory non-linear visualization of Patch-seq data

2 code implementations • 18 Jun 2020 • Yves Bernaerts, Philipp Berens, Dmitry Kobak

Patch-seq, a recently developed experimental technique, allows neuroscientists to obtain transcriptomic and electrophysiological information from the same neurons.

Heavy-tailed kernels reveal a finer cluster structure in t-SNE visualisations

2 code implementations • 15 Feb 2019 • Dmitry Kobak, George Linderman, Stefan Steinerberger, Yuval Kluger, Philipp Berens

T-distributed stochastic neighbour embedding (t-SNE) is a widely used data visualisation technique.
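A sketch of the heavy-tailed kernel family, under my reading of the paper's parametrization $q(d) \propto (1 + d^2/\alpha)^{-\alpha}$: $\alpha = 1$ recovers the standard Cauchy/t-SNE kernel, $\alpha \to \infty$ approaches a Gaussian, and $\alpha < 1$ gives heavier tails.

```python
import numpy as np

def heavy_tailed_kernel(d, alpha):
    """Heavy-tailed similarity kernel; alpha = 1 is the standard Cauchy kernel."""
    return 1.0 / (1.0 + d ** 2 / alpha) ** alpha

d = np.linspace(0, 5, 6)
for alpha in [0.5, 1.0, 100.0]:
    print(alpha, np.round(heavy_tailed_kernel(d, alpha), 4))
# At d = 3, alpha = 0.5 retains noticeably more similarity than alpha = 1,
# which is why heavier tails can pull finer sub-clusters apart in the embedding.
```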

Optimal ridge penalty for real-world high-dimensional data can be zero or negative due to the implicit ridge regularization

1 code implementation • 28 May 2018 • Dmitry Kobak, Jonathan Lomond, Benoit Sanchez

We use a spiked covariance model as an analytically tractable example and prove that the optimal ridge penalty in this case is negative when $n\ll p$.
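A numerical sketch of the setting (illustrative sizes and spike strength, not the paper's experiments): simulate spiked-covariance data with $n\ll p$ and scan the ridge penalty, including small negative values, using the dual form of the ridge solution, which remains well-defined for $n < p$ as long as $X X^\top + \lambda I_n$ stays invertible.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 2000
u = rng.normal(size=p)
u /= np.linalg.norm(u)            # spike direction
spike = 25.0                      # assumed spike strength

def sample(m):
    X = rng.normal(size=(m, p)) + np.sqrt(spike) * rng.normal(size=(m, 1)) * u
    y = X @ u + rng.normal(size=m)                  # response aligned with the spike
    return X, y

Xtr, ytr = sample(n)
Xte, yte = sample(2000)

for lam in [-5.0, -1.0, 0.0, 1.0, 10.0]:
    # Dual-form ridge solution: beta = X^T (X X^T + lambda I_n)^{-1} y
    a = np.linalg.solve(Xtr @ Xtr.T + lam * np.eye(n), ytr)
    beta = Xtr.T @ a
    print(f"lambda = {lam:6.1f}   test MSE = {np.mean((Xte @ beta - yte) ** 2):.3f}")
```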
