1 code implementation • 3 Feb 2024 • Junwoo Park, Daehoon Gwak, Jaegul Choo, Edward Choi
To this end, our contrastive loss incorporates the global autocorrelation of the whole time series, which facilitates constructing positive and negative pairs in a self-supervised manner.
Ranked #1 on Time Series Forecasting on ETTh1 (720) Univariate
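The snippet above describes selecting contrastive pairs from the series' global autocorrelation. A minimal illustrative sketch (not the paper's implementation; the function names and top-k selection rule are assumptions) could treat lags with high autocorrelation as offsets for positive pairs and low-autocorrelation lags as negatives:

```python
import numpy as np

def global_autocorrelation(x: np.ndarray, max_lag: int) -> np.ndarray:
    """Normalized autocorrelation of a 1-D series for lags 1..max_lag."""
    x = x - x.mean()
    denom = np.dot(x, x)
    return np.array([np.dot(x[:-k], x[k:]) / denom for k in range(1, max_lag + 1)])

def select_pair_lags(x: np.ndarray, max_lag: int, k: int = 3):
    """Hypothetical pairing rule: highest-autocorrelation lags yield positive
    pairs, lowest-autocorrelation lags yield negative pairs."""
    ac = global_autocorrelation(x, max_lag)
    order = np.argsort(ac)
    pos_lags = order[-k:] + 1   # +1 converts index back to lag
    neg_lags = order[:k] + 1
    return pos_lags, neg_lags
```

For a periodic series, the dominant period shows up among the positive-pair lags, which is the intuition behind using autocorrelation as a self-supervised pairing signal.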
no code implementations • 24 Oct 2021 • Jiyoung Lee, Wonjae Kim, Daehoon Gwak, Edward Choi
Periodic signals play an important role in daily life.
no code implementations • 18 Oct 2021 • Jeonghoon Park, Jimin Hong, Radhika Dua, Daehoon Gwak, Yixuan Li, Jaegul Choo, Edward Choi
Despite the impressive performance of deep networks in vision, language, and healthcare, their unpredictable behavior on samples from a distribution different from the training distribution causes severe problems in deployment.
no code implementations • 29 Sep 2021 • Daehoon Gwak, Gyubok Lee, Jaehoon Lee, Jaesik Choi, Jaegul Choo, Edward Choi
To address this, we introduce a new class of neural stochastic processes, Decoupled Kernel Neural Processes (DKNPs), which explicitly learn separate mean and kernel functions to directly model the covariance between output variables in a data-driven manner.
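The decoupled design above (a separate mean function and a separate kernel function producing a full covariance) can be sketched with random features standing in for learned networks. This is an illustrative toy, not the DKNP architecture; `feature_map`, `W`, `b`, and `w_mean` are assumed placeholders for learned components:

```python
import numpy as np

def feature_map(x: np.ndarray, W: np.ndarray, b: np.ndarray) -> np.ndarray:
    """Stand-in for a learned feature extractor (here a fixed random projection)."""
    return np.tanh(x @ W + b)

def predictive(x, W, b, w_mean, noise=1e-2):
    """Decoupled prediction: mean and covariance come from separate heads.

    mean(x)  = phi(x) @ w_mean          (mean function)
    cov(x)   = phi(x) phi(x)^T + noise  (kernel / covariance over outputs)
    """
    phi = feature_map(x, W, b)
    mean = phi @ w_mean
    cov = phi @ phi.T + noise * np.eye(len(x))
    return mean, cov
```

The point of the decoupling is that the covariance between any pair of query points is modeled explicitly, rather than being implied by a shared latent variable.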
1 code implementation • ICCV 2021 • Sanghun Jung, Jungsoo Lee, Daehoon Gwak, Sungha Choi, Jaegul Choo
However, the distributions of the max logits of the predicted classes differ significantly from one another, which degrades the performance of identifying unexpected objects in urban-scene segmentation.
Ranked #4 on Anomaly Detection on Lost and Found
no code implementations • 26 Nov 2020 • Jeonghoon Park, Kyungmin Jo, Daehoon Gwak, Jimin Hong, Jaegul Choo, Edward Choi
We evaluate the out-of-distribution (OOD) detection performance of self-supervised learning (SSL) techniques with a new evaluation framework.
Out-of-Distribution Detection +1
1 code implementation • 16 Oct 2020 • Daehoon Gwak, Gyuhyeon Sim, Michael Poli, Stefano Massaroli, Jaegul Choo, Edward Choi
By interpreting the forward dynamics of a neural network's latent representation as an ordinary differential equation, Neural Ordinary Differential Equations (Neural ODEs) emerged as an effective framework for modeling system dynamics in the continuous time domain.
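The Neural ODE view above replaces discrete layers with a vector field f(z, t) whose integral gives the output state. A minimal sketch of that idea with a fixed-step Euler solver (real Neural ODE implementations use adaptive solvers and a learned f; this toy uses a hand-written dynamics function):

```python
import numpy as np

def odeint_euler(f, z0, t0: float, t1: float, steps: int = 100):
    """Integrate dz/dt = f(z, t) from t0 to t1 with fixed-step Euler.

    In a Neural ODE, f would be a neural network and the solver's output
    z(t1) plays the role of the final hidden representation.
    """
    z = np.asarray(z0, dtype=float)
    t, h = t0, (t1 - t0) / steps
    for _ in range(steps):
        z = z + h * f(z, t)  # one explicit Euler step
        t += h
    return z
```

With linear dynamics f(z, t) = z, integrating from 0 to 1 approaches e, matching the closed-form solution z(t) = z0 * exp(t).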