no code implementations • 8 May 2024 • Seoyoung Hong, Jeongwhan Choi, Yeon-Chang Lee, Srijan Kumar, Noseong Park
However, existing methods still have room to improve the trade-offs among accuracy, efficiency, and robustness.
no code implementations • 1 May 2024 • Chaejeong Lee, Jeongwhan Choi, Hyowon Wi, Sung-Bae Cho, Noseong Park
In this paper, we propose a novel Stochastic sampling for i) COntrastive views and ii) hard NEgative samples (SCONE) to overcome these issues.
no code implementations • 20 Feb 2024 • Jinsung Jeon, Hyundong Jin, Jonghyun Choi, Sanghyun Hong, Dongeun Lee, Kookjin Lee, Noseong Park
Evaluating extensively on seven image recognition benchmarks, we show that the proposed PAC-FNO improves the performance of existing baseline models on images with various resolutions by up to 77.1% and on various types of natural variations in the images at inference.
no code implementations • 27 Dec 2023 • Jeongwhan Choi, Hyowon Wi, Chaejeong Lee, Sung-Bae Cho, Dongha Lee, Noseong Park
Contrastive learning (CL) has emerged as a promising technique for improving recommender systems, addressing the challenge of data sparsity by leveraging self-supervised signals from raw data.
no code implementations • 27 Dec 2023 • Hyowon Wi, Yehjin Shin, Noseong Park
However, designing an imputation method based on continuous-time recurrent neural networks (RNNs), i.e., neural controlled differential equations (NCDEs), has long been overlooked.
1 code implementation • 19 Dec 2023 • Youn-Yeol Yu, Jeongwhan Choi, Woojin Cho, Kookjin Lee, Nayong Kim, Kiseok Chang, Chang-Seung Woo, Ilho Kim, Seok-Woo Lee, Joon-Young Yang, Sooyoung Yoon, Noseong Park
These methods are typically designed to i) reduce the computational cost in solving physical dynamics and/or ii) propose techniques to enhance the solution accuracy in fluid and rigid body dynamics.
no code implementations • 16 Dec 2023 • Woojin Cho, Seunghyeon Cho, Hyundong Jin, Jinsung Jeon, Kookjin Lee, Sanghyun Hong, Dongeun Lee, Jonghyun Choi, Noseong Park
Neural ordinary differential equations (NODEs), one of the most influential works in differential equation-based deep learning, continuously generalize residual networks and have opened a new field.
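The residual-network connection can be illustrated numerically: a residual block h ← h + f(h) is exactly one explicit-Euler step of dh/dt = f(h) with unit step size. Below is a minimal sketch of that equivalence, with a toy hand-written f standing in for a trained network (the function and step sizes are illustrative assumptions, not the paper's model):

```python
import numpy as np

# Hypothetical layer function f; in a real NODE this is a trained network.
def f(h):
    return np.tanh(h) * 0.1

def resnet_forward(h, n_layers):
    # Residual blocks: h <- h + f(h), i.e. Euler steps with dt = 1.
    for _ in range(n_layers):
        h = h + f(h)
    return h

def node_forward(h, t_end, dt):
    # Explicit Euler integration of dh/dt = f(h) from t = 0 to t_end.
    t = 0.0
    while t < t_end:
        h = h + dt * f(h)
        t += dt
    return h

h0 = np.array([1.0, -0.5])
# With dt = 1 the Euler solve reproduces the residual network exactly.
print(np.allclose(resnet_forward(h0, 4), node_forward(h0, 4.0, 1.0)))  # True
```

Shrinking dt below 1 (with t_end fixed) is what "continuously generalizes" the residual network: the discrete layer index becomes a continuous time variable.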
2 code implementations • 16 Dec 2023 • Yehjin Shin, Jeongwhan Choi, Hyowon Wi, Noseong Park
We show, for the first time, that the same problem occurs in the sequential recommendation (SR) domain.
Ranked #1 on Sequential Recommendation on MovieLens 1M
no code implementations • 12 Dec 2023 • Jayoung Kim, Yehjin Shin, Jeongwhan Choi, Hyowon Wi, Noseong Park
Structured data, which constitutes a significant portion of existing data types, has been a long-standing research topic in the field of machine learning.
no code implementations • 7 Dec 2023 • Jeongwhan Choi, Hyowon Wi, Jayoung Kim, Yehjin Shin, Kookjin Lee, Nathaniel Trask, Noseong Park
Transformers, renowned for their self-attention mechanism, have achieved state-of-the-art performance across various tasks in natural language processing, computer vision, time-series modeling, etc.
no code implementations • 8 Nov 2023 • Seonkyu Lim, Jaehyeon Park, Seojin Kim, Hyowon Wi, Haksoo Lim, Jinsung Jeon, Jeongwhan Choi, Noseong Park
Long-term time series forecasting (LTSF) is a challenging task that has been investigated in various domains such as finance investment, health care, traffic, and weather forecasting.
no code implementations • 29 Aug 2023 • Haksoo Lim, Sewon Park, Minjung Kim, Jaehoon Lee, Seonkyu Lim, Noseong Park
Time-series anomaly detection is one of the most fundamental time-series tasks.
1 code implementation • 27 Jun 2023 • Sheo Yon Jhin, Jaehoon Lee, Noseong Park
Unlike conventional anomaly detection, which focuses on determining whether a given time series observation is an anomaly or not, PoA detection aims to detect future anomalies before they happen.
1 code implementation • 9 May 2023 • Minju Jo, Seungji Kook, Noseong Park
However, existing neural network-based Hawkes process models not only i) fail to capture such complicated irregular dynamics, but also ii) resort to heuristics to calculate the log-likelihood of events since they are mostly based on neural networks designed for regular discrete inputs.
1 code implementation • 25 Apr 2023 • Chaejeong Lee, Jayoung Kim, Noseong Park
With growing attention to tabular data these days, attempts to apply synthetic tables have expanded toward various tasks and scenarios.
2 code implementations • 20 Mar 2023 • Jeongwhan Choi, Noseong Park
A prevalent approach in the field is to combine graph convolutional networks and recurrent neural networks for the spatio-temporal processing.
Ranked #2 on Traffic Prediction on PeMSD7(L)
no code implementations • 23 Jan 2023 • Deokki Hong, Kanghyun Choi, Hye Yoon Lee, Joonsang Yu, Noseong Park, Youngsok Kim, Jinho Lee
Co-exploration of an optimal neural architecture and its hardware accelerator is an approach of rising interest which addresses the computational cost problem, especially in low-profile systems.
no code implementations • 20 Jan 2023 • Haksoo Lim, Minjung Kim, Sewon Park, Noseong Park
We propose a conditional score network for the time-series generation domain.
no code implementations • 11 Jan 2023 • Sheo Yon Jhin, Minju Jo, Seungji Kook, Noseong Park, Sungpil Woo, Sunhwan Lim
Neural controlled differential equations (NCDEs), which are continuous analogues to recurrent neural networks (RNNs), are specialized models for (irregular) time-series processing.
1 code implementation • 25 Nov 2022 • Jeongwhan Choi, Seoyoung Hong, Noseong Park, Sung-Bae Cho
In particular, diffusion equations have been widely used for designing the core processing layer of GNNs, and therefore they are inevitably vulnerable to the notorious oversmoothing problem.
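The oversmoothing problem is easy to reproduce in a toy setting: repeatedly applying a normalized adjacency matrix, the discrete analogue of graph diffusion, drives all node features toward the same value. The sketch below is purely illustrative (the graph and features are made up, not the paper's method):

```python
import numpy as np

# Tiny 4-node undirected graph, given as an adjacency matrix.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                          # add self-loops
P = A_hat / A_hat.sum(axis=1, keepdims=True)   # row-normalized diffusion operator

X = np.array([[1.0], [0.0], [-1.0], [2.0]])    # initial node features
for _ in range(50):
    X = P @ X                                  # one diffusion step per "layer"

# After many diffusion steps the node features are nearly indistinguishable.
print(float(X.max() - X.min()))  # ~0
```

Each matrix multiplication plays the role of one graph-convolution layer; stacking many of them collapses the feature spread, which is exactly why deep diffusion-based GNNs need an oversmoothing remedy.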
no code implementations • 22 Nov 2022 • Jaehoon Lee, Chan Kim, Gyumin Lee, Haksoo Lim, Jeongwhan Choi, Kookjin Lee, Dongeun Lee, Sanghyun Hong, Noseong Park
Forecasting future outcomes from recent time series data is not easy, especially when the future data are different from the past (i.e., the time series is under temporal drift).
1 code implementation • 17 Nov 2022 • Jeongwhan Choi, Seoyoung Hong, Noseong Park, Sung-Bae Cho
Various methods have been proposed for collaborative filtering, ranging from matrix factorization to graph convolutional methods.
Ranked #1 on Collaborative Filtering on Gowalla
no code implementations • 8 Nov 2022 • Seoyoung Hong, Minju Jo, Seungji Kook, Jaeeun Jung, Hyowon Wi, Noseong Park, Sung-Bae Cho
We present a time-series forecasting-based upgrade kit (TimeKit), which works in the following way: it i) first decides a base collaborative filtering algorithm, ii) extracts user/item embedding vectors with the base algorithm from user-item interaction logs incrementally, e.g., every month, iii) trains our time-series forecasting model with the extracted time-series of embedding vectors, and then iv) forecasts the future embedding vectors and recommends with their dot-product scores, owing to a recent breakthrough in processing complicated time-series data, i.e., neural controlled differential equations (NCDEs).
no code implementations • 10 Oct 2022 • Fan Wu, Sanghyun Hong, Donsub Rim, Noseong Park, Kookjin Lee
However, parameterization of dynamics using a neural network makes it difficult for humans to identify causal structures in the data.
1 code implementation • 8 Oct 2022 • Jayoung Kim, Chaejeong Lee, Noseong Park
Our proposed training strategy includes a self-paced learning technique and a fine-tuning strategy, which further increases the sampling quality and diversity by stabilizing the denoising score matching training.
no code implementations • 5 Oct 2022 • Jinsung Jeon, Jeonghak Kim, Haryong Song, Seunghyeon Cho, Noseong Park
Time series synthesis is an important research topic in the field of deep learning, which can be used for data augmentation.
1 code implementation • 3 Sep 2022 • Taeri Kim, Noseong Park, Jiwon Hong, Sang-Wook Kim
Many cyberattacks start with disseminating phishing URLs.
2 code implementations • 30 Aug 2022 • Seoyoung Hong, Heejoo Shin, Jeongwhan Choi, Noseong Park
Owing to the continuous and bijective characteristics of NODEs, in addition, we design a one-shot price optimization method given a pre-trained prediction model, which requires only one iteration to find the optimal solution.
1 code implementation • 17 Aug 2022 • Jihyeon Hyeong, Jayoung Kim, Noseong Park, Sushil Jajodia
Tabular data typically contains private and important information; thus, precautions must be taken before they are shared with others.
1 code implementation • 13 Jul 2022 • Suneghyeon Cho, Sanghyun Hong, Kookjin Lee, Noseong Park
In this work, we propose adaptive momentum estimation neural ODEs (AdamNODEs) that adaptively control the acceleration of the classical momentum-based approach.
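The momentum idea can be sketched by augmenting the ODE state x with a velocity m, so that the vector field drives the velocity rather than the state directly. The following explicit-Euler toy is the generic heavy-ball-style formulation, not AdamNODEs' actual adaptive-moment update; f, the damping coefficient, and the step size are all illustrative assumptions:

```python
import numpy as np

def f(x):
    # Hypothetical dynamics; in a real momentum NODE this is a trained network.
    return -x

def momentum_ode_step(x, m, dt, gamma=0.9):
    # Coupled system:  dx/dt = m,   dm/dt = f(x) - (1 - gamma) * m
    # (one explicit-Euler step; gamma controls how slowly momentum decays).
    x_new = x + dt * m
    m_new = m + dt * (f(x) - (1 - gamma) * m)
    return x_new, m_new

x, m = np.array([1.0]), np.array([0.0])
for _ in range(100):              # integrate to t = 1 with dt = 0.01
    x, m = momentum_ode_step(x, m, 0.01)
print(float(x[0]))
```

With f(x) = -x this behaves like a damped oscillator: the velocity term lets the state overshoot and accelerate, which is the acceleration effect momentum-based NODEs aim to control.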
no code implementations • 29 Jun 2022 • Jinsung Jeon, Noseong Park
Score-based generative models (SGMs) show the state-of-the-art sampling quality and diversity.
1 code implementation • 17 Jun 2022 • Jayoung Kim, Chaejeong Lee, Yehjin Shin, Sewon Park, Minjung Kim, Noseong Park, Jihoon Cho
To our knowledge, we are the first to present a score-based tabular data oversampling method.
1 code implementation • ICLR 2022 • Jaehoon Lee, Jinsung Jeon, Sheo Yon Jhin, Jihyeon Hyeong, Jayoung Kim, Minju Jo, Kook Seungji, Noseong Park
The problem of processing very long time-series data (e.g., a length of more than 10,000) is a long-standing research problem in machine learning.
no code implementations • 19 Apr 2022 • Sheo Yon Jhin, Jaehoon Lee, Minju Jo, Seungji Kook, Jinsung Jeon, Jihyeon Hyeong, Jayoung Kim, Noseong Park
Deep learning inspired by differential equations is a recent research trend and has achieved state-of-the-art performance for many machine learning tasks.
1 code implementation • CVPR 2022 • Kanghyun Choi, Hye Yoon Lee, Deokki Hong, Joonsang Yu, Noseong Park, Youngsok Kim, Jinho Lee
To deal with the performance drop induced by quantization errors, a popular method is to use training data to fine-tune quantized networks.
no code implementations • 8 Feb 2022 • Jaehoon Lee, Jihyeon Hyeong, Jinsung Jeon, Noseong Park, Jihoon Cho
First, we can further improve the synthesis quality, by decreasing the negative log-density of real records in the process of adversarial training.
1 code implementation • 7 Dec 2021 • Jeongwhan Choi, Hwangyong Choi, Jeehyun Hwang, Noseong Park
A prevalent approach in the field is to combine graph convolutional networks and recurrent neural networks for the spatio-temporal processing.
Ranked #3 on Traffic Prediction on PeMSD7(L)
1 code implementation • NeurIPS 2021 • Jaehoon Lee, Jihyeon Hyeong, Jinsung Jeon, Noseong Park, Jihoon Cho
First, we can further improve the synthesis quality, by decreasing the negative log-density of real records in the process of adversarial training.
2 code implementations • 14 Nov 2021 • Taeyong Kong, Taeri Kim, Jinsung Jeon, Jeongwhan Choi, Yeon-Chang Lee, Noseong Park, Sang-Wook Kim
To our knowledge, we are the first to design a hybrid method and report the correlation between the graph centrality and the linearity/non-linearity of nodes.
2 code implementations • 11 Nov 2021 • Jeehyun Hwang, Jeongwhan Choi, Hwangyong Choi, Kookjin Lee, Dongeun Lee, Noseong Park
On the other hand, neural ordinary differential equations (NODEs) learn a latent governing ODE from data.
2 code implementations • NeurIPS 2021 • Kanghyun Choi, Deokki Hong, Noseong Park, Youngsok Kim, Jinho Lee
We find that this is often insufficient to capture the distribution of the original data, especially around the decision boundaries.
Ranked #1 on Data Free Quantization on CIFAR-100
no code implementations • 29 Sep 2021 • Jinsung Jeon, Jeonghak Kim, Haryong Song, Noseong Park
In this paper, we solve the problem of synthesizing irregular and intermittent time-series where values can be missing and may not have specific frequencies, which is far more challenging than existing settings.
no code implementations • 29 Sep 2021 • Jungeun Kim, Seunghyun Hwang, Jeehyun Hwang, Kookjin Lee, Dongeun Lee, Noseong Park
In other words, the knowledge contained by the learned governing equation can be injected into the neural network which approximates the PDE solution function.
no code implementations • 29 Sep 2021 • Deokki Hong, Kanghyun Choi, Hye Yoon Lee, Joonsang Yu, Youngsok Kim, Noseong Park, Jinho Lee
To handle the hard constraint problem of differentiable co-exploration, we propose ConCoDE, which searches for hard-constrained solutions without compromising the global design objectives.
1 code implementation • 4 Sep 2021 • Sheo Yon Jhin, Heejoo Shin, Seoyoung Hong, Solhee Park, Noseong Park
Neural networks inspired by differential equations have proliferated for the past several years.
1 code implementation • 11 Aug 2021 • Jinsung Jeon, Soyoung Kang, Minju Jo, Seunghyeon Cho, Noseong Park, Seonghoon Kim, Chiyoung Song
Among various such mobile billboards, taxicab rooftop devices are emerging in the market as a brand-new medium.
2 code implementations • 8 Aug 2021 • Jeongwhan Choi, Jinsung Jeon, Noseong Park
In this work, we extend them based on neural ordinary differential equations (NODEs), because the linear GCN concept can be interpreted as a differential equation, and present the method of Learnable-Time ODE-based Collaborative Filtering (LT-OCF).
Ranked #1 on Recommendation Systems on Amazon-book
no code implementations • 31 May 2021 • Duanshun Li, Jing Liu, Jinsung Jeon, Seoyoung Hong, Thai Le, Dongwon Lee, Noseong Park
On top of the prediction models, we define a budget-constrained flight frequency optimization problem to maximize the market influence over 2,262 routes.
1 code implementation • 31 May 2021 • Sheo Yon Jhin, Minju Jo, Taeyong Kong, Jinsung Jeon, Noseong Park
Neural ordinary differential equations (NODEs) presented a new paradigm to construct (continuous-time) neural networks.
1 code implementation • 31 May 2021 • Jayoung Kim, Jinsung Jeon, Jaehoon Lee, Jihyeon Hyeong, Noseong Park
Synthesizing tabular data is attracting much attention these days for various purposes.
no code implementations • 1 Jan 2021 • Jungeun Kim, Seunghyun Hwang, Jeehyun Hwang, Kookjin Lee, Dongeun Lee, Noseong Park
Neural ordinary differential equations (neural ODEs) introduced an approach to approximate a neural network as a system of ODEs after considering its layer index as a continuous variable and discretizing its hidden dimension.
no code implementations • 1 Jan 2021 • Soyoung Kang, Ganghyeon Park, Kwang-Sung Jun, Noseong Park
Because it is not the case that every input requires the advanced integrator, we design an auxiliary neural network to choose an appropriate integrator given input to decrease the overall inference time without significantly sacrificing accuracy.
1 code implementation • 4 Dec 2020 • Jungeun Kim, Kookjin Lee, Dongeun Lee, Sheo Yon Jhin, Noseong Park
We present a method for learning dynamics of complex physical processes described by time-dependent nonlinear partial differential equations (PDEs).
no code implementations • ACL 2021 • Thai Le, Noseong Park, Dongwon Lee
The Universal Trigger (UniTrigger) is a recently proposed, powerful adversarial textual attack method.
1 code implementation • ACL 2022 • Thai Le, Noseong Park, Dongwon Lee
Even though several methods have been proposed to defend textual neural network (NN) models against black-box adversarial attacks, they often defend against a specific text perturbation strategy and/or require re-training the models from scratch.
1 code implementation • 10 Nov 2020 • Manh Tuan Do, Noseong Park, Kijung Shin
By adapting five GNN models to our method, we demonstrate consistent improvements in accuracy and in the utilization of each GNN's allocated capacity over each model's original training method, by up to 5.4 percentage points across 12 datasets.
1 code implementation • 24 Jan 2020 • Jihoon Ko, Kyuhan Lee, Kijung Shin, Noseong Park
In this work, we present an inductive machine learning method, called Monte Carlo Simulator (MONSTOR), for estimating the influence of given seed nodes in social networks unseen during training.
no code implementations • 11 Jun 2019 • Duanshun Li, Jing Liu, Noseong Park, Dongeun Lee, Giridhar Ramachandran, Ali Seyedmazloom, Kookjin Lee, Chen Feng, Vadim Sokolov, Rajesh Ganesan
0-1 knapsack is of fundamental importance in computer science, business, operations research, etc.
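As background, 0-1 knapsack admits the textbook dynamic-programming solution in O(n · capacity) time; the sketch below is that classic algorithm (a reference baseline, not the paper's learning-based method):

```python
def knapsack_01(values, weights, capacity):
    """Classic 0-1 knapsack via dynamic programming.

    dp[c] holds the best total value achievable with capacity c using the
    items considered so far; iterating capacities downward ensures each
    item is used at most once (hence "0-1").
    """
    dp = [0] * (capacity + 1)
    for v, w in zip(values, weights):
        for c in range(capacity, w - 1, -1):
            dp[c] = max(dp[c], dp[c - w] + v)
    return dp[capacity]

# Items: (value, weight) = (3,2), (4,3), (5,4); capacity 5.
# Best choice is the first two items: weight 2+3 = 5, value 3+4 = 7.
print(knapsack_01([3, 4, 5], [2, 3, 4], 5))  # 7
```

The pseudo-polynomial dependence on `capacity` is what makes exact DP impractical at large scale and motivates approximate, learned solvers.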
no code implementations • 9 Jun 2018 • Noseong Park, Mahmoud Mohammadi, Kshitij Gorde, Sushil Jajodia, Hongkyu Park, Youngmin Kim
We call this property model compatibility.
Databases; Cryptography and Security; H.3.4; I.2; K.6.5
no code implementations • 7 May 2018 • David Keetae Park, Seungjoo Yoo, Hyojin Bahng, Jaegul Choo, Noseong Park
Recently, generative adversarial networks (GANs) have shown promising performance in generating realistic images.
no code implementations • 26 Jul 2017 • Noseong Park, Ankesh Anand, Joel Ruben Antony Moniz, Kookjin Lee, Tanmoy Chakraborty, Jaegul Choo, Hongkyu Park, Young-Min Kim
MMGAN finds two manifolds representing the vector representations of real and fake images.
2 code implementations • 5 Dec 2016 • Ankesh Anand, Tanmoy Chakraborty, Noseong Park
Online content publishers often use catchy headlines for their articles in order to attract users to their websites.