no code implementations • 7 May 2024 • Abhijit Bendale, Michael Sapienza, Steven Ripplinger, Simon Gibbs, Jaewon Lee, Pranav Mistry
In this paper, we introduce SUTRA, a multilingual Large Language Model architecture capable of understanding, reasoning, and generating text in over 50 languages.
1 code implementation • 4 Sep 2023 • Minsu Kim, Jaewon Lee, Byeonghun Lee, Sunghoon Im, Kyong Hwan Jin
Existing frameworks for image stitching often produce visually reasonable stitching results.
1 code implementation • ICCV 2023 • Sihyeon Kim, Minseok Joo, Jaewon Lee, Juyeon Ko, Juhan Cha, Hyunwoo J. Kim
In this paper, we highlight the importance of part deformation consistency and propose a semantic-aware implicit template learning framework to enable semantically plausible deformation.
no code implementations • 2 Jun 2023 • Chonghyo Joo, Jeongdong Kim, Hyungtae Cho, Jaewon Lee, Sungho Suh, Junghwan Kim
In this paper, we propose a neural network framework that utilizes chemical property information to improve the performance of naphtha composition prediction.
no code implementations • 5 Mar 2023 • Jaewon Lee, Injae Kim, Hwan Heo, Hyunwoo J. Kim
We present a learning framework for reconstructing neural scene representations from a small number of unconstrained tourist photos.
no code implementations • 3 Feb 2023 • Hwan Heo, Taekyung Kim, Jiyoung Lee, Jaewon Lee, Soohyun Kim, Hyunwoo J. Kim, Jin-Hwa Kim
Multi-resolution hash encoding has recently been proposed to reduce the computational cost of neural renderings, such as NeRF.
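The idea behind multi-resolution hash encoding can be illustrated with a minimal sketch: each resolution level maps a query point to a grid cell, hashes the cell's integer coordinates into a small learnable table, and the per-level features are concatenated. This is a simplified illustration (nearest-cell lookup only, no trilinear interpolation, illustrative hash primes), not the paper's implementation:

```python
import numpy as np

def multires_hash_encode(x, tables, resolutions):
    """Encode 2-D points in [0, 1)^2 with a multi-resolution hash grid.

    x: (N, 2) query coordinates; tables: list of (T, F) feature tables;
    resolutions: grid resolution per level. Returns (N, F * levels) features.
    """
    feats = []
    for table, res in zip(tables, resolutions):
        # nearest grid cell at this resolution (interpolation omitted for brevity)
        g = np.floor(x * res).astype(np.uint64)
        # spatial hash of integer cell coordinates into the table
        idx = (g[..., 0] ^ (g[..., 1] * np.uint64(2654435761))) % len(table)
        feats.append(table[idx])
    return np.concatenate(feats, axis=-1)
```

The hash tables would be trained jointly with a small MLP decoder; coarse levels capture global structure while fine levels resolve detail.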
no code implementations • 2 Feb 2023 • Hwan Heo, Youngjin Oh, Jaewon Lee, Hyunwoo J. Kim
Recent studies have proven that DNNs, unlike human vision, tend to exploit texture information rather than shape.
1 code implementation • CVPR 2023 • Byeonghyun Pak, Jaewon Lee, Kyong Hwan Jin
Our network outperforms both a transformer-based reconstruction method and an implicit Fourier representation method at almost every upscaling factor, thanks to the positive constraint and compact support of the B-spline basis.
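The two properties cited above are intrinsic to the B-spline kernel itself and easy to verify numerically. A minimal sketch of the standard cubic B-spline basis (not the paper's network, just the underlying kernel): it is nonnegative everywhere, vanishes outside |t| < 2, and its integer shifts form a partition of unity:

```python
import numpy as np

def cubic_bspline(t):
    """Cubic B-spline kernel: nonnegative, compactly supported on |t| < 2."""
    t = np.abs(np.asarray(t, dtype=float))
    out = np.zeros_like(t)
    near = t < 1
    far = (t >= 1) & (t < 2)
    out[near] = (4 - 6 * t[near] ** 2 + 3 * t[near] ** 3) / 6
    out[far] = (2 - t[far]) ** 3 / 6
    return out
```

The positive constraint means every reconstructed value is a convex-like combination of nearby coefficients, and compact support keeps each query dependent on only a few neighbors.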
1 code implementation • 5 Jul 2022 • Jaewon Lee, Kwang Pyo Choi, Kyong Hwan Jin
In this paper, we propose a local texture estimator for image warping (LTEW) followed by an implicit neural representation to deform images into continuous shapes.
no code implementations • 19 Jan 2022 • Zhongyi Lin, Louis Feng, Ehsan K. Ardestani, Jaewon Lee, John Lundell, Changkyu Kim, Arun Kejariwal, John D. Owens
We show that our general performance model not only achieves low prediction error on DLRM, which has highly customized configurations and is dominated by multiple factors, but also yields comparable accuracy on other compute-bound ML models targeted by most previous methods.
1 code implementation • CVPR 2022 • Jaewon Lee, Kyong Hwan Jin
Recent works with an implicit neural function shed light on representing images in arbitrary resolution.
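The core query pattern behind such arbitrary-resolution implicit representations can be sketched as follows: a continuous coordinate is mapped to the nearest latent code in an encoder's feature grid, and an MLP decodes the code together with the coordinate's offset from that cell's center into an RGB value. This is a simplified, hypothetical illustration (nearest-cell lookup, a stand-in `mlp`), not the paper's architecture:

```python
import numpy as np

def query_rgb(feat_grid, mlp, xy):
    """Query an implicit image at continuous coordinates xy in [0, 1)^2.

    feat_grid: (H, W, C) latent codes from an encoder; mlp: decoder mapping
    (C + 2)-dim vectors to RGB; xy: (N, 2) as (x, y). Returns (N, 3).
    """
    H, W, _ = feat_grid.shape
    # nearest latent code for each query point
    i = np.clip((xy[:, 1] * H).astype(int), 0, H - 1)
    j = np.clip((xy[:, 0] * W).astype(int), 0, W - 1)
    # relative coordinate from the cell center, in (y, x) order
    cell = np.stack([(i + 0.5) / H, (j + 0.5) / W], axis=-1)
    rel = xy[:, ::-1] - cell
    z = np.concatenate([feat_grid[i, j], rel], axis=-1)
    return mlp(z)
```

Because any coordinate can be queried, the same trained representation renders the image at any target resolution.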
Ranked #6 on Image Super-Resolution on Set5 - 3x upscaling
1 code implementation • ICCV 2021 • Sihyeon Kim, Sanghyeok Lee, Dasol Hwang, Jaewon Lee, Seong Jae Hwang, Hyunwoo J. Kim
Although data augmentation is a standard approach to compensate for the scarcity of data, it has been less explored in the point cloud literature.
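For context, the conventional point-cloud augmentations the snippet alludes to are quite simple; a minimal baseline sketch (a standard rotation-plus-jitter recipe, not the method proposed in this paper) looks like:

```python
import numpy as np

def augment_points(pts, rng, sigma=0.01, max_angle=np.pi):
    """Baseline point-cloud augmentation: random z-axis rotation + Gaussian jitter.

    pts: (N, 3) point cloud. Returns an augmented copy of the same shape.
    """
    theta = rng.uniform(-max_angle, max_angle)
    c, s = np.cos(theta), np.sin(theta)
    # rotation about the vertical (z) axis
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return pts @ R.T + rng.normal(0.0, sigma, pts.shape)
```

Such global transforms leave local geometry untouched, which is part of why richer, structure-aware augmentation remains an open direction for point clouds.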
Ranked #11 on Point Cloud Classification on PointCloud-C
no code implementations • 8 Apr 2021 • Seo Taek Kong, Soomin Jeon, Dongbin Na, Jaewon Lee, Hong-Seok Lee, Kyu-Hwan Jung
Although unlabeled data is readily available in pool-based AL, AL algorithms are usually evaluated by measuring the increase in supervised learning (SL) performance at consecutive acquisition steps.
no code implementations • 1 Jan 2021 • Seo Taek Kong, Soomin Jeon, Jaewon Lee, Hong-Seok Lee, Kyu-Hwan Jung
We name this AL scheme convergence rate control (CRC), and our experiments show that a deep neural network trained using a combination of CRC and a recently proposed SSL algorithm can quickly achieve high performance using far fewer labeled samples than SL.
no code implementations • 23 Jul 2020 • Kyungmin Lee, Chiyoun Park, Ilhwan Kim, Namhoon Kim, Jaewon Lee
Recurrent Neural Network Language Models (RNNLMs) are increasingly used in various areas of speech recognition owing to their outstanding performance.
no code implementations • 30 Jan 2018 • Kyungmin Lee, Chiyoun Park, Namhoon Kim, Jaewon Lee
This paper presents methods to accelerate recurrent neural network based language models (RNNLMs) for online speech recognition systems.