no code implementations • 29 Apr 2024 • Donggyun Kim, Seongwoong Cho, Semin Kim, Chong Luo, Seunghoon Hong
In this study, we explore a universal model that can flexibly adapt to unseen dense label structures with a few examples, enabling it to serve as a data-efficient vision generalist in diverse real-world scenarios.
1 code implementation • 27 Mar 2023 • Donggyun Kim, Jinwoo Kim, Seongwoong Cho, Chong Luo, Seunghoon Hong
We propose Visual Token Matching (VTM), a universal few-shot learner for arbitrary dense prediction tasks.
1 code implementation • 28 Oct 2021 • Donggyun Kim, Seongwoong Cho, Wonkwang Lee, Seunghoon Hong
To this end, we propose Multi-Task Neural Processes (MTNPs), an extension of NPs designed to jointly infer tasks realized from multiple stochastic processes.
no code implementations • ICLR 2022 • Donggyun Kim, Seongwoong Cho, Wonkwang Lee, Seunghoon Hong
Neural Processes (NPs) consider a task as a function realized from a stochastic process and flexibly adapt to unseen tasks through inference on functions.
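The NP view described above — a task as a function inferred from a small context set — can be sketched as a minimal conditional-NP-style forward pass: an encoder maps context (x, y) pairs to a permutation-invariant representation via mean pooling, and a decoder conditions target predictions on it. All network sizes, names, and the untrained random weights below are illustrative assumptions, not the papers' actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(d_in, d_hidden, d_out):
    # small random-weight MLP parameters (illustrative, untrained)
    return (rng.normal(0, 0.1, (d_in, d_hidden)), np.zeros(d_hidden),
            rng.normal(0, 0.1, (d_hidden, d_out)), np.zeros(d_out))

def mlp(params, x):
    # two-layer MLP with tanh hidden activation
    w1, b1, w2, b2 = params
    return np.tanh(x @ w1 + b1) @ w2 + b2

enc = init_mlp(2, 32, 16)       # encodes each (x, y) context pair
dec = init_mlp(1 + 16, 32, 1)   # decodes target x plus task representation

def np_predict(x_ctx, y_ctx, x_tgt):
    # Encode each context pair, then mean-pool: the pooled vector r is a
    # permutation-invariant summary of the observed function (the "task").
    r = mlp(enc, np.concatenate([x_ctx, y_ctx], axis=-1)).mean(axis=0)
    # Condition every target input on the same task representation r.
    r_tiled = np.tile(r, (x_tgt.shape[0], 1))
    return mlp(dec, np.concatenate([x_tgt, r_tiled], axis=-1))

# A 5-shot context sampled from one function realization (here, sin).
x_ctx = rng.uniform(-1, 1, (5, 1))
y_ctx = np.sin(x_ctx)
x_tgt = np.linspace(-1, 1, 20)[:, None]
y_pred = np_predict(x_ctx, y_ctx, x_tgt)
print(y_pred.shape)  # (20, 1): one prediction per target input
```

Adapting to an unseen task here requires no weight update — only swapping the context set, which is what makes the inference-on-functions view attractive for few-shot settings.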