no code implementations • 21 Mar 2024 • Jinyung Hong, Eun Som Jeon, Changhoon Kim, Keun Hee Park, Utkarsh Nath, Yezhou Yang, Pavan Turaga, Theodore P. Pavlic
Biased attributes, spuriously correlated with target labels in a dataset, can lead neural networks to learn improper shortcuts for classification, limiting their capability for out-of-distribution (OOD) generalization.
1 code implementation • 17 Nov 2023 • Jinyung Hong, Theodore P. Pavlic
Geometric Sensitive Hashing functions, a family of Locality-Sensitive Hashing functions, are neural network models that learn class-specific manifold geometry in supervised learning.
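For context, a minimal sketch of the classic random-hyperplane Locality-Sensitive Hashing family (the learned, class-specific Geometric Sensitive Hashing functions of the paper are not reproduced here; this only illustrates the underlying hashing idea, and all names below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def lsh_hash(x, planes):
    """Hash a vector to a bit tuple: one bit per random hyperplane side."""
    return tuple((planes @ x > 0).astype(int))

dim, n_bits = 8, 16
planes = rng.standard_normal((n_bits, dim))  # random hyperplane normals

x = rng.standard_normal(dim)
x_near = x + 0.01 * rng.standard_normal(dim)  # slightly perturbed copy
x_far = -x                                    # opposite direction

# Nearby vectors agree on most hash bits; distant vectors do not.
agree_near = sum(a == b for a, b in zip(lsh_hash(x, planes), lsh_hash(x_near, planes)))
agree_far = sum(a == b for a, b in zip(lsh_hash(x, planes), lsh_hash(x_far, planes)))
```

Vectors close in angle land in the same bucket with high probability, which is the property that makes such hash families useful for geometry-aware retrieval.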
2 code implementations • 25 May 2023 • Jinyung Hong, Keun Hee Park, Theodore P. Pavlic
Many interpretable AI approaches have been proposed to provide plausible explanations for a model's decision-making.
1 code implementation • 8 Apr 2022 • Jinyung Hong, Theodore P. Pavlic
Existing Continual Learning (CL) approaches have focused on addressing catastrophic forgetting by leveraging regularization methods, replay buffers, and task-specific components.
2 code implementations • AAAI Workshop CLeaR 2022 • Jinyung Hong, Theodore P. Pavlic
Furthermore, background knowledge represented by RWFNs can be used to alleviate the incompleteness of training sets, even though the space complexity of RWFNs is much smaller than that of LTNs (1:27 ratio).
3 code implementations • 11 Sep 2021 • Jinyung Hong, Theodore P. Pavlic
We demonstrate that, compared to LTNs, RWFNs can achieve better or similar performance for both object classification and detection of the part-of relations between objects in SII tasks, while using far fewer learnable parameters (1:62 ratio) and a faster learning process (1:2 running-speed ratio).
no code implementations • 17 Aug 2021 • Jinyung Hong, Theodore P. Pavlic
Fruit flies are established model systems for studying olfactory learning, as they readily learn to associate odors with either electric shock or sugar rewards.
no code implementations • 1 Jun 2020 • Jinyung Hong, Theodore P. Pavlic
Neural Tensor Networks (NTNs), which are structured to encode the degree of relationship among pairs of entities, are used in Logic Tensor Networks (LTNs) to facilitate Statistical Relational Learning (SRL) in first-order logic.
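A minimal sketch of the NTN scoring layer as formulated by Socher et al., which scores a pair of entity embeddings with a bilinear tensor plus a standard linear term (dimensions and variable names here are illustrative, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
d, k = 4, 3  # entity embedding size, number of tensor slices

W = rng.standard_normal((k, d, d))   # bilinear tensor: one d x d slice per hidden unit
V = rng.standard_normal((k, 2 * d))  # linear layer over the concatenated entity pair
b = rng.standard_normal(k)           # bias
u = rng.standard_normal(k)           # combines the k slice activations into one score

def ntn_score(e1, e2):
    """score(e1, e2) = u^T tanh(e1^T W[1:k] e2 + V [e1; e2] + b)."""
    bilinear = np.einsum('i,kij,j->k', e1, W, e2)  # e1^T W_k e2 for each slice k
    linear = V @ np.concatenate([e1, e2]) + b
    return float(u @ np.tanh(bilinear + linear))

e1, e2 = rng.standard_normal(d), rng.standard_normal(d)
score = ntn_score(e1, e2)  # scalar plausibility of the entity pair under one relation
```

The bilinear tensor is what lets the layer encode a graded degree of relationship between the two entities, which is the property LTNs exploit for statistical relational learning.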