no code implementations • 9 Nov 2020 • Amanda Rios, Laurent Itti
Supervised deep neural networks are known to undergo a sharp decline in accuracy on older tasks when new tasks are learned, a phenomenon termed "catastrophic forgetting".
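The effect is easy to reproduce. Below is a minimal sketch (invented synthetic data and hyperparameters, not the authors' code): a small MLP is trained on a task A, then on a conflicting task B, and its accuracy on task A collapses.

```python
# Minimal catastrophic-forgetting demo (illustrative only).
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_task(shift):
    # Synthetic binary task; the class boundary moves with `shift`,
    # so task A and task B require different input-output mappings.
    x = torch.randn(512, 20)
    y = (x[:, 0] + shift * x[:, 1] > 0).long()
    return x, y

def accuracy(model, x, y):
    with torch.no_grad():
        return (model(x).argmax(1) == y).float().mean().item()

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

task_a, task_b = make_task(shift=1.0), make_task(shift=-1.0)
for x, y in (task_a, task_b):          # sequential training, no replay
    for _ in range(200):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    print(f"task A accuracy: {accuracy(model, *task_a):.2f}")
# Accuracy on task A is high after the first loop and drops sharply
# after training on task B -- the forgetting described above.
```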
no code implementations • 27 Sep 2020 • Shixian Wen, Amanda Rios, Yunhao Ge, Laurent Itti
Continual learning of multiple tasks in artificial neural networks using gradient descent leads to catastrophic forgetting, whereby the previously learned mapping for an old task is erased as new mappings are learned for new tasks.
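The mechanism can be seen with a single shared parameter: plain gradient descent on the new task's loss simply pulls the weight away from the old task's optimum. The toy example below (both losses are invented quadratics, not the paper's model) makes this concrete.

```python
# Toy illustration: one shared weight w, two tasks whose losses have
# different minima. Gradient descent on task B's loss overwrites the
# solution found for task A.
w = 0.0
lr = 0.1

loss_a = lambda w: (w - 2.0) ** 2     # task A optimum at w = 2
grad_a = lambda w: 2 * (w - 2.0)
grad_b = lambda w: 2 * (w + 3.0)      # task B optimum at w = -3

for _ in range(100):                  # learn task A
    w -= lr * grad_a(w)
print(f"after task A: w={w:.2f}, task A loss={loss_a(w):.4f}")

for _ in range(100):                  # then learn task B
    w -= lr * grad_b(w)
print(f"after task B: w={w:.2f}, task A loss={loss_a(w):.4f}")
# w has moved to task B's optimum; the mapping for task A is erased.
```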
no code implementations • 27 Sep 2020 • Shixian Wen, Amanda Rios, Laurent Itti
Neural networks are vulnerable to adversarial examples because they fail to accommodate the distribution drift of the input data caused by adversarial perturbations.
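A concrete picture of that drift comes from the standard fast gradient sign method (FGSM); the sketch below is a generic illustration, not this paper's defense, and the tiny linear model and epsilon value are arbitrary.

```python
# Generic FGSM sketch: a perturbation eps * sign(grad_x loss) shifts
# inputs off the clean data distribution, which an unmodified network
# does not accommodate.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(20, 2))   # stand-in classifier
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(8, 20, requires_grad=True)
y = torch.randint(0, 2, (8,))

loss = loss_fn(model(x), y)
loss.backward()                           # populates x.grad

eps = 0.25
x_adv = x + eps * x.grad.sign()           # FGSM perturbation
print("clean preds:", model(x).argmax(1).tolist())
print("adv preds:  ", model(x_adv).argmax(1).tolist())
# The perturbed inputs x_adv follow a shifted distribution; flipped
# predictions show the network failing to accommodate that drift.
```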
no code implementations • 3 Nov 2018 • Amanda Rios, Laurent Itti
Sequential learning of tasks using gradient descent leads to an unremitting decline in accuracy on tasks whose training data is no longer available, a phenomenon termed catastrophic forgetting.
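A common way to quantify such a decline (a generic metric, not necessarily this paper's evaluation protocol; the accuracy numbers below are hypothetical) is the gap between the best accuracy ever reached on a task and its accuracy after all training ends.

```python
# Generic forgetting metric: acc[i][j] is accuracy on task j after
# sequentially training through task i (hypothetical numbers).
acc = [
    [0.95, 0.00, 0.00],   # after training task 0
    [0.61, 0.94, 0.00],   # after training task 1
    [0.40, 0.58, 0.93],   # after training task 2
]

n = len(acc)
# Forgetting for task j: best accuracy ever achieved minus final accuracy.
forgetting = [
    max(acc[i][j] for i in range(j, n)) - acc[n - 1][j]
    for j in range(n - 1)                  # last task cannot be forgotten yet
]
print(forgetting)   # approximately [0.55, 0.36]
```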