10 May 2020 • Alexis Asseman, Nicolas Antoine, Ahmet S. Ozcan
Recently, alternative approaches such as evolutionary strategies and deep neuroevolution have demonstrated competitive results with faster training times on distributed CPU cores.
2 Dec 2019 • J. Campbell Scott, Thomas F. Hayes, Ahmet S. Ozcan, Winfried W. Wilcke
Artificial neural networks have diverged far from their early inspiration in neurology.
27 Nov 2019 • T. S. Jayram, Vincent Marois, Tomasz Kornuta, Vincent Albouy, Emre Sevgen, Ahmet S. Ozcan
Transfer learning has become the de facto standard in computer vision and natural language processing, especially where labeled data is scarce.
28 May 2019 • Tomasz Kornuta, Deepta Rajan, Chaitanya Shivade, Alexis Asseman, Ahmet S. Ozcan
In this working-notes paper, we describe the IBM Research AI (Almaden) team's participation in the ImageCLEF 2019 VQA-Med competition.
15 Nov 2018 • Vincent Marois, T. S. Jayram, Vincent Albouy, Tomasz Kornuta, Younes Bouhadjar, Ahmet S. Ozcan
We introduce a variant of the MAC model (Hudson and Manning, ICLR 2018) with a simplified set of equations that achieves comparable accuracy, while training faster.
28 Sep 2018 • T. S. Jayram, Younes Bouhadjar, Ryan L. McAvoy, Tomasz Kornuta, Alexis Asseman, Kamil Rocki, Ahmet S. Ozcan
Typical neural networks with external memory do not effectively separate capacity for episodic and working memory, as is required for human-like reasoning.
28 Sep 2018 • T. S. Jayram, Tomasz Kornuta, Ryan L. McAvoy, Ahmet S. Ozcan
We propose a new architecture called Memory-Augmented Encoder-Solver (MAES) that enables transfer learning to solve complex working memory tasks adapted from cognitive psychology.