Search Results for author: Ahmet S. Ozcan

Found 7 papers, 0 papers with code

Accelerating Deep Neuroevolution on Distributed FPGAs for Reinforcement Learning Problems

no code implementations · 10 May 2020 · Alexis Asseman, Nicolas Antoine, Ahmet S. Ozcan

Recently, alternative approaches such as evolutionary strategies and deep neuroevolution have demonstrated competitive results with faster training times on distributed CPU cores.

Tasks: Atari Games · Computational Efficiency · +2
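No implementation accompanies this entry. As a rough illustration of the population-based search that deep neuroevolution builds on — not the paper's FPGA implementation — the sketch below evolves a parameter vector (a stand-in for network weights) toward a fixed target (a stand-in for maximizing episode return). All names and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(weights, target):
    """Toy stand-in for an RL episode return: closer to target is better."""
    return -np.sum((weights - target) ** 2)

def neuroevolution(target, pop_size=50, n_elite=10, sigma=0.1, generations=100):
    """Simple population-based search over parameter vectors.

    Each generation keeps the top `n_elite` individuals unchanged (elitism)
    and refills the population with Gaussian-perturbed copies of random elites.
    """
    dim = target.size
    population = rng.normal(0.0, 1.0, size=(pop_size, dim))
    for _ in range(generations):
        scores = np.array([fitness(ind, target) for ind in population])
        elite = population[np.argsort(scores)[-n_elite:]]
        parents = elite[rng.integers(0, n_elite, size=pop_size)]
        population = parents + rng.normal(0.0, sigma, size=(pop_size, dim))
        population[:n_elite] = elite  # keep the best individuals unmutated
    scores = np.array([fitness(ind, target) for ind in population])
    return population[np.argmax(scores)]

best = neuroevolution(np.array([0.5, -1.0, 2.0]))
```

Because each individual's fitness evaluation is independent, this loop parallelizes naturally — the property the paper exploits with distributed hardware.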

Simulation of neural function in an artificial Hebbian network

no code implementations · 2 Dec 2019 · J. Campbell Scott, Thomas F. Hayes, Ahmet S. Ozcan, Winfried W. Wilcke

Artificial neural networks have diverged far from their early inspiration in neurology.
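The Hebbian principle the paper revisits ("cells that fire together wire together") can be sketched with Oja's variant of the Hebbian rule, which keeps the weight norm bounded. This is a textbook illustration of Hebbian plasticity, not the paper's network:

```python
import numpy as np

def oja_update(w, x, lr=0.01):
    """One Hebbian step with Oja's normalization: dw = lr * y * (x - y * w).

    Plain Hebbian learning (dw = lr * y * x) grows weights without bound;
    Oja's correction term keeps ||w|| near 1. With random inputs, w
    converges toward the first principal component of the input distribution.
    """
    y = w @ x                       # post-synaptic activity
    return w + lr * y * (x - y * w)

# Repeatedly presenting a single input pattern drives w toward x / ||x||.
x = np.array([3.0, 4.0])
w = np.array([1.0, 0.0])
for _ in range(2000):
    w = oja_update(w, x)
```

Here `w` settles at `x / ||x|| = [0.6, 0.8]`, with unit norm, which is the fixed point of the update.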

Transfer Learning in Visual and Relational Reasoning

no code implementations · 27 Nov 2019 · T. S. Jayram, Vincent Marois, Tomasz Kornuta, Vincent Albouy, Emre Sevgen, Ahmet S. Ozcan

Transfer learning has become the de facto standard in computer vision and natural language processing, especially where labeled data is scarce.

Tasks: Question Answering · Relational Reasoning · +3
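A minimal sketch of the transfer-learning recipe the abstract refers to: reuse a frozen "pretrained" feature extractor and train only a new task head on the scarce labeled data. Here the frozen backbone is just a fixed random projection, and the task and all names are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x, W_frozen):
    """Frozen 'pretrained' backbone: a fixed nonlinear projection."""
    return np.tanh(x @ W_frozen)

# Hypothetical stand-in for pretrained weights: fixed, never updated.
W_frozen = rng.normal(size=(2, 32))

# Small labeled target task: classify points by the sign of x0 + x1.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Only the new task head (a logistic regression) is trained.
w_head = np.zeros(32)
for _ in range(1000):
    h = features(X, W_frozen)
    p = 1.0 / (1.0 + np.exp(-(h @ w_head)))   # predicted probabilities
    grad = h.T @ (p - y) / len(y)             # gradient w.r.t. head only
    w_head -= 0.5 * grad

pred = 1.0 / (1.0 + np.exp(-(features(X, W_frozen) @ w_head))) > 0.5
acc = np.mean(pred == (y == 1.0))
```

The design choice is the same one behind fine-tuning in vision and NLP: the backbone's features transfer, so only a small head must be fit to the new task.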

Leveraging Medical Visual Question Answering with Supporting Facts

no code implementations · 28 May 2019 · Tomasz Kornuta, Deepta Rajan, Chaitanya Shivade, Alexis Asseman, Ahmet S. Ozcan

In this working-notes paper, we describe the IBM Research AI (Almaden) team's participation in the ImageCLEF 2019 VQA-Med competition.

Tasks: Medical Visual Question Answering · Multi-Task Learning · +2

On transfer learning using a MAC model variant

no code implementations · 15 Nov 2018 · Vincent Marois, T. S. Jayram, Vincent Albouy, Tomasz Kornuta, Younes Bouhadjar, Ahmet S. Ozcan

We introduce a variant of the MAC model (Hudson and Manning, ICLR 2018) with a simplified set of equations that achieves comparable accuracy, while training faster.

Tasks: Transfer Learning

Learning to Remember, Forget and Ignore using Attention Control in Memory

no code implementations · 28 Sep 2018 · T. S. Jayram, Younes Bouhadjar, Ryan L. McAvoy, Tomasz Kornuta, Alexis Asseman, Kamil Rocki, Ahmet S. Ozcan

Typical neural networks with external memory do not effectively separate capacity for episodic and working memory as is required for reasoning in humans.
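Attention control over external memory is typically built on content-based addressing of memory slots. The sketch below shows a generic NTM-style attention read — a standard mechanism, not the paper's specific architecture; the sharpness parameter `beta` and all names are illustrative:

```python
import numpy as np

def softmax(z):
    z = z - z.max()          # numerical stability
    e = np.exp(z)
    return e / e.sum()

def attention_read(memory, key, beta=5.0):
    """Content-based read: attend to memory slots by cosine similarity to `key`.

    `beta` sharpens the attention distribution; a learned controller could
    emit separate keys (and gates) for distinct memory stores.
    """
    sims = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    )
    w = softmax(beta * sims)  # attention weights over slots, sums to 1
    return w @ memory, w      # weighted read vector and the weights

memory = np.eye(4)            # four orthogonal memory slots
read, w = attention_read(memory, np.array([0.0, 1.0, 0.0, 0.0]))
```

With an exact-match key, nearly all attention mass lands on the matching slot, so the read vector approximates that slot's contents.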

Using Multi-task and Transfer Learning to Solve Working Memory Tasks

no code implementations · 28 Sep 2018 · T. S. Jayram, Tomasz Kornuta, Ryan L. McAvoy, Ahmet S. Ozcan

We propose a new architecture called Memory-Augmented Encoder-Solver (MAES) that enables transfer learning to solve complex working memory tasks adapted from cognitive psychology.

Tasks: Multi-Task Learning
