no code implementations • 30 Jan 2024 • AbdElRahman ElSaid
Crafting neural network architectures manually is a formidable challenge, often leading to suboptimal and inefficient structures.
1 code implementation • 11 May 2023 • AbdElRahman ElSaid, Karl Ricanek, Zeming Lyu, Alexander Ororbia, Travis Desell
Continuous Ant-based Topology Search (CANTS) is a previously introduced, nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization (ACO).
no code implementations • 21 Apr 2022 • Aizaz Ul Haq, Niranjana Deshpande, AbdElRahman ElSaid, Travis Desell, Daniel E. Krutz
Simulations using 52,106 tactic records demonstrate that: I) eRNN is an effective prediction mechanism, II) TVA-E represents an improvement over existing state-of-the-art processes in accounting for tactic volatility, and III) uncertainty reduction tactics are beneficial in accounting for tactic volatility.
no code implementations • 21 Nov 2020 • AbdElRahman ElSaid, Joshua Karns, Zimeng Lyu, Alexander Ororbia, Travis Desell
This work introduces Continuous Ant-based Neural Topology Search (CANTS), a novel, nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization. CANTS utilizes synthetic ants that move over a continuous search space according to the density and distribution of pheromones, a process strongly inspired by how ants move in the real world.
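A minimal sketch of pheromone-guided movement in a continuous space, loosely in the spirit of CANTS. The Gaussian pheromone model, class names, and parameters below are illustrative assumptions, not the paper's implementation.

```python
# Toy pheromone-guided ant movement over a continuous 2D space.
# All names and parameters here are illustrative assumptions.
import random
import math

class PheromoneDeposit:
    def __init__(self, x, y, strength=1.0, spread=0.1):
        self.x, self.y = x, y
        self.strength = strength      # would decay over time (evaporation)
        self.spread = spread          # width of the Gaussian kernel

    def density_at(self, x, y):
        d2 = (x - self.x) ** 2 + (y - self.y) ** 2
        return self.strength * math.exp(-d2 / (2 * self.spread ** 2))

def move_ant(x, y, deposits, step=0.05, explore=0.02):
    """Take one step toward the locally densest pheromone direction,
    plus a small random exploration component."""
    best_dx, best_dy, best_density = 0.0, 0.0, -1.0
    for _ in range(8):                # sample candidate directions
        angle = random.uniform(0, 2 * math.pi)
        cx, cy = x + step * math.cos(angle), y + step * math.sin(angle)
        density = sum(p.density_at(cx, cy) for p in deposits)
        if density > best_density:
            best_dx, best_dy, best_density = cx - x, cy - y, density
    return (x + best_dx + random.gauss(0, explore),
            y + best_dy + random.gauss(0, explore))

# toy run: an ant drifts toward an existing pheromone concentration
deposits = [PheromoneDeposit(0.7, 0.7), PheromoneDeposit(0.72, 0.68)]
ant = (0.1, 0.1)
for _ in range(200):
    ant = move_ant(*ant, deposits)
print("final ant position:", ant)
```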
no code implementations • 21 Sep 2020 • Zimeng Lyu, AbdElRahman ElSaid, Joshua Karns, Mohamed Mkaouer, Travis Desell
Weight initialization is critical to successfully training artificial neural networks (ANNs), and even more so recurrent neural networks (RNNs), which can easily suffer from vanishing and exploding gradients.
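A minimal sketch of why recurrent weight initialization matters: repeatedly applying the recurrent matrix can shrink the hidden state toward zero or push it into saturation, depending on the scale of the initial weights. The scales and sizes below are illustrative assumptions, not the paper's settings.

```python
# Effect of recurrent weight scale on hidden-state norms in a toy RNN update.
import numpy as np

def hidden_norms(W, steps=50, size=64, seed=0):
    rng = np.random.default_rng(seed)
    h = rng.normal(size=size)
    norms = []
    for _ in range(steps):
        h = np.tanh(W @ h)            # vanilla-RNN style recurrent update
        norms.append(np.linalg.norm(h))
    return norms

rng = np.random.default_rng(42)
size = 64
W_large = rng.normal(scale=1.0, size=(size, size))            # too large: tanh saturates
W_small = rng.normal(scale=0.001, size=(size, size))          # too small: state vanishes
W_scaled = rng.normal(scale=1.0 / np.sqrt(size), size=(size, size))  # variance-scaled

for name, W in [("large", W_large), ("small", W_small), ("scaled", W_scaled)]:
    print(name, "final hidden norm:", round(hidden_norms(W)[-1], 4))
```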
no code implementations • 4 Jun 2020 • AbdElRahman ElSaid, Joshua Karns, Alexander Ororbia II, Daniel Krutz, Zimeng Lyu, Travis Desell
Transfer learning entails taking an artificial neural network (ANN) that is trained on a source dataset and adapting it to a new target dataset.
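A minimal sketch of this idea using a generic feed-forward network: reuse the weights trained on the source task and adapt only a new output layer to the target task. The model shape, layer indices, and freezing strategy are illustrative assumptions; the paper itself transfers evolved recurrent architectures.

```python
# Toy transfer learning: keep pretrained feature layers, retrain a new head.
import torch
import torch.nn as nn

# pretend this "source" network was already trained elsewhere
source_model = nn.Sequential(
    nn.Linear(16, 32), nn.ReLU(),
    nn.Linear(32, 32), nn.ReLU(),
    nn.Linear(32, 4),                 # source task had 4 outputs
)

# adapt to a target task with 2 outputs: reuse the feature layers,
# swap in a freshly initialized output layer
target_model = nn.Sequential(
    source_model[0], source_model[1],
    source_model[2], source_model[3],
    nn.Linear(32, 2),                 # new head for the target task
)

# freeze the transferred layers so only the new head is trained at first
for layer in target_model[:4]:
    for p in layer.parameters():
        p.requires_grad = False

optimizer = torch.optim.Adam(
    [p for p in target_model.parameters() if p.requires_grad], lr=1e-3)
loss_fn = nn.MSELoss()

x, y = torch.randn(8, 16), torch.randn(8, 2)   # toy target data
for _ in range(5):
    optimizer.zero_grad()
    loss = loss_fn(target_model(x), y)
    loss.backward()
    optimizer.step()
print("toy fine-tuning loss:", loss.item())
```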
no code implementations • 15 May 2020 • Zimeng Lyu, Joshua Karns, AbdElRahman ElSaid, Travis Desell
This island-based strategy is additionally compared to the speciation strategy of NEAT (NeuroEvolution of Augmenting Topologies).
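A minimal sketch of an island-based population strategy: several isolated subpopulations evolve independently and periodically exchange their best members. The toy fitness function and all parameters are illustrative assumptions; the paper applies the idea to neuroevolution of recurrent networks.

```python
# Toy island-based evolution with periodic migration between islands.
import random

def fitness(x):                        # toy objective: maximize -(x - 3)^2
    return -(x - 3.0) ** 2

def evolve_island(pop, generations=10, mutation=0.2):
    for _ in range(generations):
        parent = max(pop, key=fitness)
        child = parent + random.gauss(0, mutation)
        worst = min(range(len(pop)), key=lambda i: fitness(pop[i]))
        if fitness(child) > fitness(pop[worst]):
            pop[worst] = child
    return pop

islands = [[random.uniform(-10, 10) for _ in range(5)] for _ in range(4)]

for epoch in range(5):
    islands = [evolve_island(pop) for pop in islands]
    # migration: each island's best replaces the next island's worst member
    bests = [max(pop, key=fitness) for pop in islands]
    for i, pop in enumerate(islands):
        worst = min(range(len(pop)), key=lambda j: fitness(pop[j]))
        pop[worst] = bests[(i - 1) % len(islands)]

best = max((max(pop, key=fitness) for pop in islands), key=fitness)
print("best solution:", best)
```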
no code implementations • 10 Oct 2017 • AbdElRahman ElSaid, Travis Desell, Fatima El Jamiy, James Higgins, Brandon Wild
This research improves the performance of the most effective LSTM network design proposed in the previous work by using a promising neuroevolution method based on ant colony optimization (ACO) to develop and enhance the LSTM cell structure of the network.
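A minimal sketch of ant colony optimization over a discrete set of candidate connections, in the spirit of using ACO to choose which connections an LSTM-style cell keeps. The candidate list, fitness proxy, and pheromone update rule are illustrative assumptions, not the paper's implementation.

```python
# Toy ACO: ants sample candidate cell connections, pheromones reinforce good choices.
import random

candidates = ["input->cell", "input->forget_gate", "input->output_gate",
              "cell->cell", "cell->output_gate", "peephole->forget_gate"]
pheromone = {c: 1.0 for c in candidates}
# toy "ground truth": pretend these connections are the useful ones
useful = {"input->cell", "input->forget_gate", "cell->cell"}

def sample_structure():
    """Each ant includes a connection with probability tied to its pheromone."""
    return {c for c in candidates
            if random.random() < pheromone[c] / (pheromone[c] + 1.0)}

def toy_fitness(structure):
    """Stand-in for training the resulting cell: reward useful connections,
    penalize extras."""
    return len(structure & useful) - 0.5 * len(structure - useful)

for iteration in range(50):
    ants = [sample_structure() for _ in range(10)]
    best = max(ants, key=toy_fitness)
    for c in candidates:                 # evaporation
        pheromone[c] *= 0.9
    for c in best:                       # reinforce the best ant's choices
        pheromone[c] += 1.0

# connections whose pheromone grew past the initial level are "kept"
chosen = {c for c in candidates if pheromone[c] > 1.0}
print("connections favored by pheromone:", sorted(chosen))
```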