Search Results for author: Jamil Gafur

Found 2 papers, 1 paper with code

Measuring the Energy Consumption and Efficiency of Deep Neural Networks: An Empirical Analysis and Design Recommendations

no code implementations • 13 Mar 2024 • Charles Edison Tripp, Jordan Perr-Sauer, Jamil Gafur, Ambarish Nag, Avi Purkayastha, Sagi Zisman, Erik A. Bensen

Addressing the so-called "Red-AI" trend of rising energy consumption by large-scale neural networks, this study investigates the actual energy consumption, as measured by node-level watt-meters, of training various fully connected neural network architectures.

The BUTTER Zone: An Empirical Study of Training Dynamics in Fully Connected Neural Networks

1 code implementation • 25 Jul 2022 • Charles Edison Tripp, Jordan Perr-Sauer, Lucas Hayne, Monte Lunacek, Jamil Gafur

We present an empirical dataset surveying the deep learning phenomenon on fully-connected feed-forward multilayer perceptron neural networks.
