1 code implementation • 16 Jun 2023 • Zachary Robertson
In this pilot study, we investigate the use of GPT-4 to assist in the peer-review process.
no code implementations • 2 Jun 2023 • Zachary Robertson, Oluwasanmi Koyejo
In the quest to enhance the efficiency and bio-plausibility of training deep neural networks, Feedback Alignment (FA), which replaces the backward-pass weights with fixed random matrices during training, has emerged as an alternative to traditional backpropagation.
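The core mechanism described above can be sketched in a few lines. This is a minimal NumPy illustration of Feedback Alignment on a toy two-layer network, not the paper's setup: the only change from backpropagation is that the error is propagated through a fixed random matrix `B` instead of the transpose of the output weights `W2`. All sizes, learning rate, and the synthetic target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-layer network trained with Feedback Alignment (FA).
n_in, n_hid, n_out, n_samples = 4, 16, 1, 64
X = rng.normal(size=(n_samples, n_in))
y = X @ rng.normal(size=(n_in, n_out))  # synthetic linear target

W1 = rng.normal(scale=0.1, size=(n_in, n_hid))
W2 = rng.normal(scale=0.1, size=(n_hid, n_out))
B = rng.normal(scale=0.1, size=(n_out, n_hid))  # fixed random feedback matrix

lr = 0.01
losses = []
for _ in range(500):
    h = np.tanh(X @ W1)                 # forward pass
    y_hat = h @ W2
    e = y_hat - y                       # output error
    losses.append(float(np.mean(e ** 2)))
    # FA: route the error through the fixed random B, not W2.T
    dh = (e @ B) * (1 - h ** 2)
    W2 -= lr * h.T @ e / n_samples
    W1 -= lr * X.T @ dh / n_samples
```

Despite the feedback weights never being updated, the forward weights tend to "align" with them over training, which is what lets the loss decrease on small tasks like this one.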
no code implementations • 2 Jun 2023 • Zachary Robertson, Oluwasanmi Koyejo
The mechanism's novelty lies in eliciting information from the bidder via pairwise comparisons, which are arguably easier for humans than assigning numerical values.
no code implementations • 1 Jun 2023 • Boxiang Lyu, Zhe Feng, Zachary Robertson, Sanmi Koyejo
We study the design of loss functions for click-through rate (CTR) prediction to optimize (social) welfare in advertising auctions.
1 code implementation • 24 Mar 2023 • Rylan Schaeffer, Mikail Khona, Zachary Robertson, Akhilan Boopathy, Kateryna Pistunova, Jason W. Rocks, Ila Rani Fiete, Oluwasanmi Koyejo
Double descent is a surprising phenomenon in machine learning, in which, as the number of model parameters grows relative to the number of data points, test error drops as models grow ever larger into the highly overparameterized (data-undersampled) regime.