no code implementations • 3 Jun 2023 • Christopher Michael Rytting, Taylor Sorensen, Lisa Argyle, Ethan Busby, Nancy Fulda, Joshua Gubler, David Wingate
This provides exciting evidence that language models can serve as a critical advance in the coding of open-ended texts across a variety of applications.
no code implementations • 14 Feb 2023 • Lisa P. Argyle, Ethan Busby, Joshua Gubler, Chris Bail, Thomas Howe, Christopher Rytting, David Wingate
A rapidly increasing amount of human conversation occurs online.
1 code implementation • 22 Oct 2022 • Joshua Robinson, Christopher Michael Rytting, David Wingate
A more natural prompting approach is to present the question and answer options to the LLM jointly and have it output the symbol (e.g., "A") associated with its chosen answer option.
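A minimal sketch of this joint question-plus-options prompting setup (the function name and formatting are hypothetical illustrations, and no actual LLM call is made):

```python
def format_mcp_prompt(question, options):
    """Present the question and lettered answer options together,
    asking the model to reply with just the chosen symbol."""
    lines = [question]
    for symbol, text in zip("ABCDEFGH", options):
        lines.append(f"{symbol}. {text}")
    lines.append("Answer:")
    return "\n".join(lines)

prompt = format_mcp_prompt(
    "What is the capital of France?",
    ["Berlin", "Paris", "Madrid"],
)
print(prompt)
```

The model's completion would then be a single symbol ("B" here), which is trivial to score against the gold answer.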
no code implementations • 6 Oct 2022 • David Wingate, Mohammad Shoeybi, Taylor Sorensen
We explore the idea of compressing the prompts used to condition language models, and show that compressed prompts can retain a substantial amount of information about the original prompt.
no code implementations • 14 Sep 2022 • Lisa P. Argyle, Ethan C. Busby, Nancy Fulda, Joshua Gubler, Christopher Rytting, David Wingate
We propose and explore the possibility that language models can be studied as effective proxies for specific human sub-populations in social science research.
no code implementations • ACL 2022 • Taylor Sorensen, Joshua Robinson, Christopher Michael Rytting, Alexander Glenn Shaw, Kyle Jeffrey Rogers, Alexia Pauline Delorey, Mahmoud Khalil, Nancy Fulda, David Wingate
Pre-trained language models derive substantial linguistic and factual knowledge from the massive corpora on which they are trained, and prompt engineering seeks to align these models to specific tasks.
no code implementations • NeurIPS 2021 • Christopher Michael Rytting, David Wingate
Large natural language models (such as GPT-3 or T5) demonstrate impressive abilities across a range of general NLP tasks.
1 code implementation • NeurIPS 2020 • Zachary C. Brown, Nathaniel Robinson, David Wingate, Nancy Fulda
It is notoriously difficult to control the behavior of artificial neural networks such as generative neural language models.
no code implementations • 3 Jan 2020 • Erich Mielke, Eric Townsend, David Wingate, Marc D. Killpack
We show that our human-human dyad data exhibits interesting trends: interaction forces are non-negligible compared to the force required to accelerate an object, and the beginning of a lateral movement is characterized by distinct torque triggers from the leader of the dyad.
no code implementations • 3 Oct 2019 • Kolby Nottingham, Anand Balakrishnan, Jyotirmoy Deshmukh, David Wingate
We propose using propositional logic to specify the importance of multiple objectives.
Tasks: Multi-Objective Reinforcement Learning, Reinforcement Learning
no code implementations • 1 Oct 2019 • Andrew Carr, Jared Nielsen, David Wingate
Neural Processes (NPs) are a class of models that learn a mapping from a context set of input-output pairs to a distribution over functions.
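The context-to-function mapping can be illustrated with a toy, untrained forward pass in the style of a conditional Neural Process: each context (x, y) pair is encoded, the encodings are aggregated with a permutation-invariant mean, and a decoder conditions on the aggregate to predict at target inputs. The weights here are random placeholders, purely for shape illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
d_hidden = 16

W_enc = rng.normal(size=(2, d_hidden))      # encoder: (x, y) -> r_i
W_dec = rng.normal(size=(d_hidden + 1, 2))  # decoder: (r, x*) -> (mu, log_sigma)

def cnp_predict(x_ctx, y_ctx, x_tgt):
    pairs = np.stack([x_ctx, y_ctx], axis=1)   # (n_ctx, 2)
    r_i = np.tanh(pairs @ W_enc)               # per-pair representations
    r = r_i.mean(axis=0)                       # permutation-invariant aggregate
    inp = np.column_stack([np.tile(r, (len(x_tgt), 1)), x_tgt])
    out = inp @ W_dec                          # (n_tgt, 2)
    mu, log_sigma = out[:, 0], out[:, 1]
    return mu, np.exp(log_sigma)               # predictive mean and std dev

x_ctx = np.array([0.0, 0.5, 1.0])
mu, sigma = cnp_predict(x_ctx, np.sin(x_ctx), np.array([0.25, 0.75]))
print(mu.shape, sigma.shape)  # (2,) (2,)
```

Because the aggregation is a mean over context encodings, the prediction is invariant to the order of the context pairs, which is what lets the model represent a distribution over functions conditioned on an unordered context set.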
no code implementations • 1 Mar 2019 • Robert Pottorff, Jared Nielsen, David Wingate
We predict future video frames from complex dynamic scenes, using an invertible neural network as the encoder of a nonlinear dynamic system with latent linear state evolution.
no code implementations • 26 Feb 2019 • Andrew Carr, David Wingate
We introduce Graph Neural Processes (GNP), inspired by the recent work in conditional and latent neural processes.
no code implementations • 4 Dec 2018 • Iris Rubi Seaman, Jan-Willem van de Meent, David Wingate
As autonomous agents become more ubiquitous, they will eventually have to reason about the plans of other agents, which is known as theory of mind reasoning.
no code implementations • 14 Aug 2018 • David Wingate, William Myers, Nancy Fulda, Tyler Etchart
Classic grammars and regular expressions can be used for a variety of purposes, including parsing, intent detection, and matching.
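As a small illustration of the regular-expression side of this (the intent names and patterns below are hypothetical examples, not from the paper):

```python
import re

# Hypothetical intent patterns: each intent is matched by a regular expression.
INTENTS = {
    "greeting": re.compile(r"\b(hi|hello|hey)\b", re.IGNORECASE),
    "weather": re.compile(r"\b(weather|forecast)\b", re.IGNORECASE),
}

def detect_intent(utterance):
    """Return the first intent whose pattern matches, else None."""
    for intent, pattern in INTENTS.items():
        if pattern.search(utterance):
            return intent
    return None

print(detect_intent("Hello there!"))          # greeting
print(detect_intent("What's the forecast?"))  # weather
```

Such hand-written patterns are precise but brittle, which motivates combining them with learned models.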
1 code implementation • 17 Apr 2017 • Marco F. Cusumano-Towner, Alexey Radul, David Wingate, Vikash K. Mansinghka
Intelligent systems sometimes need to infer the probable goals of people, cars, and robots, based on partial observations of their motion.
no code implementations • 9 Mar 2017 • Nancy Fulda, Daniel Ricks, Ben Murdoch, David Wingate
Autonomous agents must often detect affordances: the set of behaviors enabled by a situation.
no code implementations • 7 Jan 2013 • David Wingate, Theophane Weber
We present a new algorithm for approximate inference in probabilistic programs, based on a stochastic gradient for variational programs.
no code implementations • NeurIPS 2011 • David Wingate, Noah Goodman, Andreas Stuhlmueller, Jeffrey M. Siskind
Probabilistic programming languages allow modelers to specify a stochastic process using syntax that resembles modern programming languages.
no code implementations • NeurIPS 2010 • Finale Doshi-Velez, David Wingate, Nicholas Roy, Joshua B. Tenenbaum
We consider reinforcement learning in partially observable domains where the agent can query an expert for demonstrations.