2 code implementations • 16 Oct 2019 • David Donahue, Vladislav Lialin, Anna Rumshisky
The Transformer architecture has become increasingly popular over the past two years, owing to its impressive performance on a number of natural language processing (NLP) tasks.
no code implementations • 16 Oct 2019 • David Donahue, Yuanliang Meng, Anna Rumshisky
The first design features a sequence-to-sequence architecture with two separate NTM modules, one for each participant in the conversation.
no code implementations • 16 Oct 2019 • David Donahue
Recent state-of-the-art video generation systems employ Generative Adversarial Networks (GANs) or Variational Autoencoders (VAEs) to produce novel videos.
no code implementations • 11 Oct 2018 • David Donahue, Anna Rumshisky
This is largely because sequences of text are discrete, and thus gradients cannot propagate from the discriminator to the generator.
2 code implementations • NAACL 2019 • Alexey Romanov, Anna Rumshisky, Anna Rogers, David Donahue
We show that the proposed method is capable of fine-grained controlled change of these aspects of the input sentence.
no code implementations • SEMEVAL 2017 • David Donahue, Alexey Romanov, Anna Rumshisky
This paper describes the winning system for SemEval-2017 Task 6: #HashtagWars: Learning a Sense of Humor.