no code implementations • LTEDI (ACL) 2022 • Harrison Santiago, Joshua Martin, Sarah Moeller, Kevin Tang
To overcome the scarcity, we employ a combination of rule-based filters and data augmentation to generate a corpus balanced between habitual and non-habitual instances.
no code implementations • EMNLP 2020 • Sarah Moeller, Ling Liu, Changbing Yang, Katharina Kann, Mans Hulden
An intermediate step in the linguistic analysis of an under-documented language is to find and organize inflected forms that are attested in natural speech.
no code implementations • ACL 2021 • Sarah Moeller, Ling Liu, Mans Hulden
However, the importance and usefulness of POS tags need to be examined as NLP expands to low-resource languages, because the linguists who provide many annotated resources do not prioritize early identification and tagging of POS.
no code implementations • LREC 2020 • Sarah Moeller, Irina Wagner, Martha Palmer, Kathryn Conger, Skatje Myers
This paper presents a proposition bank for Russian (RuPB), a resource for semantic role labeling (SRL).
no code implementations • LREC 2020 • Graham Neubig, Shruti Rijhwani, Alexis Palmer, Jordan MacKenzie, Hilaria Cruz, Xinjian Li, Matthew Lee, Aditi Chaudhary, Luke Gessler, Steven Abney, Shirley Anugrah Hayati, Antonios Anastasopoulos, Olga Zamaraeva, Emily Prud'hommeaux, Jennette Child, Sara Child, Rebecca Knowles, Sarah Moeller, Jeffrey Micher, Yiyuan Li, Sydney Zink, Mengzhou Xia, Roshan S Sharma, Patrick Littell
Despite recent advances in natural language processing and other language technology, the application of such technology to language documentation and conservation has been limited.
no code implementations • CoNLL 2019 • Kevin Stowe, Sarah Moeller, Laura Michaelis, Martha Palmer
In the field of metaphor detection, deep learning systems are ubiquitous and achieve strong performance on many tasks.
no code implementations • COLING 2018 • Sarah Moeller, Ghazaleh Kazeminejad, Andrew Cowell, Mans Hulden
We experiment with training an encoder-decoder neural model for mimicking the behavior of an existing hand-written finite-state morphological grammar for Arapaho verbs, a polysynthetic language with a highly complex verbal inflection system.
no code implementations • COLING 2018 • Sarah Moeller, Mans Hulden
Morphological analysis of morphologically rich and low-resource languages is important to both descriptive linguistics and natural language processing.