Search Results for author: Mingye Gao

Found 5 papers, 5 papers with code

Cross-Care: Assessing the Healthcare Implications of Pre-training Data on Language Model Bias

1 code implementation 9 May 2024 Shan Chen, Jack Gallifant, Mingye Gao, Pedro Moreira, Nikolaj Munch, Ajay Muthukkumar, Arvind Rajan, Jaya Kolluri, Amelia Fiske, Janna Hastings, Hugo Aerts, Brian Anthony, Leo Anthony Celi, William G. La Cava, Danielle S. Bitterman

Large language models (LLMs) are increasingly essential in processing natural languages, yet their application is frequently compromised by biases and inaccuracies originating in their training data.

Augmenting x-ray single particle imaging reconstruction with self-supervised machine learning

1 code implementation 28 Nov 2023 Zhantao Chen, Cong Wang, Mingye Gao, Chun Hong Yoon, Jana B. Thayer, Joshua J. Turner

The development of X-ray Free Electron Lasers (XFELs) has opened numerous opportunities to probe atomic structure and ultrafast dynamics of various materials.

Cooperative Self-training of Machine Reading Comprehension

1 code implementation NAACL 2022 Hongyin Luo, Shang-Wen Li, Mingye Gao, Seunghak Yu, James Glass

Pretrained language models have significantly improved the performance of downstream language understanding tasks, including extractive question answering, by providing high-quality contextualized word embeddings.

Tasks: Extractive Question-Answering, Machine Reading Comprehension, +6
