no code implementations • 1 Apr 2024 • Zheng Zhang, Fan Yang, Ziyan Jiang, Zheng Chen, Zhengyang Zhao, Chengyuan Ma, Liang Zhao, Yang Liu
Recent advances in large language models (LLMs) have enhanced their ability to process long input contexts.
1 code implementation • 6 Jan 2024 • Zilin Huang, Zihao Sheng, Chengyuan Ma, Sikai Chen
In this paradigm, the human expert serves as a mentor to the AI agent.
no code implementations • 2 May 2023 • Yichuan Li, Jialong Han, Kyumin Lee, Chengyuan Ma, Benjamin Yao, Derek Liu
In recent years, Pre-trained Language Models (PLMs) have shown their superiority by pre-training on unstructured text corpora and then fine-tuning on downstream tasks.
no code implementations • 26 Feb 2023 • Ruolin Su, Zhongkai Sun, Sixing Lu, Chengyuan Ma, Chenlei Guo
Recent advances in cross-lingual commonsense reasoning (CSR) are facilitated by the development of multilingual pre-trained models (mPTMs).
no code implementations • NAACL (ACL) 2022 • Pragaash Ponnusamy, Clint Solomon Mathialagan, Gustavo Aguilar, Chengyuan Ma, Chenlei Guo
Self-learning paradigms in large-scale conversational AI agents tend to leverage user feedback in bridging between what they say and what they mean.
no code implementations • RepL4NLP (ACL) 2022 • Md Mofijul Islam, Gustavo Aguilar, Pragaash Ponnusamy, Clint Solomon Mathialagan, Chengyuan Ma, Chenlei Guo
Additionally, the dependency on a fixed vocabulary limits the subword models' adaptability across languages and domains.
no code implementations • 22 Feb 2022 • Zhongkai Sun, Sixing Lu, Chengyuan Ma, Xiaohu Liu, Chenlei Guo
However, these methods rarely address query expansion and entity weighting simultaneously, which may limit both the scope and the accuracy of query reformulation for retrieval.
no code implementations • 13 Feb 2022 • Ruixue Lian, Che-Wei Huang, Yuqing Tang, Qilong Gu, Chengyuan Ma, Chenlei Guo
Individual user profiles and interaction histories play a significant role in providing customized experiences in real-world applications such as chatbots, social media, retail, and education.
1 code implementation • ICON 2021 • Haoran Xu, Sixing Lu, Zhongkai Sun, Chengyuan Ma, Chenlei Guo
Text Style Transfer (TST) aims to alter the underlying style of the source text to a target style while preserving its content.
no code implementations • 20 Sep 2018 • Zeynab Raeesy, Kellen Gillespie, Zhenpei Yang, Chengyuan Ma, Thomas Drugman, Jiacheng Gu, Roland Maas, Ariya Rastrow, Björn Hoffmeister
We show that, with enough data, the LSTM model is indeed as capable of learning whisper characteristics from LFBE features alone as a simpler MLP model that uses both LFBE and features engineered for separating whispered and normal speech.