1 code implementation • COLING 2022 • Xiao Ding, Bowen Chen, Li Du, Bing Qin, Ting Liu
To fill the gap, we propose CogBERT, a framework that can induce fine-grained cognitive features from cognitive data and incorporate cognitive features into BERT by adaptively adjusting the weight of cognitive features for different NLP tasks.
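The core idea of adaptively weighting cognitive features per task can be sketched as a simple softmax gate. This is a minimal illustration, not the paper's implementation; the function names and the two-feature example are hypothetical.

```python
import math

def softmax(xs):
    """Numerically stable softmax."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def weight_features(cognitive_features, task_logits):
    """Scale each cognitive feature vector by a task-specific gate.

    cognitive_features: one vector per feature type (e.g. gaze
    duration, fixation count); task_logits: learnable per-task
    scores, one per feature type.
    """
    gates = softmax(task_logits)
    return [[g * v for v in feat] for g, feat in zip(gates, cognitive_features)]

# Two feature types; this task's gate emphasizes the first.
feats = [[1.0, 2.0], [3.0, 4.0]]
weighted = weight_features(feats, task_logits=[2.0, 0.0])
```

In the actual framework the gate would be learned jointly with BERT; here the logits are fixed to show the mechanics.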
1 code implementation • EMNLP 2021 • Jihao Shi, Xiao Ding, Li Du, Ting Liu, Bing Qin
Many open-domain question answering problems can be cast as a textual entailment task, where a question and candidate answers are concatenated to form hypotheses.
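The hypothesis-construction step described above can be sketched directly; the helper name and example strings are illustrative, and a real system would then score each (evidence, hypothesis) pair with an entailment model.

```python
def build_hypotheses(question, candidate_answers):
    """Form one hypothesis per candidate by concatenating the
    question with each candidate answer, as in the QA-as-entailment
    formulation."""
    return [f"{question} {answer}" for answer in candidate_answers]

hyps = build_hypotheses(
    "What is the capital of France?",
    ["Paris", "Lyon"],
)
# Each hypothesis is later paired with retrieved evidence and scored
# by an entailment classifier; the highest-scoring candidate wins.
```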
1 code implementation • 2 Apr 2024 • Zhouhao Sun, Xiao Ding, Li Du, Bibo Cai, Jinglong Gao, Ting Liu, Bing Qin

To address this issue, we propose a novel framework, named Generalizable and Faithful Reasoner (GFaiR), which introduces the paradigm of resolution refutation.
1 code implementation • 25 Mar 2024 • Yirong Zeng, Xiao Ding, Yi Zhao, Xiangyu Li, Jie Zhang, Chao Yao, Ting Liu, Bing Qin
Furthermore, we construct RU22Fact, a novel multilingual explainable fact-checking dataset of 16K samples on the 2022 Russia-Ukraine conflict, each containing a real-world claim, optimized evidence, and a referenced explanation.
no code implementations • 14 Mar 2024 • Kai Xiong, Xiao Ding, Ting Liu, Bing Qin, Dongliang Xu, Qing Yang, Hongtao Liu, Yixin Cao
Large language models (LLMs) have demonstrated impressive performance and strong explainability across various reasoning scenarios, marking a significant stride towards mimicking human-like intelligence.
no code implementations • 18 Feb 2024 • Yang Zhao, Li Du, Xiao Ding, Kai Xiong, Zhouhao Sun, Jun Shi, Ting Liu, Bing Qin
Through pretraining on corpora drawn from various sources, large language models (LLMs) have gained impressive performance.
1 code implementation • 13 Feb 2024 • Mohammad Ghazi Vakili, Christoph Gorgulla, AkshatKumar Nigam, Dmitry Bezrukov, Daniel Varoli, Alex Aliper, Daniil Polykovsky, Krishna M. Padmanabha Das, Jamie Snider, Anna Lyakisheva, Ardalan Hosseini Mansob, Zhong Yao, Lela Bitar, Eugene Radchenko, Xiao Ding, Jinxin Liu, Fanye Meng, Feng Ren, Yudong Cao, Igor Stagljar, Alán Aspuru-Guzik, Alex Zhavoronkov
The discovery of small molecules with therapeutic potential is a long-standing challenge in chemistry and biology.
no code implementations • 20 Jun 2023 • Linyao Yang, Hongyang Chen, Zhao Li, Xiao Ding, Xindong Wu
Recently, ChatGPT, a representative large language model (LLM), has gained considerable attention due to its powerful emergent abilities.
1 code implementation • 19 May 2023 • Kai Xiong, Xiao Ding, Yixin Cao, Ting Liu, Bing Qin
Extensive experiments on various datasets show that LLMs can effectively collaborate to reach a consensus despite noticeable inter-inconsistencies, though imbalances in their abilities can lead to domination by superior LLMs.
1 code implementation • 18 May 2023 • Tingting Wu, Xiao Ding, Minji Tang, Hao Zhang, Bing Qin, Ting Liu
To mitigate the effects of label noise, learning with noisy labels (LNL) methods are designed to achieve better generalization performance.
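One widely used learning-with-noisy-labels heuristic (not necessarily the method of the paper above) is small-loss selection: clean labels tend to incur smaller loss early in training, so training keeps only the lowest-loss fraction of samples. A minimal sketch, with a hypothetical function name:

```python
def small_loss_selection(losses, noise_rate):
    """Keep the fraction of samples with the smallest loss and
    return their indices, sorted. `noise_rate` is the assumed
    fraction of corrupted labels."""
    keep = max(1, int(round(len(losses) * (1.0 - noise_rate))))
    ranked = sorted(range(len(losses)), key=lambda i: losses[i])
    return sorted(ranked[:keep])

# Four samples; 25% assumed noisy, so the largest-loss sample is dropped.
idx = small_loss_selection([0.2, 1.5, 0.3, 0.1], noise_rate=0.25)
```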
1 code implementation • 12 May 2023 • Jinglong Gao, Xiao Ding, Bing Qin, Ting Liu
Causal reasoning ability is crucial for numerous NLP applications.
1 code implementation • 16 Dec 2022 • Kai Xiong, Xiao Ding, Zhongyang Li, Li Du, Bing Qin, Yi Zheng, Baoxing Huai
Causal chain reasoning (CCR) is an essential ability for many decision-making AI systems, which requires the model to build reliable causal chains by connecting causal pairs.
no code implementations • 21 Aug 2022 • Tingting Wu, Xiao Ding, Hao Zhang, Jinglong Gao, Li Du, Bing Qin, Ting Liu
To alleviate this issue, curriculum learning is proposed to improve model performance and generalization by ordering training samples in a meaningful (e.g., easy-to-hard) sequence.
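The ordering step at the heart of curriculum learning is just a sort by a difficulty score. A toy sketch, using sentence length as a stand-in difficulty measure (a real curriculum would use a learned or task-specific score):

```python
def curriculum_order(samples, difficulty):
    """Order training samples easy-to-hard by a difficulty score."""
    return sorted(samples, key=difficulty)

# Toy difficulty: sentence length as a proxy for hardness.
batch = ["a long and winding sentence", "short", "mid length one"]
ordered = curriculum_order(batch, difficulty=len)
```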
no code implementations • 14 Aug 2022 • Bowen Chen, Xiao Ding, Li Du, Bing Qin, Ting Liu
Given a task, humans learn from easy to hard, whereas models learn in a random order.
no code implementations • Findings (ACL) 2022 • Li Du, Xiao Ding, Yue Zhang, Kai Xiong, Ting Liu, Bing Qin
To this end, we incorporate an additional structured variable into BERT to learn to predict the event connections in the training process.
no code implementations • 21 Jan 2022 • Feng Ren, Xiao Ding, Min Zheng, Mikhail Korzinkin, Xin Cai, Wei Zhu, Alexey Mantsyzov, Alex Aliper, Vladimir Aladinskiy, Zhongying Cao, Shanshan Kong, Xi Long, Bonnie Hei Man Liu, Yingtao Liu, Vladimir Naumov, Anastasia Shneyderman, Ivan V. Ozerov, Ju Wang, Frank W. Pun, Alán Aspuru-Guzik, Michael Levitt, Alex Zhavoronkov
The AlphaFold computer program predicted protein structures for the whole human genome, which is considered a remarkable breakthrough in both artificial intelligence (AI) applications and structural biology.
1 code implementation • ACL 2021 • Li Du, Xiao Ding, Kai Xiong, Ting Liu, Bing Qin
ExCAR first acquires additional evidence information from a large-scale causal event graph as logical rules for causal reasoning.
1 code implementation • ACL 2021 • Li Du, Xiao Ding, Ting Liu, Bing Qin
Abductive reasoning aims at inferring the most plausible explanation for observed events, which plays a critical role in various NLP applications, such as reading comprehension and question answering.
no code implementations • 21 Jul 2021 • Zhongyang Li, Xiao Ding, Ting Liu, J. Edward Hu, Benjamin Van Durme
We present a conditional text generation framework that posits sentential expressions of possible causes and effects.
no code implementations • 21 Jul 2021 • Zhongyang Li, Xiao Ding, Kuo Liao, Bing Qin, Ting Liu
Recent work has shown success in incorporating pre-trained models like BERT to improve NLP systems.
no code implementations • SEMEVAL 2020 • Xiao Ding, Dingkui Hao, Yuewei Zhang, Kuo Liao, Zhongyang Li, Bing Qin, Ting Liu
In this task, we focus on detecting causation, especially counterfactuals, in texts.
no code implementations • IJCNLP 2019 • Li Du, Xiao Ding, Ting Liu, Zhongyang Li
Understanding events and event-centered commonsense reasoning is crucial for natural language processing (NLP).
1 code implementation • IJCNLP 2019 • Xiao Ding, Kuo Liao, Ting Liu, Zhongyang Li, Junwen Duan
Prior work has proposed effective methods to learn event representations that can capture syntactic and semantic information over text corpora, demonstrating their effectiveness for downstream tasks such as script event prediction.
no code implementations • 18 Jul 2019 • Xiao Ding, Zhongyang Li, Ting Liu, Kuo Liao
The evolution and development of events follow their own basic principles, which cause events to happen sequentially.
1 code implementation • 17 May 2019 • Zhongyang Li, Xiao Ding, Ting Liu
In this study, we investigate a transferable BERT (TransBERT) training framework, which can transfer not only general language knowledge from large-scale unlabeled data but also specific kinds of knowledge from various semantically related supervised tasks, for a target task.
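The staged transfer schedule described for TransBERT can be sketched as a simple pipeline: a pretrained model is fine-tuned on one or more semantically related supervised tasks before the target task. The function names and the stub training step below are illustrative only.

```python
def transfer_pipeline(model, intermediate_tasks, target_task, train_step):
    """Run the transfer schedule: the model (already pretrained on
    unlabeled text) is fine-tuned on each intermediate supervised
    task, then on the target task. `train_step` is any
    (model, task) -> model function."""
    for task in intermediate_tasks:
        model = train_step(model, task)
    return train_step(model, target_task)

# Record the order of stages with a stub training step.
log = []
stub = lambda m, t: (log.append(t), m)[1]
transfer_pipeline("bert-base", ["nli", "paraphrase"], "script-prediction", stub)
```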
no code implementations • COLING 2018 • Zhongyang Li, Xiao Ding, Ting Liu
In this paper, we propose using an adversarial-training-augmented Seq2Seq model to generate reasonable and diversified story endings given a story context.
1 code implementation • COLING 2018 • Junwen Duan, Yue Zhang, Xiao Ding, Ching-Yun Chang, Ting Liu
The model uses a target-sensitive representation of the news abstract to weigh sentences in the news content, so as to select and combine the most informative sentences for market modeling.
no code implementations • NAACL 2018 • Junwen Duan, Xiao Ding, Ting Liu
To address the above issues, we propose a reinforcement learning based approach, which automatically induces target-specific sentence representations over tree structures.
1 code implementation • 14 May 2018 • Zhongyang Li, Xiao Ding, Ting Liu
Script event prediction requires a model to predict the subsequent event given an existing event context.
no code implementations • COLING 2016 • Xiao Ding, Yue Zhang, Ting Liu, Junwen Duan
Representing structured events as vectors in continuous space offers a new way for defining dense features for natural language processing (NLP) applications.
no code implementations • 10 Oct 2016 • Xiaofei Sun, Jiang Guo, Xiao Ding, Ting Liu
This paper investigates the problem of network embedding, which aims at learning low-dimensional vector representation of nodes in networks.
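A common first step in random-walk-based network embedding methods is to sample truncated random walks over the graph; the walks are then fed to a skip-gram model to learn the node vectors. This sketch shows only the sampling step, with illustrative names, and is not specific to the paper above.

```python
import random

def random_walks(adj, walk_len, walks_per_node, seed=0):
    """Generate truncated random walks over a graph given as an
    adjacency list {node: [neighbors]}. Each walk starts at a node
    and repeatedly hops to a uniformly random neighbor."""
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            for _ in range(walk_len - 1):
                neighbors = adj[walk[-1]]
                if not neighbors:
                    break  # dead end: stop this walk early
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks

# Tiny triangle graph.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
walks = random_walks(adj, walk_len=4, walks_per_node=2)
```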