1 code implementation • 17 Jan 2024 • Trung Quoc Luong, Xinbo Zhang, Zhanming Jie, Peng Sun, Xiaoran Jin, Hang Li
ReFT first warms up the model with SFT and then employs online reinforcement learning (specifically the PPO algorithm in this paper) to further fine-tune it: an abundance of reasoning paths is automatically sampled for each question, and the rewards are naturally derived from the ground-truth answers.
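The entry above notes that ReFT's rewards come from the ground-truth answers rather than a learned reward model. A minimal sketch of that idea, assuming a simple exact-match comparison (the paper's actual reward also handles partially matched and unparsable outputs, which this sketch omits):

```python
def reft_reward(sampled_answer: str, gold_answer: str) -> float:
    """Terminal reward for one sampled reasoning path (sketch).

    The reward is derived directly from the ground-truth answer:
    1.0 if the final answer extracted from the sampled path matches
    the gold answer, 0.0 otherwise. No reward model is needed.
    """
    return 1.0 if sampled_answer.strip() == gold_answer.strip() else 0.0
```

During the PPO stage, many reasoning paths are sampled per question and each receives this reward at its final token, which is what lets correctness supervision flow back into the policy without any extra annotation.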
1 code implementation • 20 Sep 2023 • Zhanming Jie, Trung Quoc Luong, Xinbo Zhang, Xiaoran Jin, Hang Li
We also find that Python is a better choice of language than Wolfram for program CoTs.
1 code implementation • 29 May 2023 • Zhanming Jie, Wei Lu
To address these issues, we investigate two approaches to leverage the training data in a few-shot prompting scenario: dynamic program prompting and program distillation.
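Dynamic program prompting, as named above, selects few-shot exemplars per test question rather than using a fixed prompt. A hedged sketch of the idea, using a toy lexical similarity in place of the paper's retrieval method (function names, the `train_set` schema, and the Jaccard scorer are all illustrative assumptions):

```python
def jaccard(a: str, b: str) -> float:
    # Toy word-overlap similarity; a real system would use a stronger
    # embedding- or retrieval-based scorer.
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / max(len(sa | sb), 1)

def build_prompt(question: str, train_set: list[dict], k: int = 2) -> str:
    # Dynamic program prompting (sketch): rank training problems by
    # similarity to the test question, then splice the top-k annotated
    # programs in as few-shot exemplars before the test question.
    ranked = sorted(train_set,
                    key=lambda ex: jaccard(question, ex["question"]),
                    reverse=True)
    parts = [f"Q: {ex['question']}\nProgram:\n{ex['program']}"
             for ex in ranked[:k]]
    parts.append(f"Q: {question}\nProgram:")
    return "\n\n".join(parts)
```

Because the exemplars change with each test question, structurally similar training programs end up in the prompt, which is the intuition behind preferring dynamic over static few-shot selection.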
no code implementations • 16 May 2023 • Shuwei Feng, Tianyang Zhan, Zhanming Jie, Trung Quoc Luong, Xiaoran Jin
This paper presents GenDoc, a general sequence-to-sequence document understanding model pre-trained with unified masking across three modalities: text, image, and layout.
1 code implementation • ACL 2022 • Zhanming Jie, Jierui Li, Wei Lu
Solving math word problems requires deductive reasoning over the quantities in the text.
Ranked #4 on Math Word Problem Solving on MathQA
1 code implementation • EMNLP 2021 • Yuxiang Zhou, Lejian Liao, Yang Gao, Zhanming Jie, Wei Lu
Dependency parse trees are helpful for discovering the opinion words in aspect-based sentiment analysis (ABSA).
Aspect-Based Sentiment Analysis (ABSA)
1 code implementation • NAACL 2021 • Lu Xu, Zhanming Jie, Wei Lu, Lidong Bing
We believe this is because the two types of features, the contextual information captured by the linear sequences and the structured information captured by the dependency trees, may complement each other.
1 code implementation • EMNLP 2020 • Liying Cheng, Dekun Wu, Lidong Bing, Yan Zhang, Zhanming Jie, Wei Lu, Luo Si
Previous works on knowledge-to-text generation take as input a few RDF triples or key-value pairs conveying the knowledge of some entities to generate a natural language description.
Ranked #1 on KG-to-Text Generation on ENT-DESC
1 code implementation • IJCNLP 2019 • Zhanming Jie, Wei Lu
Dependency tree structures capture long-distance and syntactic relationships between words in a sentence.
Ranked #1 on Chinese Named Entity Recognition on OntoNotes 5.0
no code implementations • NAACL 2019 • Zhanming Jie, Pengjun Xie, Wei Lu, Ruixue Ding, Linlin Li
Supervised approaches to named entity recognition (NER) are largely developed based on the assumption that the training data is fully annotated with named entity information.
1 code implementation • 19 Oct 2018 • Zhanming Jie, Aldrian Obaja Muis, Wei Lu
It has been shown previously that such information can be used to improve the performance of NER (Sasano and Kurohashi, 2008; Ling and Weld, 2012).
no code implementations • EMNLP 2018 • Zhanming Jie, Wei Lu
We propose a novel dependency-based hybrid tree model for semantic parsing, which converts natural language utterances into machine-interpretable meaning representations.