1 code implementation • 1 Aug 2023 • Jinglei Zhang, Tiancheng Lin, Yi Xu, Kai Chen, Rui Zhang
We argue that such prior contextual information can be interpreted as relations among textual primitives, arising from the heterogeneity of text and background, and can thus provide effective self-supervised labels for representation learning.
no code implementations • 29 Dec 2022 • Bo Li, Wei Ye, Jinglei Zhang, Shikun Zhang
Specifically, for a given sample, we build a label graph to review candidate labels in the Top-k prediction set and learn the connections between them.
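The label-graph idea above can be sketched in a few lines: select the Top-k candidate labels from a sample's prediction scores, connect them as graph nodes, and propagate information between them. This is a minimal illustration, not the paper's method — the function names are hypothetical, and the paper's learned edge connections are simplified here to uniform fully connected edges with one round of mean-aggregation message passing.

```python
import numpy as np

def build_label_graph(logits, k=3):
    """Select the Top-k candidate labels and connect them in a fully
    connected graph (simplification: the learned connections from the
    paper are replaced by uniform edges with no self-loops)."""
    topk = np.argsort(logits)[::-1][:k]   # indices of the k highest scores
    n = len(topk)
    adj = np.ones((n, n)) - np.eye(n)     # fully connected adjacency, no self-loops
    return topk, adj

def message_pass(node_feats, adj):
    """One round of mean-aggregation message passing over the label graph,
    letting each candidate label 'review' its neighbors."""
    deg = adj.sum(axis=1, keepdims=True)
    return adj @ node_feats / np.maximum(deg, 1)

# Toy example: 5 candidate labels, keep the Top-3 as graph nodes.
logits = np.array([0.1, 2.3, 0.7, 1.9, 0.2])
topk, adj = build_label_graph(logits, k=3)
feats = np.eye(3)                          # one-hot node features for the sketch
updated = message_pass(feats, adj)
```

In the actual model the edges and the aggregation would be learned end-to-end; the sketch only shows the graph-construction step over the Top-k set.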
Ranked #2 on Relation Extraction on TACRED-Revisited
1 code implementation • 29 Dec 2022 • Bo Li, Dingyao Yu, Wei Ye, Jinglei Zhang, Shikun Zhang
Sequence generation demonstrates promising performance in recent information extraction efforts, by incorporating large-scale pre-trained Seq2Seq models.
Ranked #1 on Relation Extraction on sciERC-sent
no code implementations • 17 Feb 2021 • Yasar Atas, Jinglei Zhang, Randy Lewis, Amin Jahanpour, Jan F. Haase, Christine A. Muschik
We realize, for the first time, a non-Abelian gauge theory with both gauge and matter fields on a quantum computer.
Quantum Physics • High Energy Physics - Lattice
no code implementations • 24 Feb 2020 • Wei Ye, Rui Xie, Jinglei Zhang, Tianxiang Hu, Xiaoyin Wang, Shikun Zhang
Since both tasks aim to model the association between natural language and programming languages, recent studies have combined them to improve their performance.