1 code implementation • 19 Feb 2024 • Zihan Qiu, Zeyu Huang, Youcheng Huang, Jie Fu
The feed-forward networks (FFNs) in transformers are recognized as key-value neural memories that store abstract, high-level knowledge.
no code implementations • 15 Jan 2024 • Youcheng Huang, Wenqiang Lei, Zheng Zhang, Jiancheng Lv, Shuicheng Yan
In this paper, we empirically find that the effects of different contexts on LLMs' recall of the same knowledge follow a Gaussian-like distribution.
1 code implementation • 7 Nov 2022 • Youcheng Huang, Wenqiang Lei, Jie Fu, Jiancheng Lv
Combining large-scale pre-trained models with prototypical neural networks is a de facto paradigm in few-shot named entity recognition.
1 code implementation • ACL 2021 • Fengbin Zhu, Wenqiang Lei, Youcheng Huang, Chao Wang, Shuo Zhang, Jiancheng Lv, Fuli Feng, Tat-Seng Chua
In this work, we extract samples from real financial reports to build a new large-scale QA dataset containing both Tabular And Textual data, named TAT-QA, where numerical reasoning — such as addition, subtraction, multiplication, division, counting, comparison/sorting, and their compositions — is usually required to infer the answer.
Ranked #1 on Question Answering on TAT-QA
no code implementations • 27 Apr 2020 • Youcheng Huang, Tangchen Wei, Jundong Zhou, Chunxin Yang
In this paper, we study how to resolve these conflicts in generative models based on the conditional variational autoencoder (CVAE).