1 code implementation • 29 Feb 2024 • Suyuchen Wang, Ivan Kobyzev, Peng Lu, Mehdi Rezagholizadeh, Bang Liu
This paper addresses the challenge of train-short-test-long (TSTL) scenarios in Large Language Models (LLMs) equipped with Rotary Position Embedding (RoPE), where models pre-trained on shorter sequences face difficulty with out-of-distribution (OOD) token positions in longer sequences.
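The RoPE mechanism this abstract refers to can be sketched as follows: each 2-D pair of query/key dimensions is rotated by an angle proportional to the token position, so attention scores depend only on relative position. This is a minimal NumPy illustration of the standard RoPE formulation, not the paper's code; the function name and vector shapes are illustrative. Positions beyond the pre-training length produce rotation angles never seen during training, which is the OOD problem the paper targets.

```python
import numpy as np

def rope_rotate(x, pos, base=10000.0):
    """Apply Rotary Position Embedding to a vector x at token position pos.

    x: 1-D array of even length d; each consecutive (x[2i], x[2i+1]) pair
    is rotated by the angle pos * base**(-2i/d).
    """
    d = x.shape[0]
    theta = base ** (-np.arange(0, d, 2) / d)  # per-pair frequencies
    angles = pos * theta
    cos, sin = np.cos(angles), np.sin(angles)
    x1, x2 = x[0::2], x[1::2]
    out = np.empty_like(x, dtype=float)
    out[0::2] = x1 * cos - x2 * sin  # 2-D rotation of each pair
    out[1::2] = x1 * sin + x2 * cos
    return out

# Key property: the dot product of a rotated query and key depends
# only on the relative offset between their positions.
q = np.array([1.0, 0.5, -0.3, 0.8])
k = np.array([0.2, -0.7, 0.9, 0.1])
s1 = np.dot(rope_rotate(q, 5), rope_rotate(k, 3))   # offset 2
s2 = np.dot(rope_rotate(q, 7), rope_rotate(k, 5))   # offset 2, shifted
assert np.isclose(s1, s2)
```

Because the score is a function of relative offset only, extrapolating to longer sequences fails not at the score level but because absolute positions past the training length yield rotation angles outside the distribution the model was trained on.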
no code implementations • 16 Sep 2023 • Hossein Rajabzadeh, Suyuchen Wang, Hyock Ju Kwon, Bang Liu
We employ a tool-interacting divide-and-conquer strategy that enables large language models (LLMs) to answer complex multimodal multi-hop questions.
1 code implementation • 27 Jan 2021 • Suyuchen Wang, Ruihui Zhao, Xi Chen, Yefeng Zheng, Bang Liu
A taxonomy is a hierarchically structured knowledge graph that plays a crucial role in machine intelligence.