no code implementations • 25 Mar 2021 • Ruiqing Yan, Lanchang Sun, Fang Wang, XiaoMing Zhang
Though pre-trained language models such as BERT and XLNet have rapidly advanced the state of the art on many NLP tasks, they capture semantics only implicitly, relying on surface co-occurrence information between words in the corpus.