Cross-lingual Word Sense Disambiguation using mBERT Embeddings with Syntactic Dependencies

9 Dec 2020 · Xingran Zhu

Cross-lingual word sense disambiguation (WSD) tackles the challenge of disambiguating ambiguous words across languages given context. The pre-trained BERT embedding model has proven effective at extracting contextual information for words and has been incorporated as a feature into many state-of-the-art WSD systems. To investigate how syntactic information can be added to BERT embeddings to yield word embeddings that incorporate both semantics and syntax, this project proposes concatenated embeddings, produced by generating dependency parse trees and encoding the relative relationships between words into the input embeddings. Two methods are also proposed to reduce the size of the concatenated embeddings. The experimental results show that the high dimensionality of the syntax-incorporated embeddings constitutes an obstacle for the classification task, which needs to be addressed in future studies.
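Since the abstract only sketches the approach, the following is a minimal illustrative sketch of the core idea: concatenating mBERT contextual vectors with an encoding of each word's dependency relation. The model name, the tiny relation inventory, the one-hot encoding, and the hand-supplied parse labels are all assumptions made for illustration; the paper's actual dependency encoding and its two size-reduction methods are not specified in the abstract.

```python
# Illustrative sketch (not the paper's exact pipeline): mBERT contextual
# vectors concatenated with a one-hot encoding of each word's dependency
# relation. Relation labels are supplied by hand here; a real system
# would obtain them from a dependency parser.
import numpy as np
import torch
from transformers import BertModel, BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")
model = BertModel.from_pretrained("bert-base-multilingual-cased")
model.eval()

# Hypothetical relation inventory; in practice this would be the parser's
# full label set (e.g. Universal Dependencies relations).
DEP_RELATIONS = ["det", "nsubj", "root", "obj", "amod", "punct"]
REL2IDX = {rel: i for i, rel in enumerate(DEP_RELATIONS)}

def one_hot_relation(rel: str) -> np.ndarray:
    """One-hot vector for a token's dependency relation label."""
    vec = np.zeros(len(DEP_RELATIONS), dtype=np.float32)
    vec[REL2IDX[rel]] = 1.0
    return vec

def syntax_incorporated_embeddings(words, relations):
    """Concatenate mBERT word vectors with dependency-relation one-hots."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]  # (num_subwords, 768)

    vectors = []
    for word_idx, rel in enumerate(relations):
        # Average the sub-word pieces belonging to this word.
        piece_ids = [i for i, w in enumerate(enc.word_ids()) if w == word_idx]
        bert_vec = hidden[piece_ids].mean(dim=0).numpy()
        vectors.append(np.concatenate([bert_vec, one_hot_relation(rel)]))
    return np.stack(vectors)  # (num_words, 768 + num_relations)

words = ["The", "bank", "approved", "the", "loan"]
relations = ["det", "nsubj", "root", "det", "obj"]  # hand-labeled for the demo
X = syntax_incorporated_embeddings(words, relations)
print(X.shape)  # (5, 774)
```

Note that the concatenation inflates each vector from 768 dimensions to 768 plus the size of the relation inventory, which grows further if richer syntactic features (e.g. head positions or full paths) are encoded; this dimensionality growth is the obstacle the abstract reports and the motivation for the proposed size-reduction methods.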
