Good, Better, Best: Choosing Word Embedding Context

We propose two methods of learning vector representations of words and phrases, each of which combines sentence context with structural features extracted from dependency trees. Using several variations of a neural network classifier, we show that these combined representations lead to improved performance when used as input features for supervised term matching.
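To make the two context types concrete, the sketch below (not the authors' implementation) extracts linear sentence-window contexts and dependency-tree contexts in the style of Levy & Goldberg (2014); pairs like these can then be fed to a word2vec-style embedding learner. It assumes spaCy and its `en_core_web_sm` model for dependency parsing.

```python
# Minimal sketch of the two context types: linear window contexts and
# dependency-based contexts. Assumes spaCy for parsing (illustrative only).
import spacy

nlp = spacy.load("en_core_web_sm")  # hypothetical model choice


def linear_contexts(doc, window=2):
    """(word, context-word) pairs from a symmetric token window."""
    pairs = []
    for i, tok in enumerate(doc):
        lo, hi = max(0, i - window), min(len(doc), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((tok.text.lower(), doc[j].text.lower()))
    return pairs


def dependency_contexts(doc):
    """(word, context) pairs where the context encodes the attached word
    plus the dependency relation linking them."""
    pairs = []
    for tok in doc:
        if tok.dep_ == "ROOT":
            continue
        # The dependent sees its head through the relation; the head sees
        # the dependent through the inverse relation.
        pairs.append((tok.text.lower(), f"{tok.head.text.lower()}/{tok.dep_}"))
        pairs.append((tok.head.text.lower(), f"{tok.text.lower()}/{tok.dep_}-1"))
    return pairs


if __name__ == "__main__":
    doc = nlp("Australian scientist discovers star with telescope.")
    print(linear_contexts(doc)[:4])
    print(dependency_contexts(doc)[:4])
```

In practice, the two sets of (word, context) pairs can be trained jointly or concatenated at the representation level before being passed to the downstream term-matching classifier.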
