Japanese Word Segmentation
3 papers with code • 1 benchmark • 0 datasets
Japanese word segmentation is the task of splitting unspaced Japanese text into its constituent words. Because Japanese is written without spaces between words, segmentation is a prerequisite for most downstream NLP tasks.
Most implemented papers
LATTE: Lattice ATTentive Encoding for Character-based Word Segmentation
Our model employs a lattice structure to handle segmentation alternatives and uses graph neural networks with an attention mechanism to extract multi-granularity representations from the lattice, complementing the character representations.
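The lattice here is a graph whose edges are candidate words spanning character positions. A toy construction sketch against a small dictionary (the sentence, the dictionary, and the function name are illustrative, not taken from the paper):

```python
def build_lattice(sentence, vocab):
    """Return edges (start, end, word) for every dictionary word found
    in the sentence; single characters are always included so the
    lattice stays connected even for out-of-vocabulary spans."""
    edges = []
    n = len(sentence)
    for i in range(n):
        for j in range(i + 1, n + 1):
            piece = sentence[i:j]
            if piece in vocab or j == i + 1:  # dictionary hit or fallback char
                edges.append((i, j, piece))
    return edges

vocab = {"学生", "です"}  # toy dictionary for illustration
print(build_lattice("学生です", vocab))
# → [(0, 1, '学'), (0, 2, '学生'), (1, 2, '生'), (2, 3, 'で'), (2, 4, 'です'), (3, 4, 'す')]
```

Each path from position 0 to the end of the sentence corresponds to one segmentation alternative; the paper's encoder attends over such alternatives rather than committing to one up front.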
Incorporating Word Attention into Character-Based Word Segmentation
Neural network models have been actively applied to word segmentation, especially for Chinese, because of their ability to minimize feature-engineering effort.
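Character-based segmenters like these typically tag each character with a boundary label (e.g., the BMES scheme: Begin, Middle, End, Single) and recover words from the tag sequence. A minimal sketch of that decoding step, with a hand-written tag sequence standing in for model output:

```python
def decode_bmes(chars, tags):
    """Recover words from per-character BMES tags.

    B = begins a multi-character word, M = middle character,
    E = ends a word, S = single-character word.
    """
    words, current = [], []
    for ch, tag in zip(chars, tags):
        current.append(ch)
        if tag in ("E", "S"):  # a word boundary has been reached
            words.append("".join(current))
            current = []
    if current:  # tolerate a sequence that ends mid-word
        words.append("".join(current))
    return words

# "私は学生です" with hand-assigned tags (a trained tagger would supply these)
print(decode_bmes("私は学生です", ["S", "S", "B", "E", "B", "E"]))
# → ['私', 'は', '学生', 'です']
```

The paper's contribution is in how the per-character tag scores are computed (adding word-level attention); the decoding from tags to words stays the same.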