Abstractive summarization of hospitalisation histories with transformer networks
In this paper we present a novel approach to the abstractive summarization of patient hospitalisation histories. We applied an encoder-decoder framework with a Longformer neural network as the encoder and BERT as the decoder. Our experiments show improved quality on some summarization tasks compared with pointer-generator networks. We also conducted a study in which experienced physicians evaluated our model's outputs against a pointer-generator network (PGN) baseline and human-written abstracts; their assessments confirmed the effectiveness of our model.
Methods
• Adam
• AdamW
• Attention Dropout
• BERT
• Dense Connections
• Dilated Sliding Window Attention
• Dropout
• GELU
• Global and Sliding Window Attention
• Layer Normalization
• Linear Layer
• Linear Warmup With Linear Decay
• Longformer
• Multi-Head Attention
• Residual Connection
• Scaled Dot-Product Attention
• Sliding Window Attention
• Softmax
• Weight Decay
• WordPiece
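Several of the listed methods (sliding window attention, global and sliding window attention) are specific to the Longformer encoder. The sketch below, again assuming Hugging Face Transformers and the public Longformer checkpoint, shows how the local attention window and the global-attention mask are typically configured; it is not taken from the paper.

```python
# Hedged sketch of Longformer's sliding-window + global attention configuration.
import torch
from transformers import LongformerModel, LongformerTokenizerFast

ckpt = "allenai/longformer-base-4096"  # assumed checkpoint
tokenizer = LongformerTokenizerFast.from_pretrained(ckpt)
model = LongformerModel.from_pretrained(
    ckpt,
    attention_window=512,  # size of the local sliding attention window per layer
)

text = "Example hospitalisation note ..."  # placeholder input
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=4096)

# Global attention: tokens marked with 1 attend to, and are attended by, every position.
# A common choice is to mark only the first (<s>) token.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

outputs = model(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
    global_attention_mask=global_attention_mask,
)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```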