In Conclusion Not Repetition: Comprehensive Abstractive Summarization With Diversified Attention Based On Determinantal Point Processes
Affiliation:
School of Computer Science, Beijing University of Posts and Telecommunications
Research Group:
Center for Intelligence Science and Technology
Venue:
Proceedings of the 23rd Conference on Computational Natural Language Learning
Keywords:
abstractive summarization
Abstract:
Various Seq2Seq learning models designed for machine translation have recently been applied to the abstractive summarization task. Although these models achieve high ROUGE scores, they are limited in generating comprehensive summaries with a high level of abstraction due to their degenerated attention distributions. We introduce the Diverse Convolutional Seq2Seq Model (DivCNN Seq2Seq), which uses Determinantal Point Process methods (Micro DPPs and Macro DPPs) to produce attention distributions that consider both quality and diversity.
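For context, the standard L-ensemble DPP formulation referenced in the abstract assigns subset probabilities P(Y) ∝ det(L_Y), with the kernel factored into per-item quality scores and diversity features. The NumPy sketch below illustrates only this generic formulation; the Micro/Macro DPP variants and the attention-specific kernel construction are the paper's own and are not reproduced here, and the quality scores q and feature vectors phi are placeholder values.

```python
import numpy as np

def dpp_log_prob(L, subset):
    """Log-probability of `subset` under an L-ensemble DPP:
    P(Y) = det(L_Y) / det(L + I)."""
    L_Y = L[np.ix_(subset, subset)]
    _, logdet_Y = np.linalg.slogdet(L_Y)
    _, logdet_Z = np.linalg.slogdet(L + np.eye(L.shape[0]))
    return logdet_Y - logdet_Z

# Quality-diversity decomposition: L[i, j] = q[i] * (phi[i] . phi[j]) * q[j],
# where q[i] scores item quality and phi[i] is a unit-norm diversity feature.
rng = np.random.default_rng(0)
n, d = 6, 4
q = rng.uniform(0.5, 1.5, size=n)                    # placeholder quality scores
phi = rng.normal(size=(n, d))
phi /= np.linalg.norm(phi, axis=1, keepdims=True)    # unit diversity features
L = (q[:, None] * q[None, :]) * (phi @ phi.T)

# Diverse, high-quality subsets receive higher probability than redundant ones.
print(dpp_log_prob(L, [0, 2, 5]))
```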
Paper Type:
Conference proceedings
First Author:
Lei Li (李蕾)
Co-authors:
Marina Litvak, Natalia Vanetik, Zuying Huang (黄祖莹)
Corresponding Author:
Wei Liu (刘维)
Discipline Category:
Engineering
First-level Discipline:
Computer Science and Technology*
Document Type:
C
Pages:
822–832
Word Count:
6000
Translation:
No
Date of Publication:
2019-11-03