Kexiang Wang, Tianyu Liu, Zhifang Sui, and Baobao Chang. 2017. Affinity-Preserving Random Walk for Multi-Document Summarization. In Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, pages 210–220, Copenhagen, Denmark. Association for Computational Linguistics.

Anthology ID: D17-1020
Volume: Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Month: September
Year: 2017
Address: Copenhagen, Denmark
Venue: EMNLP
SIG: SIGDAT
Publisher: Association for Computational Linguistics
Pages: 210–220
DOI: 10.18653/v1/D17-1020
Bibkey: wang-etal-2017-affinity

Abstract: Multi-document summarization provides users with a short text that summarizes the information in a set of related documents. This paper introduces affinity-preserving random walk to the summarization task, which preserves the affinity relations of sentences by an absorbing random walk model. Meanwhile, we put forward adjustable affinity-preserving random walk to enforce the diversity constraint of summarization in the random walk process. The ROUGE evaluations on the DUC 2003 topic-focused summarization task and the DUC 2004 generic summarization task show the good performance of our method, which has the best ROUGE-2 recall among the graph-based ranking methods.
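The abstract names the key mechanism (ranking sentences by a random walk over an affinity graph, then enforcing diversity by turning already-selected sentences into absorbing states) but does not spell out the equations. The sketch below is only a rough illustration of that general absorbing-random-walk idea, in the spirit of classic diverse-ranking methods, not the authors' exact formulation: the function names (`affinity_matrix`, `rank_with_absorbing_walk`), the cosine affinity, and the `damping` parameter are all assumptions introduced here for the example.

```python
import numpy as np

def affinity_matrix(sentences, vectorize):
    """Cosine-similarity affinity between sentence vectors (illustrative choice)."""
    X = np.array([vectorize(s) for s in sentences], dtype=float)
    X /= np.linalg.norm(X, axis=1, keepdims=True) + 1e-12
    return np.clip(X @ X.T, 0.0, None)

def rank_with_absorbing_walk(W, k, damping=0.85):
    """Select k sentences: the most central one first, then repeatedly make the
    chosen sentences absorbing states and pick the most-visited remaining state,
    which discourages picking near-duplicates of already-selected sentences."""
    n = W.shape[0]
    P = W / W.sum(axis=1, keepdims=True)           # row-stochastic transition matrix
    P = damping * P + (1 - damping) / n            # teleportation keeps the chain ergodic

    # stationary distribution via power iteration; highest mass = most central sentence
    pi = np.full(n, 1.0 / n)
    for _ in range(200):
        pi = pi @ P
    ranked = [int(np.argmax(pi))]

    while len(ranked) < k:
        rest = [i for i in range(n) if i not in ranked]
        Q = P[np.ix_(rest, rest)]                  # walk restricted to non-absorbed states
        N = np.linalg.inv(np.eye(len(rest)) - Q)   # fundamental matrix: expected visit counts
        visits = N.mean(axis=0)                    # average visits before absorption
        ranked.append(rest[int(np.argmax(visits))])
    return ranked

# Toy usage with a hypothetical bag-of-words vectorizer.
sentences = ["the cat sat", "a cat sat down", "stocks fell sharply", "markets fell today"]
vocab = sorted({w for s in sentences for w in s.split()})
vectorize = lambda s: np.array([s.split().count(w) for w in vocab], dtype=float)
W = affinity_matrix(sentences, vectorize)
print(rank_with_absorbing_walk(W, k=2))  # indices of two central, mutually diverse sentences
```

Making selected sentences absorbing is what produces the diversity effect: walks that pass near an already-chosen sentence get absorbed quickly, so its near-duplicates accumulate few expected visits and drop in the ranking. How the paper's adjustable variant tunes this trade-off is not stated in the abstract, so it is not modeled here.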