
Publication:

A Knowledge-Enhanced Pretraining Model for Commonsense Story Generation

Jian Guan, Fei Huang, Zhihao Zhao, Xiaoyan Zhu, Minlie Huang • Transactions of the Association for Computational Linguistics • 01 January 2020

TLDR: A knowledge-enhanced pretraining model that uses commonsense knowledge from external knowledge bases to generate stories that are more reasonable than those of state-of-the-art baselines, particularly in terms of logic and global coherence.

Citations: 219
Abstract: Story generation, namely, generating a reasonable story from a leading context, is an important but challenging task. In spite of the success in modeling fluency and local coherence, existing neural language generation models (e.g., GPT-2) still suffer from repetition, logic conflicts, and lack of long-range coherence in generated stories. We conjecture that this is because of the difficulty of associating relevant commonsense knowledge, understanding the causal relationships, and planning entities and events with proper temporal order. In this paper, we devise a knowledge-enhanced pretraining model for commonsense story generation. We propose to utilize commonsense knowledge from external knowledge bases to generate reasonable stories. To further capture the causal and temporal dependencies between the sentences in a reasonable story, we use multi-task learning, which combines a discriminative objective to distinguish true and fake stories during fine-tuning. Automatic and manual evaluation shows that our model can generate more reasonable stories than state-of-the-art baselines, particularly in terms of logic and global coherence.
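The abstract names two concrete mechanisms: post-training the language model on commonsense knowledge verbalized into natural-language sentences, and fine-tuning with a multi-task objective that adds a true-vs-fake story classifier to the usual language-modeling loss. The sketch below illustrates both in Python, assuming GPT-2 via Hugging Face transformers; the verbalization templates, the classifier head, the fake-story source, and the loss weight alpha are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of the two ideas in the abstract, assuming GPT-2 via
# Hugging Face transformers. Templates, classifier head, fake-story
# construction, and `alpha` are illustrative assumptions, not the
# authors' released implementation.
import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

device = "cuda" if torch.cuda.is_available() else "cpu"
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2").to(device)

# --- Knowledge-enhanced post-training ---------------------------------
def triple_to_sentence(head: str, rel: str, tail: str) -> str:
    """Verbalize a commonsense triple (e.g., from an external knowledge
    base) so the LM can be post-trained on it with the ordinary
    language-modeling objective. Template set is an assumption."""
    templates = {"IsA": f"{head} is a {tail}.",
                 "CapableOf": f"{head} can {tail}."}
    return templates.get(rel, f"{head} {rel} {tail}.")

def lm_loss(text: str) -> torch.Tensor:
    """Standard next-token prediction loss on a piece of text."""
    ids = tokenizer(text, return_tensors="pt").input_ids.to(device)
    return model(ids, labels=ids).loss

# --- Multi-task fine-tuning --------------------------------------------
# Binary head over the final hidden state of the last token: true vs. fake.
clf_head = nn.Linear(model.config.n_embd, 2).to(device)
xent = nn.CrossEntropyLoss()
alpha = 1.0  # relative weight of the discriminative loss (assumed)

def last_token_state(text: str) -> torch.Tensor:
    ids = tokenizer(text, return_tensors="pt").input_ids.to(device)
    out = model(ids, output_hidden_states=True)
    return out.hidden_states[-1][:, -1, :]  # shape [1, n_embd]

def multitask_loss(true_story: str, fake_story: str) -> torch.Tensor:
    """LM loss on the true story plus a classification loss that pushes
    the model to separate true stories from corrupted ones."""
    feats = torch.cat([last_token_state(true_story),
                       last_token_state(fake_story)], dim=0)
    logits = clf_head(feats)
    labels = torch.tensor([1, 0], device=device)  # 1 = true, 0 = fake
    return lm_loss(true_story) + alpha * xent(logits, labels)
```

In practice these losses would be computed over batches, and, per the abstract's emphasis on causal and temporal dependencies, the fake stories would be built by perturbing real stories, for example by reordering their sentences.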
