Publication:
Temporal Information Extraction for Question Answering Using Syntactic Dependencies in an LSTM-based Architecture
Yuanliang Meng, Anna Rumshisky, Alexey Romanov • Conference on Empirical Methods in Natural Language Processing (EMNLP) • 01 March 2017
TLDR: This paper proposes a set of simple LSTM-based models with a uniform architecture to recover different kinds of temporal relations from text, using the shortest dependency path between entities as input; it conducts intrinsic evaluation and posts state-of-the-art results on TimeBank-Dense.
Citations: 47
Abstract: In this paper, we propose to use a set of simple LSTM-based models with a uniform architecture to recover different kinds of temporal relations from text. Using the shortest dependency path between entities as input, the same architecture is used to extract intra-sentence, cross-sentence, and document creation time relations. A “double-checking” technique reverses entity pairs in classification, boosting the recall of positive cases and reducing misclassifications between opposite classes. An efficient pruning algorithm resolves conflicts globally. Evaluated on QA-TempEval (SemEval-2015 Task 5), our proposed technique outperforms state-of-the-art methods by a large margin. We also conduct intrinsic evaluation and post state-of-the-art results on TimeBank-Dense.
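The key input representation described in the abstract is the shortest dependency path between two entities, which is then fed to an LSTM classifier. A minimal sketch of extracting such a path is below; the toy sentence, token names, and the choice to treat dependency edges as undirected are illustrative assumptions, not details taken from the paper.

```python
from collections import deque

def shortest_dependency_path(edges, source, target):
    """Shortest path between two tokens in a dependency parse,
    treating (head, dependent) edges as undirected so a path always
    exists within one sentence. Uses breadth-first search."""
    graph = {}
    for head, dep in edges:
        graph.setdefault(head, set()).add(dep)
        graph.setdefault(dep, set()).add(head)
    queue = deque([[source]])
    seen = {source}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == target:
            return path
        for neighbor in graph.get(node, ()):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # target not reachable (e.g., tokens from different trees)

# Toy (hand-made, not from the paper) parse of
# "The bombing happened before the election":
edges = [
    ("happened", "bombing"), ("bombing", "The"),
    ("happened", "before"), ("before", "election"),
    ("election", "the"),
]
print(shortest_dependency_path(edges, "bombing", "election"))
# ['bombing', 'happened', 'before', 'election']
```

In a setup like the paper's, the tokens (and possibly dependency labels) along such a path would be embedded and passed to the LSTM, so that each relation type shares the same input format.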