Publication:
Syntax-Aware Graph Attention Network for Aspect-Level Sentiment Classification
Lianzhe Huang, Xin Sun, Sujian Li, Linhao Zhang, Houfeng Wang • International Conference on Computational Linguistics • 1 December 2020
TLDR: This paper adds syntactic awareness to the model via a graph attention network over the dependency tree structure, and external pre-training knowledge via the BERT language model, which helps to better model the interaction between the context and aspect words.
Citations: 58
Abstract: Aspect-level sentiment classification aims to distinguish the sentiment polarities over aspect terms in a sentence. Existing approaches mostly focus on modeling the relationship between the given aspect words and their contexts with attention, and ignore more elaborate knowledge implicit in the context. In this paper, we add syntactic awareness to the model via a graph attention network over the dependency tree structure, and external pre-training knowledge via the BERT language model, which helps to better model the interaction between the context and aspect words. Furthermore, the subwords of BERT are integrated into the dependency tree graphs, which yields more accurate word representations through graph attention. Experiments demonstrate the effectiveness of our model.
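As a rough illustration of the core idea described in the abstract (not the authors' implementation), a single graph-attention head that restricts each word's attention to its dependency-tree neighbors can be sketched in NumPy; `graph_attention`, `H`, `A`, and the other names below are illustrative:

```python
import numpy as np

def masked_softmax(scores, mask):
    # Zero out attention to non-neighbors by masking their logits.
    scores = np.where(mask, scores, -1e9)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def graph_attention(H, A, W, a):
    """One graph-attention head: each token attends only to its
    dependency-tree neighbors (A = adjacency matrix with self-loops)."""
    Z = H @ W                       # (n, d') projected token features
    n = Z.shape[0]
    scores = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            # Standard GAT scoring: LeakyReLU(a^T [Wh_i || Wh_j])
            s = np.concatenate([Z[i], Z[j]]) @ a
            scores[i, j] = s if s > 0 else 0.2 * s
    alpha = masked_softmax(scores, A.astype(bool))
    return alpha @ Z                # (n, d') syntax-aware representations

# Toy example: 3 tokens, dependency edges (0-1) and (1-2), plus self-loops.
rng = np.random.default_rng(0)
H = rng.normal(size=(3, 4))         # contextual embeddings (e.g. from BERT)
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]])
W = rng.normal(size=(4, 4))
a = rng.normal(size=(8,))
out = graph_attention(H, A, W, a)   # (3, 4) updated token representations
```

In the paper's setting, the adjacency matrix would come from a dependency parse with BERT subwords merged into the tree nodes; here it is hard-coded for brevity.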
Topics: Graphs · Language Models & Neural Networks · Knowledge Bases · Knowledge Representation · Language Model Mechanisms · Syntactic Text Processing · Natural Language Processing · Information Extraction & Text Mining · Multimodality · Aspect-based Sentiment Analysis · Structured Data in NLP · Sentiment Analysis · Semantic Text Processing · Information Retrieval · Syntactic Parsing · Attention · Text Classification