Journal of ZheJiang University (Engineering Science)  2023, Vol. 57 Issue (3): 437-445    DOI: 10.3785/j.issn.1008-973X.2023.03.001
    
Aspect level sentiment analysis based on relation gated graph convolutional network
Yan-fen CHENG, Jia-jun WU, Fan HE
School of Computer Science and Artificial Intelligence, Wuhan University of Technology, Wuhan 430070, China

Abstract  

In aspect level sentiment analysis, existing methods struggle to utilize the types of syntactic relations effectively, and model performance depends on the accuracy of dependency parsing. To resolve these challenges, an attention augmented relation gated graph convolutional network (ARGCN) model was proposed. The model uses a bidirectional long short-term memory (BiLSTM) network to learn the sequential features of sentences, and combines these features with the dependency probability matrix to construct a word graph. The model then uses a relation gated graph convolutional network (RG-GCN) and an attention augmented network (AAN) to obtain the sentiment features of aspect words from the word graph and from the sequential features of the sentence, respectively. Finally, the outputs of RG-GCN and AAN are concatenated as the final sentiment feature of the aspect words. Comparative experiments and ablation experiments were conducted on the SemEval 2014 and Twitter datasets. The results show that the ARGCN model can effectively utilize relation types, reduce the impact of dependency parsing accuracy on model performance, and better establish the connection between aspect words and opinion words. The accuracy of the model exceeds that of all baseline models.
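The abstract describes the overall data flow of the model: BiLSTM sequential features, a word graph built from the dependency probability matrix, a relation gated graph-convolution branch, an attention branch, and concatenation of the two aspect representations. The following is a minimal PyTorch sketch of that flow; the layer sizes, module names and the simplified gating and attention formulas are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn

class ARGCNSketch(nn.Module):
    # Illustrative skeleton of the data flow described in the abstract;
    # the exact relation gating and attention augmentation are simplified.
    def __init__(self, emb_dim=300, hid_dim=128, num_classes=3):
        super().__init__()
        self.bilstm = nn.LSTM(emb_dim, hid_dim, bidirectional=True, batch_first=True)
        self.gcn_w = nn.Linear(2 * hid_dim, 2 * hid_dim)   # graph-convolution weights
        self.gate = nn.Linear(2 * hid_dim, 2 * hid_dim)    # gate over neighbor messages
        self.attn = nn.MultiheadAttention(2 * hid_dim, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(4 * hid_dim, num_classes)

    def forward(self, word_emb, dep_prob, aspect_mask):
        # word_emb:    (B, N, emb_dim)  pretrained word vectors (e.g. GloVe)
        # dep_prob:    (B, N, N)        dependency probability matrix (word graph)
        # aspect_mask: (B, N)           1 at aspect-word positions, 0 elsewhere
        h, _ = self.bilstm(word_emb)                  # sequential features of the sentence
        msg = torch.bmm(dep_prob, self.gcn_w(h))      # aggregate neighbors over the word graph
        h_gcn = torch.relu(torch.sigmoid(self.gate(h)) * msg)   # gated graph-convolution branch
        h_aan, _ = self.attn(h, h, h)                 # attention branch over sequential features
        mask = aspect_mask.unsqueeze(-1).float()
        pool = lambda x: (x * mask).sum(1) / mask.sum(1).clamp(min=1)  # average over aspect words
        feat = torch.cat([pool(h_gcn), pool(h_aan)], dim=-1)    # concatenate both branches
        return self.classifier(feat)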



Key words: aspect level sentiment analysis; graph convolutional network; attention mechanism; dependency tree; gate mechanism; natural language processing
Received: 01 April 2022      Published: 31 March 2023
CLC:  TP 391.1  
Cite this article:

Yan-fen CHENG,Jia-jun WU,Fan HE. Aspect level sentiment analysis based on relation gated graph convolutional network. Journal of ZheJiang University (Engineering Science), 2023, 57(3): 437-445.

URL:

https://www.zjujournals.com/eng/10.3785/j.issn.1008-973X.2023.03.001     OR     https://www.zjujournals.com/eng/Y2023/V57/I3/437


Fig.1 Structural diagram of dependency tree
Fig.2 Architecture of attention augmented relation gated graph convolutional network
Fig.3 Comparison of dependency probability matrix and adjacency matrix
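Fig. 3 contrasts the soft dependency probability matrix with a hard 0/1 adjacency matrix. As a hedged illustration of the hard-edge baseline only, the snippet below builds a symmetric adjacency matrix (with self-loops) from an ordinary dependency parse; spaCy is used purely for convenience and is not necessarily the parser used in the paper, which lists the parser of Mrini et al. [28] among its references.

import numpy as np
import spacy

nlp = spacy.load("en_core_web_sm")

def adjacency_matrix(sentence: str) -> np.ndarray:
    doc = nlp(sentence)
    adj = np.eye(len(doc))                   # self-loops, as is common for GCN inputs
    for tok in doc:
        if tok.i != tok.head.i:              # skip the root's self-arc
            adj[tok.i, tok.head.i] = 1.0     # one undirected edge per dependency arc
            adj[tok.head.i, tok.i] = 1.0
    return adj

print(adjacency_matrix("The food not worth the price."))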
Fig.4 Architecture of attention augmentation network
Dataset     | Training (Ntr): Positive / Neutral / Negative | Test (Nte): Positive / Neutral / Negative
Restaurant  | 2164 / 637 / 807                              | 727 / 196 / 196
Laptop      | 976 / 455 / 851                               | 337 / 167 / 128
Twitter     | 1507 / 3016 / 1528                            | 172 / 336 / 169
Tab.1 Sample label distribution for each dataset
Model       | Restaurant Acc/%  MF1/% | Laptop Acc/%  MF1/% | Twitter Acc/%  MF1/%
CDT         | 82.30  74.02            | 77.19  72.99        | 74.66  73.66
IGATs       | 82.32  73.99            | 76.02  72.05        | 75.29  73.40
R-GAT       | 83.30  76.08            | 77.42  73.76        | 75.57  73.82
DGEDT       | 83.90  75.10            | 76.80  72.30        | 74.80  73.40
DualGCN     | 84.27  78.08            | 78.48  74.74        | 75.92  74.29
ARGCN       | 84.63  78.14            | 79.27  75.68        | 76.07  74.58
ARGCN+BERT  | 86.60  77.73            | 81.01  77.73        | 77.10  76.07
Tab.2 Comparison of classification accuracy and macro-F1 score of different models on three datasets
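Tab. 2 reports classification accuracy (Acc) and macro-F1 (MF1), both in %. A minimal sketch of how these two metrics are typically computed for the three-class (negative/neutral/positive) task is shown below, assuming scikit-learn; the paper does not state which evaluation code it used, and the labels here are illustrative only.

from sklearn.metrics import accuracy_score, f1_score

y_true = [2, 1, 0, 2, 1]   # 0 = negative, 1 = neutral, 2 = positive (illustrative labels)
y_pred = [2, 1, 1, 2, 0]

acc = accuracy_score(y_true, y_pred) * 100              # Acc/%
mf1 = f1_score(y_true, y_pred, average="macro") * 100   # MF1/%: unweighted mean of per-class F1
print(f"Acc = {acc:.2f}%, MF1 = {mf1:.2f}%")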
Model   | Restaurant Acc/% | Laptop Acc/% | Twitter Acc/%
GCN     | 82.31            | 75.95        | 74.59
RG-GCN  | 83.20            | 77.22        | 74.89
AAN     | 83.47            | 78.32        | 75.33
ARGCN   | 84.63            | 79.27        | 76.06
Tab.3 Comparison of ablation experiment results for aspect-level sentiment analysis prediction
No.  Example sentence  GCN  RG-GCN  AAN  ARGCN
1 The $\underline{{\rm{food}}}$ not worth the price. N√ N√ N√
2 The $\underline{{\rm{settings}}}$ are not user-friendly either. N√ N√
3 I thought that is will be fine, if I do some $\underline{{\rm{settings}}}$. O√ O√ O√
Tab.4 Examples from test dataset and predictions of each model
Fig.5 Dependency tree and attention weight of example 2
Model    | η/10^6
CDT      | 0.41
DualGCN  | 0.61
ARGCN    | 0.64
RGAT     | 1.10
IGATs    | 1.81
DGEDT    | 2.15
Tab.5 Number of trainable parameters for common models
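Tab. 5 compares models by the number of trainable parameters η, in units of 10^6. For a PyTorch model this quantity can be obtained with a one-line reduction, sketched below on a small stand-in module (not any of the compared models):

import torch.nn as nn

model = nn.Sequential(nn.Linear(300, 256), nn.ReLU(), nn.Linear(256, 3))    # stand-in model only
eta = sum(p.numel() for p in model.parameters() if p.requires_grad) / 1e6   # trainable parameters, in 10^6
print(f"eta = {eta:.3f} x 10^6")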
Fig.6 Performance of attention augmented relation gated graph convolutional network model varies with number of layers on three datasets
Fig.7 Performance of attention augmented relation gated graph convolutional network model varies with regularization factors
[1]   KIPF T N, WELLING M. Semi-supervised classification with graph convolutional networks [EB/OL]. (2017-02-22)[2022-04-01]. https://arxiv.org/pdf/1609.02907.pdf.
[2]   VELIČKOVIĆ P, CUCURULL G, CASANOVA A, et al. Graph attention networks [EB/OL]. (2018-02-04)[2022-04-01]. https://arxiv.org/pdf/1710.10903.pdf.
[3]   SCHLICHTKRULL M, KIPF T N, BLOEM P, et al. Modeling relational data with graph convolutional networks [C]// European Semantic Web Conference. [S. l.]: Springer, 2018: 593-607.
[4]   KIRITCHENKO S, ZHU X, CHERRY C, et al. NRC-Canada-2014: detecting aspects and sentiment in customer reviews [C]// Proceedings of the 8th International Workshop on Semantic Evaluation. Dublin: Association for Computational Linguistics, 2014: 437-442.
[5]   WAGNER J, ARORA P, CORTES S, et al. DCU: aspect-based polarity classification for SemEval task 4 [C]// Proceedings of the 8th International Workshop on Semantic Evaluation. Dublin: Association for Computational Linguistics, 2014: 223–229.
[6]   TANG D, QIN B, LIU T. Aspect level sentiment classification with deep memory network [EB/OL]. (2016-09-24)[2022-04-01]. https://arxiv.org/pdf/1605.08900.pdf.
[7]   CHEN P, SUN Z, BING L, et al. Recurrent attention network on memory for aspect sentiment analysis [C]// Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. Copenhagen: Association for Computational Linguistics, 2017: 452-461.
[8]   MA D, LI S, ZHANG X, et al. Interactive attention networks for aspect-level sentiment classification [EB/OL]. (2017-09-04)[2022-04-01].https://arxiv.org/pdf/1709.00893.pdf.
[9]   ZENG B, YANG H, XU R, et al. LCF: a local context focus mechanism for aspect-based sentiment classification [J]. Applied Sciences, 2019, 9(16): 3389. doi: 10.3390/app9163389
[10]   SONG Y, WANG J, JIANG T, et al. Attentional encoder network for targeted sentiment classification [EB/OL]. (2019-04-01)[2022-04-01]. https://arxiv.org/pdf/1902.09314.pdf.
[11]   XU Q, ZHU L, DAI T, et al. Aspect-based sentiment classification with multi-attention network [J]. Neurocomputing, 2020, 388: 135-143. doi: 10.1016/j.neucom.2020.01.024
[12]   SONG W, WEN Z, XIAO Z, et al. Semantics perception and refinement network for aspect-based sentiment analysis [J]. Knowledge-Based Systems, 2021, 214: 106755. doi: 10.1016/j.knosys.2021.106755
[13]   SONG Wei, WEN Zi-jian. Feature dual distillation network for aspect-based sentiment analysis [J]. Journal of Chinese Information Processing, 2021, 35(7): 126-133.
[14]   MAO Teng-yue, ZHENG Zhi-peng, ZHENG Lu. Aspect-level sentiment analysis based on improved self-attention mechanism [J]. Journal of South-Central University for Nationalities: Natural Science Edition, 2022, 41(1): 94-100.
[15]   SUN K, ZHANG R, MENSAH S, et al. Aspect-level sentiment analysis via convolution over dependency tree [C]// Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing. Hong Kong: Association for Computational Linguistics, 2019: 5679-5688.
[16]   HUANG B, CARLEY K M. Syntax-aware aspect level sentiment classification with graph attention networks [EB/OL]. (2019-09-05)[2022-04-01]. https://arxiv.org/pdf/1909.02606.pdf.
[17]   ZHANG C, LI Q, SONG D. Aspect-based sentiment classification with aspect-specific graph convolutional networks [EB/OL]. (2019-10-13)[2022-04-01]. https://arxiv.org/pdf/1909.03477.pdf.
[18]   WANG K, SHEN W, YANG Y, et al. Relational graph attention network for aspect-based sentiment analysis [EB/OL]. (2020-04-26)[2022-04-01]. https://arxiv.org/pdf/2004.12362.pdf.
[19]   TANG H, JI D, LI C, et al. Dependency graph enhanced dual-transformer structure for aspect-based sentiment classification [C]// Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics. Online: Association for Computational Linguistics, 2020: 6578-6588.
[20]   VASWANI A, SHAZEER N, PARMAR N, et al. Attention is all you need [J]. Advances in Neural Information Processing Systems, 2017, 30.
[21]   LI R, CHEN H, FENG F, et al. Dual graph convolutional networks for aspect-based sentiment analysis [C]// Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing. Online: Association for Computational Linguistics, 2021: 6319-6329.
[22]   LI Y, SUN X, WANG M. Embedding extra knowledge and a dependency tree based on a graph attention network for aspect-based sentiment analysis [C]// 2021 International Joint Conference on Neural Networks. Shenzhen: IEEE, 2021: 1-8.
[23]   ZHANG He-qiao, GOU Gang, CHEN Qing-mei. Aspect-based sentiment analysis based on graph neural network [J]. Application Research of Computers, 2021, 38(12): 3574-3580.
[24]   HAN Hu, WU Yuan-hang, QIN Xiao-ya. An interactive graph attention networks model for aspect-level sentiment analysis [J]. Journal of Electronics and Information Technology, 2021, 43(11): 3282-3290.
[25]   XIA Hong-bin, GU Yan, LIU Yuan. Graph convolution over-attention (ASGCN-AOA) model for specific aspects of sentiment analysis [J]. Journal of Chinese Information Processing, 2022, 36(3): 146-153.
[26]   PONTIKI M, GALANIS D, PAVLOPOULOS J, et al. SemEval-2014 task 4: aspect based sentiment analysis [C]// Proceedings of the 8th International Workshop on Semantic Evaluation. Dublin: Association for Computational Linguistics, 2014: 27-35.
[27]   DONG L, WEI F, TAN C, et al. Adaptive recursive neural network for target-dependent twitter sentiment classification [C]// Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics. Baltimore: Association for Computational Linguistics, 2014: 49–54.
[28]   MRINI K, DERNONCOURT F, TRAN Q, et al. Rethinking self-attention: an interpretable self-attentive encoder-decoder parser [EB/OL]. (2020-10-29)[2022-04-01]. https://arxiv.org/pdf/1911.03875.pdf.
[29]   PENNINGTON J, SOCHER R, MANNING C D. GloVe: global vectors for word representation [C]// Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing. Doha: Association for Computational Linguistics, 2014: 1532-1543.