Journal of Applied Sciences ›› 2026, Vol. 44 ›› Issue (1): 149-165.doi: 10.3969/j.issn.0255-8297.2026.01.010

• Special Issue on Computer Application •

Multi-granularity Semantic Aspect-Based Sentiment Analysis Model with Fusion of BERT Encoding Layers

XU Kai1, CHI Mingde1, WANG Qi2, LI Jianzhou3, ZHANG Hui1,3   

  1. School of Information, Guizhou University of Finance and Economics, Guiyang 550025, Guizhou, China;
    2. State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025, Guizhou, China;
    3. Postdoctoral Scientific Research Station, Shijihengtong Technology Co., Ltd., Guiyang 550014, Guizhou, China
  • Received: 2025-08-06  Published: 2026-02-03

Abstract: Aspect-based sentiment analysis (ABSA) aims to identify the sentiment polarity toward specific aspects within a text. However, existing research still faces several challenges: BERT-based approaches suffer from semantic overfitting and under-utilize low-level semantic features; the self-attention mechanism is prone to losing local information; and structures that combine multiple encoding layers with multi-granularity semantics introduce information redundancy. To address these issues, this paper proposes a multi-granularity semantic aspect-based sentiment analysis model with fusion of BERT encoding layers (MSBEL). The model introduces a pyramid attention mechanism to exploit semantic features from different encoding layers and combines them with low-level encoders to mitigate overfitting. It employs multi-scale gated convolution to compensate for the loss of local information, and uses cosine attention to highlight sentiment features relevant to aspect terms, thereby reducing information redundancy. t-SNE visualization shows that MSBEL clusters sentiment representations more distinctly than BERT does. MSBEL was compared with mainstream models on multiple benchmark datasets: relative to LCF-BERT, it improves F1 by 1.53%, 3.94%, 1.39%, 6.68%, and 5.97% on five datasets; relative to SenticGCN, it raises F1 by 0.94% on average, with a maximum gain of 2.12%; and relative to ABSA-DeBERTa, it raises F1 by 1.16% on average, with a maximum gain of 4.20%. These results validate the effectiveness and superiority of the proposed model for ABSA tasks.
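The cosine attention mentioned in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; it only assumes the generic form of cosine attention: score each token embedding by its cosine similarity to a pooled aspect-term vector, normalize the scores with a softmax, and take the weighted sum as an aspect-focused representation. All names (`cosine_attention`, `tokens`, `aspect`) are illustrative.

```python
import numpy as np

def cosine_attention(tokens, aspect, eps=1e-8):
    """Weight token representations by cosine similarity to an aspect vector.

    tokens: (seq_len, dim) array of token embeddings
    aspect: (dim,) pooled aspect-term embedding
    Returns (attention weights, aspect-conditioned summary vector).
    """
    # Cosine similarity between each token and the aspect representation
    norms = np.linalg.norm(tokens, axis=1) * np.linalg.norm(aspect) + eps
    scores = tokens @ aspect / norms
    # Softmax over the sequence to obtain attention weights
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Aspect-focused sentence representation
    summary = weights @ tokens
    return weights, summary

# Toy example: the first token points in nearly the same direction
# as the aspect vector, so it should receive the largest weight.
toks = np.array([[1.0, 0.0], [0.8, 0.6], [0.0, 1.0]])
asp = np.array([1.0, 0.2])
w, s = cosine_attention(toks, asp)
```

Because cosine similarity discards vector magnitude, tokens are scored purely by how well their direction aligns with the aspect representation, which is one way to suppress features irrelevant to the aspect term.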

Key words: aspect-based sentiment analysis, multi-granularity, BERT, pyramid attention mechanism, multi-scale gated convolutional unit
