Journal of Applied Sciences ›› 2026, Vol. 44 ›› Issue (1): 34-49.doi: 10.3969/j.issn.0255-8297.2026.01.003

• Special Issue on Computer Application •

Complex Logical Query Model Based on Improved Transformer

CHEN Yuyin1, LI Guanfeng1,2, QIN Jing1, XIAO Yuhang1   

  1. School of Information Engineering, Ningxia University, Yinchuan 750021, Ningxia, China;
  2. Ningxia "East Data West Computing" Key Laboratory of Artificial Intelligence and Information Security, Yinchuan 750021, Ningxia, China
  • Received: 2025-08-08  Published: 2026-02-03

Abstract: With the widespread application of knowledge graphs in scenarios such as intelligent question answering and recommendation systems, answering complex logical queries over incomplete knowledge graphs has become a focus and a difficulty of current research. Because ordinary embedding-based methods must be trained on complex logical queries and generalize poorly to out-of-distribution query structures, this paper proposes DCMHA-MoE, an improved Transformer-based model that integrates a dynamically composable multi-head attention (DCMHA) mechanism and a mixture-of-experts (MoE) network. The model represents complex query graphs as sequence inputs via triple transformation and bidirectional path encoding, and dynamically models the structural dependencies and semantic interactions within them, enabling complex logical queries to be answered. DCMHA adaptively combines attention heads to enhance semantic expressiveness, while the MoE network introduces a sparse activation mechanism that improves adaptability to different query structures and reduces computational cost. Experiments on the FB15K-237 and NELL-995 datasets show that, compared with the baseline model DiffCLR, DCMHA-MoE improves the mean reciprocal rank (MRR) on existential positive first-order (EPFO) logic queries $(\wedge, \vee)$ by 10.4% and 7.2%, respectively, verifying the effectiveness and superiority of DCMHA-MoE on complex logical query tasks.
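The sparse activation mechanism the abstract attributes to the MoE network can be illustrated with a minimal top-k gating sketch. This is not the paper's implementation: the function name `moe_forward`, the linear experts, and the top-k routing scheme are illustrative assumptions; the paper's actual expert and gate architectures are not specified in the abstract.

```python
import numpy as np

def moe_forward(x, gate_w, expert_ws, k=2):
    """Sparsely gated mixture-of-experts (illustrative sketch).

    Each input is routed to the k highest-scoring experts, so only
    k of the n experts are evaluated -- the "sparse activation" that
    reduces computational cost.

    x:         (d,) input vector
    gate_w:    (d, n_experts) gating weights
    expert_ws: list of n_experts (d, d) expert weight matrices
    """
    scores = x @ gate_w                    # one gate logit per expert
    top = np.argsort(scores)[-k:]          # indices of the k best experts
    weights = np.exp(scores[top] - scores[top].max())
    weights /= weights.sum()               # softmax over the selected experts only
    # Evaluate only the k chosen experts and mix their outputs.
    return sum(w * (x @ expert_ws[i]) for w, i in zip(weights, top))
```

With `k=1` this degenerates to hard routing (the single best expert's output); larger `k` trades computation for a smoother mixture.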

Key words: complex logical query, knowledge graph, Transformer, dynamic multi-head attention mechanism, mixture-of-experts network
