Special Issue on Computer Applications

Named Entity Recognition Algorithm Enhanced with Entity Category Information

  • 1. Department of Computer Science and Technology, Tsinghua University, Beijing 100084, China;
    2. China Research Institute for Science Popularization, Beijing 100081, China;
    3. Beijing Caizhi Technology Co., Ltd., Beijing 100081, China

Received date: 2022-06-30

Online published: 2023-02-03

Funding

Supported by the Cooperation Project Fund of the China Research Institute for Science Popularization (No. 200110EMR028)


Abstract

Character-level models for Chinese named entity recognition (NER) ignore word-level information in sentences. To address this, a Chinese NER method enhanced with entity category information from a knowledge graph is proposed. First, the training set is segmented with a word segmentation tool, and all candidate words are collected into a vocabulary. Second, a generic knowledge graph is queried for the category information of the entities in the vocabulary; word sets associated with each character are constructed in a simple and effective way, and a set of entity category information is generated from the categories of the entities in each word set. Finally, a word-embedding method converts each category-information set into an embedding that is concatenated with the character embedding, enriching the features produced by the embedding layer. The proposed method can serve as a module that expands feature diversity at the embedding layer, and it can be combined with a variety of encoder-decoder models. Experiments on the Chinese NER dataset released by Microsoft Research Asia (MSRA) demonstrate the model's advantage: compared with a bidirectional long short-term memory (Bi-LSTM) model and a Bi-LSTM + conditional random field (CRF) model, it improves F1 by 11.00% and 3.09% respectively, verifying the effectiveness of knowledge-graph entity category information for enhancing Chinese NER.
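The embedding-layer enhancement described above can be illustrated with a minimal sketch: each character's embedding is concatenated with an embedding that summarizes the knowledge-graph category labels of the words covering that character. All names, dimensions, and the averaging of category embeddings below are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy vocabularies and embedding tables.
CHAR_DIM, CAT_DIM = 8, 4
char_vocab = {"清": 0, "华": 1, "大": 2, "学": 3}
cat_vocab = {"<none>": 0, "Organization": 1, "Place": 2}

char_emb = rng.normal(size=(len(char_vocab), CHAR_DIM))
cat_emb = rng.normal(size=(len(cat_vocab), CAT_DIM))

# Category labels attached to each character via the word set it belongs
# to, e.g. "清华大学" matched in a generic knowledge graph as an Organization.
char_categories = {
    "清": ["Organization"],
    "华": ["Organization"],
    "大": ["Organization", "Place"],
    "学": ["Organization"],
}

def enhanced_embedding(sentence):
    """Return one (CHAR_DIM + CAT_DIM) feature vector per character."""
    rows = []
    for ch in sentence:
        c = char_emb[char_vocab[ch]]
        cats = char_categories.get(ch) or ["<none>"]
        # Pool (here: average) the category embeddings of the character's
        # word set, then concatenate with the character embedding.
        k = np.mean([cat_emb[cat_vocab[t]] for t in cats], axis=0)
        rows.append(np.concatenate([c, k]))
    return np.stack(rows)

feats = enhanced_embedding("清华大学")
print(feats.shape)  # (4, 12)
```

The resulting feature matrix would then feed an encoder-decoder model such as Bi-LSTM + CRF in place of plain character embeddings.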

Cite this article

Liu Minghui, Tang Wangjing, Xu Bin, Tong Meihan, Wang Liming, Zhong Qi, Xu Jianjun. Named Entity Recognition Algorithm Enhanced with Entity Category Information[J]. Journal of Applied Sciences, 2023, 41(1): 1-9. DOI: 10.3969/j.issn.0255-8297.2023.01.001

