Research named entity recognition based on Transfer Learning

Conference: CIBDA 2022 - 3rd International Conference on Computer Information and Big Data Applications
03/25/2022 - 03/27/2022 at Wuhan, China

Proceedings: CIBDA 2022

Pages: 5 | Language: English | Type: PDF

Authors:
Yu, Ai-Rong; Niu, Yan-Jie; Yu, Pei-Yi (The Army Engineering University of PLA, Nanjing, China)
Wang, Jun (Nanjing Vocational College of Information Technology, Nanjing, China)

Abstract:
The BERT model handles named entity recognition (NER) effectively on general-domain texts. In specialized frontier research fields, however, the lack of a domain-specific corpus of scientific terminology degrades NER performance. To address this problem, a transfer-learning text processing model, BERT-BiLSTM-CRF, is proposed. The model uses the self-attention mechanism of the bidirectional Transformer semantic representation model (BERT) to generate the required base word vectors, employs a bidirectional long short-term memory network (BiLSTM) to capture contextual dependencies, and learns annotation constraints with a conditional random field (CRF) model. A canonical correlation analysis algorithm bridges the difference between the word-vector feature spaces of the source and target domains, realizing domain transfer of the base model. Training, parameter tuning, and evaluation of the framework are carried out on a corpus of 298 scientific research consulting texts. Experimental results show that the model's precision, recall, and F1 score are substantially higher than those of existing NER methods for scientific research texts.
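The abstract's domain-transfer step, aligning source- and target-domain word-vector spaces with canonical correlation analysis (CCA), can be illustrated in isolation. The sketch below is not the authors' implementation; it is a minimal NumPy version of standard regularized CCA, assuming paired embedding samples from the two domains (here simulated as a random rotation standing in for the domain shift between BERT feature spaces):

```python
import numpy as np

def cca_align(X_src, X_tgt, k, reg=1e-3):
    """Regularized CCA between paired samples from a source and a
    target embedding space. Returns projections A, B mapping each
    space into a shared k-dim space where corresponding dimensions
    are maximally correlated, plus the canonical correlations."""
    # Center both views.
    Xs = X_src - X_src.mean(axis=0)
    Xt = X_tgt - X_tgt.mean(axis=0)
    n = Xs.shape[0]
    # Regularized (cross-)covariance matrices.
    Css = Xs.T @ Xs / n + reg * np.eye(Xs.shape[1])
    Ctt = Xt.T @ Xt / n + reg * np.eye(Xt.shape[1])
    Cst = Xs.T @ Xt / n

    def inv_sqrt(C):
        # Inverse matrix square root via eigendecomposition.
        w, V = np.linalg.eigh(C)
        return V @ np.diag(1.0 / np.sqrt(w)) @ V.T

    Ws, Wt = inv_sqrt(Css), inv_sqrt(Ctt)
    # SVD of the whitened cross-covariance yields the canonical
    # directions; singular values are the canonical correlations.
    U, s, Vt = np.linalg.svd(Ws @ Cst @ Wt)
    A = Ws @ U[:, :k]       # source-space projection
    B = Wt @ Vt.T[:, :k]    # target-space projection
    return A, B, s[:k]

# Toy demo: the "target" space is a rotated copy of the source
# space, a stand-in for the shift between two domains' embeddings.
rng = np.random.default_rng(0)
X_src = rng.normal(size=(200, 8))
R = np.linalg.qr(rng.normal(size=(8, 8)))[0]  # random rotation
X_tgt = X_src @ R
A, B, corrs = cca_align(X_src, X_tgt, k=4)
# corrs is near 1: the two spaces are fully alignable here.
```

After such an alignment, a model trained on source-domain projections `X_src @ A` can, under the paper's transfer setting, be applied to target-domain projections `X_tgt @ B` in the shared space.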