Design and implementation of LSTM neural network text classification model integrating attention mechanism

Conference: CAIBDA 2022 - 2nd International Conference on Artificial Intelligence, Big Data and Algorithms
17.06.2022 - 19.06.2022 in Nanjing, China

Proceedings: CAIBDA 2022

Pages: 5
Language: English
Type: PDF

Authors:
Wang, Pengbo (Duke Kunshan University, Kunshan, China)

Content:
Because traditional text classification models such as CNN and LSTM cannot represent the weight that each word carries in the classification when extracting text features, this paper proposes the LSTM-ATT model, which combines the LSTM model with an attention mechanism. First, the CBOW model in Word2Vec is used to transform the text data into low-dimensional real-valued vectors, which serve as the input of the LSTM layer to extract features from the context of the associated text. Second, the attention mechanism is used to weigh the influence of different words in the text on the classification. Third, the text feature vectors processed layer by layer by the neural network are classified by a Softmax classifier. Experimental results on the data sets show that the LSTM-ATT model achieves a better classification effect than the CNN and LSTM models, and that incorporating an attention mechanism into text classification can significantly improve classification performance.
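
The abstract outlines a three-stage pipeline (CBOW word embeddings, an LSTM encoder with word-level attention, and a Softmax classifier) but does not include code. The sketch below is a minimal PyTorch rendering of that kind of architecture, not the authors' implementation: the class name LSTMAtt, the additive-style attention scoring, and all dimensions (embed_dim=100, hidden_dim=128) are illustrative assumptions. Pretrained CBOW vectors (for example, trained with gensim's Word2Vec using sg=0) could be passed in as pretrained_embeddings.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class LSTMAtt(nn.Module):
        """Illustrative LSTM text classifier with word-level attention (not the paper's code)."""

        def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_classes=2,
                     pretrained_embeddings=None):
            super().__init__()
            # Embedding layer; weights may be initialized from CBOW (Word2Vec) vectors.
            self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
            if pretrained_embeddings is not None:
                self.embedding.weight.data.copy_(pretrained_embeddings)
            self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
            # Attention scoring layer: one scalar score per time step (word).
            self.attn = nn.Linear(hidden_dim, 1, bias=False)
            self.fc = nn.Linear(hidden_dim, num_classes)

        def forward(self, token_ids):
            # token_ids: (batch, seq_len) integer word indices
            x = self.embedding(token_ids)               # (batch, seq_len, embed_dim)
            h, _ = self.lstm(x)                         # (batch, seq_len, hidden_dim)
            scores = self.attn(torch.tanh(h))           # (batch, seq_len, 1)
            weights = F.softmax(scores, dim=1)          # attention weight per word
            context = (weights * h).sum(dim=1)          # weighted sum -> (batch, hidden_dim)
            return F.log_softmax(self.fc(context), dim=-1)  # class log-probabilities

    # Usage with random token ids, purely for illustration:
    model = LSTMAtt(vocab_size=20000, embed_dim=100, hidden_dim=128, num_classes=10)
    log_probs = model(torch.randint(1, 20000, (8, 50)))  # 8 texts, 50 tokens each

The attention weights give one interpretable score per word, which is the mechanism the abstract credits for the improvement over plain CNN and LSTM baselines.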