Multi-stage Multi-task Learning and Attention Module for Multi-choice Reading Comprehension

Conference: ICMLCA 2021 - 2nd International Conference on Machine Learning and Computer Application
17.12.2021 - 19.12.2021 in Shenyang, China

Proceedings: ICMLCA 2021

Pages: 4 | Language: English | Type: PDF


Authors:
Liu, Jiawen (School of Public Administration, Northwest University, Xi’an, Shaanxi, China)
Xie, Binren (Faculty of Science, Xi’an University of Technology, Xi’an, Shaanxi, China)

Abstract:
Multi-choice Machine Reading Comprehension (MRC) requires a model to select the correct answer from a set of answer options, given a passage and a question. It is one of the most difficult reading comprehension tasks because it often demands advanced skills such as logical reasoning, summarization, and arithmetic operations, compared with the extractive counterpart, where answers are usually spans of text within the given passage. Moreover, in addition to a powerful Pre-trained Language Model (PrLM) as the encoder, multi-choice MRC especially relies on a matching network designed to capture the relationships among passage, question, and answer options. We introduce a new method, a Multi-stage Multi-task learning framework for multi-choice MRC. Our method consists of two stages: first, a multi-task learning framework trains on extra datasets in a coarse-tuning stage; second, the model from the first stage continues training on the target task to perform better with limited data. Furthermore, we propose a new head module for MRC built on a multi-head self-attention framework. We evaluated our method on two challenging datasets, DREAM and RACE. Our model makes promising progress compared to the baseline.
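
The two-stage schedule described above can be pictured with a short sketch. The snippet below is an illustrative assumption, not the authors' released code: it coarse-tunes a toy model on pooled auxiliary (multi-task) data, then continues training the same weights on a small target set. The model class, tensor shapes, and learning rates (TinyMCModel and all other names) are hypothetical stand-ins.

```python
# Hypothetical sketch of a two-stage schedule: stage 1 ("coarse-tuning")
# trains a shared model on pooled auxiliary multi-choice data under a
# multi-task objective; stage 2 continues training the same weights on
# the small target dataset. All names and shapes are assumptions.
import torch
import torch.nn as nn

class TinyMCModel(nn.Module):
    """Stand-in for a PrLM encoder plus an option-scoring head."""
    def __init__(self, hidden=32):
        super().__init__()
        self.encoder = nn.Linear(16, hidden)   # placeholder for BERT/ALBERT etc.
        self.scorer = nn.Linear(hidden, 1)     # one score per answer option

    def forward(self, x):                      # x: (batch, n_options, 16)
        h = torch.tanh(self.encoder(x))
        return self.scorer(h).squeeze(-1)      # (batch, n_options) logits

def train_epoch(model, batches, optimizer, loss_fn):
    for x, y in batches:
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()

model = TinyMCModel()
loss_fn = nn.CrossEntropyLoss()

# Stage 1: coarse-tuning on pooled auxiliary datasets (synthetic tensors here).
aux = [(torch.randn(8, 4, 16), torch.randint(0, 4, (8,))) for _ in range(10)]
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
train_epoch(model, aux, opt, loss_fn)

# Stage 2: continue from the same weights on the small target dataset,
# typically with a lower learning rate to avoid forgetting stage 1.
target = [(torch.randn(8, 4, 16), torch.randint(0, 4, (8,))) for _ in range(2)]
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
train_epoch(model, target, opt, loss_fn)
```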
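
Similarly, the abstract names a multi-head self-attention head module but gives no internals. The following is a minimal sketch of one plausible design under stated assumptions: the AttentionHead class, residual-plus-LayerNorm wiring, and mean pooling are choices made here for illustration, not the paper's specification.

```python
# One plausible multi-head self-attention "head module" over a PrLM's token
# representations for multi-choice MRC; dimensions and pooling are assumptions.
import torch
import torch.nn as nn

class AttentionHead(nn.Module):
    def __init__(self, hidden=64, num_heads=4):
        super().__init__()
        # Self-attention over the token sequence of one joint
        # (passage, question, option) encoding.
        self.attn = nn.MultiheadAttention(hidden, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(hidden)
        self.scorer = nn.Linear(hidden, 1)

    def forward(self, seq):                     # seq: (batch, seq_len, hidden)
        ctx, _ = self.attn(seq, seq, seq)       # tokens attend to each other
        ctx = self.norm(seq + ctx)              # residual connection + norm
        pooled = ctx.mean(dim=1)                # simple mean pooling
        return self.scorer(pooled).squeeze(-1)  # one logit per sequence

# Usage: encode each of the 4 options jointly with the passage and question
# (PrLM encoder omitted; random tensors stand in), then pick the argmax logit.
head = AttentionHead()
encoded_options = torch.randn(4, 128, 64)       # 4 options, 128 tokens each
logits = head(encoded_options)                  # shape: (4,)
print(logits.argmax().item())                   # index of the predicted answer
```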