Multi-stage Multi-task Learning and Attention Module for Multi-choice Reading Comprehension

Conference: ICMLCA 2021 - 2nd International Conference on Machine Learning and Computer Application
12/17/2021 - 12/19/2021 at Shenyang, China

Proceedings: ICMLCA 2021

Pages: 4
Language: English
Type: PDF


Authors:
Liu, Jiawen (School of Public Administration, Northwest University, Xi’an, Shaanxi, China)
Xie, Binren (Faculty of Science, Xi’an University of Technology, Xi’an, Shaanxi, China)

Abstract:
Multi-choice Machine Reading Comprehension (MRC) requires a model to select the correct answer from a set of candidate options given a passage and a question. It is one of the most difficult reading comprehension tasks because it often demands advanced skills such as logical reasoning, summarization, and arithmetic operations, whereas in the extractive counterpart answers are usually spans of text within the given passages. Moreover, in addition to a powerful Pre-trained Language Model (PrLM) as the encoder, multi-choice MRC especially relies on a matching network designed to capture the relationships among the passage, the question, and the answer options. We introduce a new method, a Multi-stage Multi-task learning framework for multi-choice MRC. Our method consists of two stages: first, a coarse-tuning stage, in which a multi-task learning framework trains the model on extra datasets; second, a fine-tuning stage, in which the coarse-tuned model continues training on the target task so that it performs better with limited data. Furthermore, we propose a new head module for MRC built on multi-head self-attention. We evaluated our method on two challenging datasets, DREAM and RACE, where it makes promising progress over the baseline.
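
To make the two-stage procedure concrete, the following is a minimal sketch, not the authors' released code: it assumes the model accepts `input_ids` and `attention_mask` and returns per-option logits of shape `(batch, num_options)`, and the `train_one_epoch` helper, the optimizers, and all hyperparameters are illustrative assumptions rather than the paper's actual configuration.

```python
# Minimal sketch of a two-stage (coarse-tune, then fine-tune) procedure.
import torch

def train_one_epoch(model, loader, optimizer, device="cpu"):
    model.train()
    for batch in loader:
        # batch: dict of input_ids, attention_mask, label tensors
        logits = model(batch["input_ids"].to(device),
                       batch["attention_mask"].to(device))
        loss = torch.nn.functional.cross_entropy(
            logits, batch["label"].to(device))
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

def multistage_train(model, aux_loaders, target_loader,
                     coarse_epochs=2, fine_epochs=3):
    # Stage 1: coarse-tuning -- cycle through the auxiliary multi-task
    # datasets so the encoder acquires general comprehension skills.
    coarse_opt = torch.optim.AdamW(model.parameters(), lr=2e-5)
    for _ in range(coarse_epochs):
        for loader in aux_loaders:
            train_one_epoch(model, loader, coarse_opt)

    # Stage 2: fine-tuning -- continue from the coarse-tuned weights on
    # the target multi-choice MRC data, here with a smaller learning rate.
    fine_opt = torch.optim.AdamW(model.parameters(), lr=1e-5)
    for _ in range(fine_epochs):
        train_one_epoch(model, target_loader, fine_opt)
```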
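
The proposed head module is described only as a multi-head self-attention framework; the sketch below shows one plausible realization under stated assumptions. The encoder is assumed to emit token-level hidden states for each concatenated (passage, question, option) sequence, and names such as `hidden_size`, `num_heads`, and the mean-pooling scorer are illustrative choices, not details confirmed by the paper.

```python
# Sketch of a multi-head self-attention head that scores each answer option.
import torch
import torch.nn as nn

class AttentionHead(nn.Module):
    def __init__(self, hidden_size: int = 768, num_heads: int = 8):
        super().__init__()
        # Multi-head self-attention over the encoder's token representations.
        self.self_attn = nn.MultiheadAttention(hidden_size, num_heads,
                                               batch_first=True)
        self.norm = nn.LayerNorm(hidden_size)
        # Maps each option's pooled representation to a single score.
        self.scorer = nn.Linear(hidden_size, 1)

    def forward(self, hidden_states, attention_mask):
        # hidden_states: (batch * num_options, seq_len, hidden_size)
        # attention_mask: (batch * num_options, seq_len), 1 = real token
        key_padding_mask = attention_mask == 0
        attn_out, _ = self.self_attn(hidden_states, hidden_states,
                                     hidden_states,
                                     key_padding_mask=key_padding_mask)
        out = self.norm(hidden_states + attn_out)
        # Mean-pool over real tokens, then score the option.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (out * mask).sum(1) / mask.sum(1).clamp(min=1e-6)
        return self.scorer(pooled)  # (batch * num_options, 1)
```

In such a setup, the per-option scores would be reshaped to `(batch, num_options)` and trained with cross-entropy against the index of the gold option.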