Math Question Solving and MCQ Distractor Generation with Attentional GRU Networks
Neisarg Dave, Riley Owen Bakes, Bart Pursel, C. Lee Giles
Jul 02, 2021 20:50 UTC+2
—
Session I2
Keywords: Math Question Solving, Distractor Generation, Math Multiple Choice Questions, Mathematical Language, Math Education
Abstract:
We investigate encoder-decoder GRU networks with an attention mechanism for solving a diverse array of elementary math problems containing mathematical symbolic structures. We quantitatively measure the performance of recurrent models on each question type using a test set of unseen problems under both a binary scoring and a partial-credit system. Based on our findings, we propose using encoder-decoder recurrent neural networks to generate distractors for mathematical multiple-choice questions. To this end, we introduce a computationally inexpensive decoding schema called character offsetting, which shows qualitative and quantitative promise for several question types. Character offsetting freezes the decoder's hidden state and the top k tokens of its initial probability distribution given the encoded input, then performs k greedy decodings, each initialized with one of those k tokens as the first element of the decoded sequence.
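The character-offsetting schema described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `step_fn`, `toy_step`, the weight matrix `W`, and all shapes are hypothetical stand-ins for a trained GRU decoder.

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - logits.max())
    return e / e.sum()

def character_offset_decode(step_fn, h0, k=3, max_len=10, eos=0):
    """Character offsetting (sketch): freeze the decoder's first-step
    hidden state and top-k tokens, then run one greedy decode per token."""
    probs, h1 = step_fn(h0, None)          # first decoder step on encoder state
    top_k = np.argsort(probs)[::-1][:k]    # frozen top-k initial tokens
    sequences = []
    for tok in top_k:
        seq, h, t = [int(tok)], h1, int(tok)
        for _ in range(max_len - 1):
            p, h = step_fn(h, t)
            t = int(np.argmax(p))          # greedy continuation
            if t == eos:
                break
            seq.append(t)
        sequences.append(seq)
    return sequences

# Toy stand-in for a GRU decoder step (hypothetical, fixed random weights).
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 5))                # hidden size 4, vocab size 5
h0 = rng.normal(size=4)                    # pretend encoder summary

def toy_step(h, token):
    x = 0.0 if token is None else float(token)
    h = np.tanh(h + 0.1 * x)
    return softmax(h @ W), h
```

Each of the k returned sequences shares the same frozen first-step state but diverges at its first character, which is what produces plausible, mutually distinct distractor candidates.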