Early Detection of At-risk Students based on Knowledge Distillation RNN Models
Ryusuke Murata, Tsubasa Minematsu, Atsushi Shimada
Jul 02, 2021 14:10 UTC+2
—
Session PS2
—
Gather Town
Keywords: Students' performance prediction, Early detection of at-risk student, Recurrent Neural Network, Knowledge Distillation
Abstract:
Recurrent neural networks (RNNs) achieve state-of-the-art results in several studies on student performance prediction. However, accuracy in early time steps is lower than in late time steps, even though early detection of at-risk students is important for timely interventions. To improve accuracy in early time steps, we propose a knowledge distillation method for RNNs. Our method distills the time-series information captured by an RNN model at late time steps into an RNN model for early time steps. This distillation brings the early-time-step predictions closer to the late-time-step predictions. Experimental results showed that our method improved the detection rate of at-risk students compared with traditional RNNs, especially in early time steps.
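The abstract does not give the exact training objective, but a common way to realize this kind of distillation is to combine a hard-label loss on the early-step (student) model with a soft-target term that pulls its predictions toward those of the late-step (teacher) model. The sketch below is a minimal, hypothetical NumPy illustration of such a combined loss; the function name, temperature, and weighting parameter are assumptions, not details from the paper.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Hypothetical distillation objective: combine cross-entropy against the
    true at-risk labels (hard loss) with a KL term that pulls the early-step
    student's predictions toward the late-step teacher's predictions (soft loss).
    """
    n = len(labels)
    # Hard loss: cross-entropy of the student against the true labels.
    p_student = softmax(student_logits)
    hard = -np.log(p_student[np.arange(n), labels] + 1e-12).mean()
    # Soft loss: KL divergence between temperature-softened distributions.
    p_t = softmax(teacher_logits / temperature)
    p_s = softmax(student_logits / temperature)
    soft = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=1).mean()
    # The temperature**2 factor keeps the soft-loss gradient scale comparable
    # to the hard loss, as is standard in knowledge distillation.
    return alpha * hard + (1 - alpha) * (temperature ** 2) * soft

# Toy usage: two students, binary at-risk labels (0 = not at risk, 1 = at risk).
student = np.array([[2.0, 0.5], [0.1, 1.5]])   # early-step model outputs
teacher = np.array([[1.8, 0.4], [0.0, 1.6]])   # late-step model outputs
labels = np.array([0, 1])
loss = distillation_loss(student, teacher, labels)
```

When the student's logits already match the teacher's, the soft term vanishes and only the hard cross-entropy remains, so the distillation term acts purely as a regularizer toward the late-step model's behavior.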