Scaling up Student Submission Integrity Diagnosis (SSID)

Kan Min-Yen

Overview

The discussion forum is a key channel for instructors to interact with students. In courses where the faculty-to-student ratio is lopsided, it can be difficult for instructors to keep abreast of forum activity and to know which student posts are worth investing time to respond to. Andrew Ng, Coursera’s co-founder [1], once remarked during a keynote that an absentminded reply (hereafter referred to by its technical term, an intervention) he made to a student post on his own machine learning MOOC caused much confusion, as his post was not clear. Upon reflection, he revisited his replies only to find that many students had spent much time trying to figure out his meaning. If a learning management system can help identify critical conversations and align relevant resources to instructors, then perhaps instructor intervention can be made more effective.

Optimal forum intervention can present key learning opportunities for critical thinking and reflection, and can provide clarity in a timely fashion. We investigate how to analyze student discussions in forums so as to automatically rank threads by how urgently they need instructor intervention. We will use instructor intervention data from Massive Open Online Courses (MOOCs) to build a machine-learned model of current intervention practice, and evaluate the model for use on MOOCs as well as on local forums (such as IVLE and Luminus).
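
As a rough illustration of the kind of model we have in mind (a sketch only, not the project's actual implementation), the snippet below trains a simple text classifier on threads labelled by whether an instructor historically intervened, then ranks unseen threads by the predicted probability that they need intervention. The example threads, the features, and the choice of library (scikit-learn) are all assumptions made for illustration.

```python
# Minimal sketch: rank forum threads by predicted need for instructor intervention.
# Assumes historical MOOC data where each thread is a (text, intervened) pair,
# with intervened = 1 if an instructor replied to the thread.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data drawn from archived course forums.
train_texts = [
    "Is assignment 2 due this week or next?",
    "I think the gradient derivation in lecture 4 has a sign error.",
    "Thanks everyone, great discussion!",
]
train_intervened = [1, 1, 0]  # 1 = instructor replied, 0 = no intervention

# Bag-of-words features with a linear classifier as a simple baseline.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(train_texts, train_intervened)

# Rank new threads: highest predicted need for intervention first.
new_threads = [
    "The quiz grader rejects my correct answer.",
    "Sharing my study notes for week 3.",
]
scores = model.predict_proba(new_threads)[:, 1]
for text, score in sorted(zip(new_threads, scores), key=lambda t: -t[1]):
    print(f"{score:.2f}  {text}")
```

A deployed ranker would likely need richer thread features (posting dynamics, course context, urgency cues) and evaluation with held-out courses, but the ranking-by-predicted-intervention idea is the same.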

As not all instructor interventions lead to learning, we have also annotated existing instructor interventions against a typology based on transactivity, an analytic discourse model that frames interactions as contributions that build on one another, to differentiate which interventions may create learning opportunities and foster students’ critical thinking.
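
To make this concrete, the sketch below shows one way such annotated interventions could be represented and filtered for likely learning opportunities. The category names are hypothetical placeholders, not the actual labels of our transactivity-based typology.

```python
# Minimal sketch with hypothetical category names; the project's actual
# typology and its labels may differ.
from dataclasses import dataclass

# Transactive categories build on students' contributions; non-transactive
# ones (e.g., logistics answers) are less likely to create learning moments.
TRANSACTIVE = {"builds_on_student_idea", "challenges_reasoning"}      # hypothetical labels
NON_TRANSACTIVE = {"logistics_answer", "acknowledgement"}             # hypothetical labels


@dataclass
class AnnotatedIntervention:
    thread_id: str
    text: str
    category: str  # one label from the typology above


def likely_learning_opportunity(ann: AnnotatedIntervention) -> bool:
    """Flag interventions whose category is transactive, i.e., those we
    expect to foster reflection and critical thinking."""
    return ann.category in TRANSACTIVE
```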