Identifying Student Struggle by Analyzing Facial Movement During Asynchronous Video Lecture Viewing: Towards an Automated Tool to Support Instructors

Linson, Adam; Xu, Yucheng; English, Andrea R. and Fisher, Robert B. (2022). Identifying Student Struggle by Analyzing Facial Movement During Asynchronous Video Lecture Viewing: Towards an Automated Tool to Support Instructors. In: Artificial Intelligence in Education (Rodrigo, M.M.; Matsuda, N.; Cristea, A.I. and Dimitrova, V. eds.), Lecture Notes in Computer Science, Springer, Cham, pp. 53–65.



The widespread shift in higher education (HE) from in-person instruction to pre-recorded video lectures means that many instructors have lost access to real-time student feedback for the duration of any given lecture (a ‘sea of faces’ that express struggle, comprehension, etc.). We hypothesized that this feedback could be partially restored by analyzing student facial movement data gathered during recorded lecture viewing and visualizing it on a common lecture timeline. Our approach builds on computer vision research on engagement and affect in facial expression, and on education research on student struggle. Here, we focus on individual student struggle (the effortful attempt to grasp new concepts and ideas) and its group-level visualization as student feedback to support human instructors. Research suggests that instructor-supported student struggle can help students develop conceptual understanding, while unsupported struggle can lead to disengagement. Studies of online learning in higher education have found that when students struggle with recorded video lecture content, questions and confusion often remain unreported and thus unsupported by instructors. In a pilot study, we sought to identify group-level student struggle by analyzing individual student facial movement during asynchronous video lecture viewing and mapping cohort data to annotated lecture segments (e.g. when a new concept is introduced). We gathered real-time webcam data from 10 student participants, along with their self-paced intermittent click feedback on personal struggle state and retrospective self-reports. We analyzed participant video with computer vision techniques to identify facial movement and correlated the data with independent human observer inferences about struggle-related states. We plotted all participants’ data (computer vision analysis, self-report, observer annotation) along the lecture timeline. The visualization exposed group-level struggle patterns in relation to lecture content, which could help instructors identify content areas where students need additional support, e.g. through student-centered interventions or lecture revisions.
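The core aggregation step described above, mapping per-student struggle indicators onto a shared lecture timeline, can be illustrated with a minimal sketch. The variable names, the binary struggle flags, and the one-minute binning are illustrative assumptions, not details taken from the study; in practice the per-student signals would come from the facial-movement analysis, click feedback, or observer annotations described in the abstract.

```python
from collections import defaultdict

# Hypothetical input: per-student struggle indicators as (timestamp_sec, is_struggling)
# pairs, e.g. derived from facial-movement analysis or self-paced click feedback.
student_signals = {
    "s01": [(10, 1), (70, 1), (130, 0)],
    "s02": [(12, 1), (65, 0), (128, 0)],
    "s03": [(8, 0), (72, 1), (135, 1)],
}

BIN_SEC = 60  # bin the lecture timeline into one-minute segments (assumed granularity)

def struggle_timeline(signals, bin_sec=BIN_SEC):
    """Return, for each timeline bin, the fraction of observations flagged as struggle."""
    counts = defaultdict(lambda: [0, 0])  # bin index -> [struggle_count, total_count]
    for events in signals.values():
        for t, flag in events:
            b = t // bin_sec
            counts[b][0] += flag
            counts[b][1] += 1
    return {b: c[0] / c[1] for b, c in sorted(counts.items())}

timeline = struggle_timeline(student_signals)
# timeline maps each one-minute segment to a group-level struggle fraction,
# which could then be plotted against annotated lecture segments.
```

A group-level peak in such a timeline (e.g. a bin where most students are flagged) would point an instructor to the lecture segment, such as the introduction of a new concept, where additional support or revision may be needed.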
