Recognising students' emotions may improve learning

By Dieketseng Montsi, Senior news journalist
Johannesburg, 05 Aug 2019

Recognising the emotions of students who study computer science might improve learning, according to a University of Johannesburg (UJ) study by associate professor Dustin Terence van der Haar.

The study, “Student Emotion Recognition in Computer Science Education: A Blessing or Curse?”, was published last month.

One of the key skills in the fourth industrial revolution is the ability to program. To attain this skill, many prospective students study for a degree in computer science or a related field.

In the research, Van der Haar explores a model for adaptive teaching and learning that uses computer vision methods to recognise student emotions.

Van der Haar is an associate professor in UJ’s Academy of Computer Science and Software Engineering, and has published research on pattern recognition both nationally and internationally.

The report states that using teaching methods a student cannot relate to can create distance between the student and the skill being taught.

The study aims to address this distance by proposing a model that captures student emotions using technology, allowing the educator to adjust their teaching and provide a more personalised experience that remains cognisant of classroom concepts while students’ emotions are analysed.

Van der Haar used video footage of a small group of computer science lab students across three classes, captured with a camera placed at the front of the classroom.

In the classroom, nominal lighting was provided and any occlusions within the scene were kept to a minimum. Each video sample contained footage from the beginning of the class until the end, with an average duration of 80 minutes.

Once the video was collected, the emotions measured included anger, contempt, disgust, fear, happiness, neutral, sadness and surprise. The outcomes at various stages in the video footage were then observed, and any important shifts were noted and collated to derive insights for the study.
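
The study does not publish its code with this article, but the idea of noting "important shifts" can be illustrated with a minimal sketch. The function names below and the rule of flagging a frame whenever the dominant emotion category changes are assumptions made for illustration, not the study's exact method:

```python
from typing import Dict, List

# The eight emotion categories measured in the study.
EMOTIONS = ["anger", "contempt", "disgust", "fear",
            "happiness", "neutral", "sadness", "surprise"]

def dominant_emotion(scores: Dict[str, float]) -> str:
    """Return the emotion category with the highest score for one frame."""
    return max(EMOTIONS, key=lambda e: scores.get(e, 0.0))

def note_shifts(timeline: List[Dict[str, float]]) -> List[int]:
    """Return frame indices where the dominant emotion changes.

    `timeline` is a list of per-frame score dictionaries, one entry per
    sampled frame, each mapping an emotion category to a score in [0, 1].
    """
    shifts = []
    previous = None
    for i, scores in enumerate(timeline):
        current = dominant_emotion(scores)
        if previous is not None and current != previous:
            shifts.append(i)  # a shift worth collating for later insight
        previous = current
    return shifts
```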

Face images smaller than 40 by 40 pixels were discarded because it is difficult to derive an emotion at such a low resolution with the current emotion classification method.
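
The 40-by-40-pixel threshold comes from the study; how the filter is applied is not specified, so the sketch below assumes face crops are available as (width, height, image) tuples:

```python
MIN_FACE_SIZE = 40  # pixels; faces smaller than 40x40 are discarded, per the study

def usable_faces(face_crops):
    """Keep only face crops large enough for emotion classification.

    `face_crops` is assumed to be an iterable of (width, height, image)
    tuples; the image data itself is not inspected here.
    """
    return [crop for crop in face_crops
            if crop[0] >= MIN_FACE_SIZE and crop[1] >= MIN_FACE_SIZE]
```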

Emotion scores for each category were captured using Microsoft’s Cognitive Service Face API (version 1.0). The scores and significant events were then displayed in the report module, providing a brief notification on whether a class is going well or whether the educator should adjust their teaching accordingly.
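
The article names the Face API but not how it was called. A minimal sketch of requesting emotion scores from the version 1.0 `detect` endpoint might look like the following; the region, subscription key and the happiness-versus-negative heuristic in the notification are placeholders invented for illustration (Microsoft has since retired the emotion attribute from the service):

```python
import requests

# Placeholder values: substitute a real Azure region and subscription key.
ENDPOINT = "https://<region>.api.cognitive.microsoft.com/face/v1.0/detect"
SUBSCRIPTION_KEY = "<your-face-api-key>"

def emotion_scores(frame_bytes: bytes):
    """Send one video frame to the Face API and return per-face emotion scores."""
    response = requests.post(
        ENDPOINT,
        params={"returnFaceAttributes": "emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=frame_bytes,
    )
    response.raise_for_status()
    # Each detected face carries scores for the eight emotion categories.
    return [face["faceAttributes"]["emotion"] for face in response.json()]

def brief_notification(class_scores):
    """Turn averaged scores into a one-line note (illustrative heuristic only)."""
    if not class_scores:
        return "No faces detected in this interval."
    happy = sum(s["happiness"] for s in class_scores) / len(class_scores)
    negative = sum(s["anger"] + s["sadness"] + s["fear"]
                   for s in class_scores) / len(class_scores)
    return ("Class appears to be going well."
            if happy >= negative
            else "Consider adjusting the teaching approach.")
```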

Van der Haar says changing traditional teaching methods to include a more participatory teaching system that maximises student learning has been a challenge, especially in the sciences.

“The observation proposes a model that derives user sentiment with affective computing methods and leverages the sentiment outcome to support the educator by providing feedback relevant for teaching. The technology will then allow the educator to adjust teaching.”

The report notes that deriving emotion to adapt teaching also comes with privacy and ethical implications. As with any computer vision technology that involves humans, there is a chance that it can infringe on privacy.

Van der Haar hopes his study will open up a further avenue of research that may assist educational psychologists and educators alike in determining conducive conditions for student learning.

Recently, South African facial recognition start-up Camatica added ‘mood analytics’ algorithms to its suite of facial recognition artificial intelligence (AI)-powered products.

The AI system analyses workers’ facial expressions so that companies can understand when employees are experiencing challenges and come up with solutions to support them.
