Student emotion plays an important role in the learning process, so scientists have developed software that tracks students’ facial expressions as they learn, such as while an educator is speaking or teaching. Research from North Carolina State University shows that software which tracks facial expressions can accurately assess the emotions of students engaged in interactive online learning and predict the effectiveness of online tutoring sessions. The researchers automatically tracked facial expressions related to anxiety, confusion, engagement, and frustration, according to the study, “Automatically Recognizing Facial Expression: Predicting Engagement and Frustration,” presented July 6-9, 2013 at the International Conference on Educational Data Mining in Memphis, Tennessee. The paper is available online in PDF format.
“This work is part of a larger effort to develop artificial intelligence software to teach students computer science,” says Dr. Kristy Boyer, according to a June 27, 2013 news release, “Researchers Track Facial Expressions To Improve Teaching Software.” Boyer is an assistant professor of computer science at North Carolina State University and co-author of a paper on the work. “The program, JavaTutor, will not only respond to what a student knows, but to each student’s feelings of frustration or engagement. This is important because research shows that student emotion plays an important role in the learning process.”
The researchers used the automated Computer Expression Recognition Toolbox (CERT) program to evaluate facial expressions of 65 college students engaged in one-on-one online tutoring sessions.
The researchers found that CERT was able to identify facial movements associated with learning-centered emotions, such as frustration or concentration – and that the automated program’s findings were consistent with expert human assessments more than 85 percent of the time. The research team also had the students report how effective they felt the tutorial was, and tested the students before and after each tutoring session to measure how much they learned.
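To make the validation step concrete, here is a minimal sketch of how an automated tracker’s labels can be compared against expert human assessments. The labels, data, and function name below are hypothetical illustrations, not the study’s actual code or materials.

```python
# Illustrative sketch: measuring how often automated facial-expression
# labels agree with expert human labels. All data here is invented.

def agreement_rate(automated, human):
    """Fraction of observations where the automated label matches the human label."""
    if len(automated) != len(human):
        raise ValueError("label sequences must be the same length")
    matches = sum(a == h for a, h in zip(automated, human))
    return matches / len(automated)

# Hypothetical per-observation labels from the tracker and from a human expert
automated = ["engaged", "frustrated", "engaged", "confused", "engaged", "engaged"]
human     = ["engaged", "frustrated", "engaged", "engaged",  "engaged", "engaged"]

print(agreement_rate(automated, human))  # 5 of 6 observations agree
```

A published study would typically report a chance-corrected statistic (such as Cohen’s kappa) alongside raw agreement, but the simple match rate above conveys the idea of the “more than 85 percent” figure.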
Scientists used observational data from CERT along with student self-assessments and test results to develop models that could predict how effective a tutorial session was, based on what the facial expressions of the students indicated about each student’s feelings of frustration or engagement. “This work feeds directly into the next stage of JavaTutor system development, which will enable the program to provide cognitive and emotion-based feedback to students,” says Joseph Grafsgaard, a Ph.D. student at North Carolina State University and lead author of the paper, according to the news release.
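The modeling step described above can be sketched in miniature. The toy below fits a simple one-feature linear model relating a facial-expression feature to a session-effectiveness score; all numbers are invented, and the study’s actual models combined multiple facial-movement features with self-assessments and pre/post test results.

```python
# Hypothetical sketch of the modeling idea: predict a session-effectiveness
# score from a single facial-expression feature (here, a made-up mean
# "engagement" intensity per session). Data and coefficients are illustrative.

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Invented data: mean engagement intensity per session -> effectiveness score
engagement = [0.2, 0.4, 0.6, 0.8]
effectiveness = [2.0, 3.0, 4.0, 5.0]

slope, intercept = fit_line(engagement, effectiveness)
print(slope, intercept)  # toy data is perfectly linear: slope ~5.0, intercept ~1.0
```

With a fitted model like this, the predicted effectiveness of a new session is just `slope * feature + intercept`; the published work used richer multivariate models, but the prediction principle is the same.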
Co-authors of the paper include Joseph Wiggins, an undergraduate at NC State at the time of the news release; Dr. Eric Wiebe, a professor of science, technology, engineering and math education at NC State; and Dr. James Lester, a professor of computer science at NC State. The National Science Foundation supported the research. The paper’s authors are Joseph F. Grafsgaard, Joseph B. Wiggins, Kristy Elizabeth Boyer, Eric N. Wiebe and James C. Lester, all of North Carolina State University.
In the paper’s abstract, the researchers explain that learning involves a rich array of cognitive and affective states, and that recognizing and understanding these cognitive and affective dimensions of learning is key to designing informed interventions.
Prior research has highlighted the importance of facial expressions in learning-centered affective states, but tracking facial expression poses significant challenges. This paper presents an automated analysis of fine-grained facial movements that occur during computer-mediated tutoring.
Researchers used the Computer Expression Recognition Toolbox (CERT) to track fine-grained facial movements consisting of eyebrow raising (inner and outer), brow lowering, eyelid tightening, and mouth dimpling.
Within the dataset, upper face movements such as brow lowering and eyelid tightening were found to be predictive of engagement, while mouth dimpling was a positive predictor of learning and self-reported performance.
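As a purely illustrative toy, a predictor built on these facial movements might combine per-movement intensities into a single score. The weights below are invented for the sketch; the paper reports which movements were predictive, not these coefficients.

```python
# Illustrative only: a toy scoring function over the facial movements named
# in the paper (brow lowering, eyelid tightening, mouth dimpling).
# The weights are hypothetical, not taken from the study.

def engagement_score(brow_lowering, eyelid_tightening, mouth_dimpling):
    """Weighted sum of facial-movement intensities (each assumed in [0, 1])."""
    weights = {"brow_lowering": 0.5, "eyelid_tightening": 0.3, "mouth_dimpling": 0.2}
    return (weights["brow_lowering"] * brow_lowering
            + weights["eyelid_tightening"] * eyelid_tightening
            + weights["mouth_dimpling"] * mouth_dimpling)

print(round(engagement_score(0.8, 0.5, 0.2), 2))  # 0.59
```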
The paper presents a novel validation of an automated tracking tool. You also may be interested in reading more about fine-grained facial expression recognition. The developments introduced in the paper show a next step toward automatically understanding moment-by-moment affective (emotional) states during learning as shown by facial expressions.