Computers are astounding devices, but they aren’t great listeners. They can’t lend a hand when users struggle to find a file, don’t understand what they are reading, or fall asleep studying for a test.
But that may all change someday soon.
Sidney D’Mello, an assistant professor of psychology and computer science at the University of Notre Dame, is tackling research at the intersection of cognition and emotion during complex learning and problem-solving.
Through several projects he’s leading or collaborating on, D’Mello is creating real-time computational models built from extensive lab- and school-based research, with the long-term, big-picture goal of making computers more humanlike so they can guide us in learning—at work, at school, and in daily life.
“Shout at a computer and it ignores you,” D’Mello said of current technology. “Can you imagine a layer of computing that goes beyond simply issuing instructions or gestures and, in five to 10 years, when you’re confused or frustrated with what you are doing, the system helps you along?”
Start with a Question
To arrive at that future, D’Mello and his students and collaborators started with a question: “What do students feel and think when they learn?”
Rather than quizzing students at the end of a class session to see how much information they absorbed, they’re using technology to monitor things like facial expressions, body movements, eye gaze, and speech patterns in order to understand the emotions—confusion, boredom, frustration, etc.—that, if left unchecked, can derail the learning process.
“Zoning out can also be quite detrimental,” D’Mello said. “I need to comprehend the information on one page to make an inference about what is happening on the second page, and if I break that connection through zoning out, I suddenly have a very shallow understanding of the material—which, ironically, leads to more zone outs.”
He doesn’t want to stop at understanding student affect; he wants to act on it. D’Mello’s team of postdocs, graduate students, undergraduates, and even high schoolers is testing an intervention within a computer program that captures eye movements in real time to determine when and how to re-engage students when their minds begin to wander. Tuning it is a delicate process: the system must achieve the desired results without putting students on the spot or diminishing their control over their own learning experience.
Part of this research involves a partnership with the Penn-Harris-Madison school district, located close to Notre Dame. In ninth-grade biology classes, the researchers will use eye-trackers to monitor and respond to a student’s state of attentiveness as he or she proceeds through an interactive tutoring system.
The project, supported by a three-year, $550,000 grant from the National Science Foundation and conducted in collaboration with Notre Dame psychology Associate Professor James Brockmole and Matthew Kloser of the Institute for Educational Initiatives, builds upon extensive research funded by the Office of Naval Research and the Institute of Education Sciences at D’Mello’s previous institution.
That research recorded 50 hours of expert human tutoring, transcribed each tape, coded approximately 50,000 student-tutor dialog moves, and then built a computational model of expert human tutoring. The model was then used in a virtual human tutor designed to supplement classroom biology instruction.
In the Penn-Harris-Madison biology classes, the system helps students engage more deeply with the material and enables a virtual tutor to respond to their needs by personalizing instruction to each individual student. In this new NSF project, D’Mello and the team are working to broaden that adaptivity by tracking and responding to students’ attentional states, and eventually to their emotional states as well.
“It’s completely tailored to each student, but never a substitute for the teacher,” D’Mello said. “It’s like a digital workout with the teacher as the coach.”
D’Mello and his colleagues at Notre Dame are also working on projects funded by the Gates, Raikes, Templeton, and Walton Foundations in collaboration with other top research universities, such as the University of Pennsylvania, University of Wisconsin-Madison, and University of Texas at Austin. Sample projects include an open-ended educational game for physics that tracks student emotion in computer-enabled classrooms, behavioral measures of non-intellective traits like academic diligence and frustration tolerance, and methods to fully automate the collection and coding of classroom discourse with an eye toward providing formative feedback to teachers.
Focus on the Process
Work on such projects begins in the lab with precise experiments that are recorded in detail. But what sets D’Mello’s method apart is his devotion to computationally modeling the data with signal processing and machine learning.
Once the models can accurately replicate human outputs, he and his team go back to the human side of the equation by asking more questions—a true blend of psychology and computer science. Then it all repeats.
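To make this concrete, here is a minimal, purely illustrative sketch of what such a model could look like. It is not D’Mello’s actual pipeline; the gaze features and their distributions are hypothetical, and real systems would use far richer signals. The idea is simply that per-window eye-tracking features can be fed to a supervised classifier that labels each window as attentive or mind-wandering.

```python
# Illustrative sketch (not the actual research pipeline): classifying
# "attentive" vs. "mind-wandering" windows from hypothetical gaze features.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 400  # synthetic windows per class

# Hypothetical per-window features: mean fixation duration (ms),
# off-screen gaze samples, and blink rate (per minute).
attentive = np.column_stack([
    rng.normal(250, 40, n),   # shorter, task-focused fixations
    rng.poisson(1, n),        # few off-screen glances
    rng.normal(15, 4, n),
])
wandering = np.column_stack([
    rng.normal(420, 80, n),   # long "staring through the page" fixations
    rng.poisson(6, n),        # frequent off-screen glances
    rng.normal(25, 6, n),
])

X = np.vstack([attentive, wandering])
y = np.array([0] * n + [1] * n)   # 0 = attentive, 1 = mind-wandering

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
print(f"held-out accuracy: {acc:.2f}")
```

In a deployed tutoring system, a detector like this would run continuously on the eye-tracker’s output, and a predicted mind-wandering state would trigger the kind of re-engagement intervention described above.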
That constant churn might wear on many researchers, but D’Mello said it is exactly what draws him to the work.
“The outcomes are great, but the iterative grind is the most fun as most of the learning occurs from our failures,” he said. “As long as we’re working on something novel, you never know what you will find.”
His unique position at Notre Dame—the first joint appointment between the College of Arts and Letters’ Department of Psychology and the College of Engineering’s Department of Computer Science and Engineering—is part of the reason his innovative approach has taken hold.
“He has virtually created a whole new field of study with his research on affective computing, for which he is renowned,” said Daniel Lapsley, professor and chair of the Department of Psychology. “He has had a galvanizing effect on the department.
“Sidney’s research informs the work of our cognitive psychology faculty who study memory, learning, and problem-solving; and of our educational psychologists who study affect and motivation. His interest in robotics and artificial intelligence connects our work with computer science and engineering—which just goes to show that cutting-edge, field-expanding research will be profoundly interdisciplinary, and Sidney is leading the way.”
He’s doing that not only by tackling massive concepts, but also by changing perceptions. In addition to making computers more human, D’Mello said, he wants to contribute to the science behind learning and promote greater inclusion of emotion in technology.
“People think that if they’re confused, they’re stupid—but it’s the opposite,” he said. “That’s the moment when you’re primed to learn something deeply. Your system is in a state of shock that makes you look at things carefully and to problem-solve. The perception that negative emotions are always bad simply is not the case.”
Imagine What’s Next
What’s possible if D’Mello succeeds in this far-reaching, diverse array of projects?
Imagine a car that knows you’re falling asleep and pulls itself over. Imagine technology that tracks affect and attention and knows when to intervene for an air traffic controller, pilot, or ship captain. Imagine one-on-one online, adaptive tutoring for college students that supplements every class experience.
A future like this is not far away, D’Mello said, adding that his ability to explore this exciting research agenda is due in large part to the rare joint appointment Notre Dame created for him. The University has fostered an environment of discovery and innovation, D’Mello said, which has made it easy to focus on expanding the possible.
“In whatever I do, I want to make sure my projects are far enough ahead that they aren’t obvious and are sufficiently challenging,” he said. “I want to have some ideas that may lead to something 10 or 20 years from now, rather than being constrained with what is possible today. We want to create change—not just adapt to it.”
Develop Future Leaders
His research, however, never curtails his teaching. D’Mello passes his interdisciplinary passion on to students in artificial intelligence and human-computer interaction courses, where they build empathy for and understanding of each other’s fields.
He also makes a point of bringing his graduate students into projects with other universities so they learn the importance of collaboration and remote work.
“The pace and volume of his scholarly publication is spectacular,” Lapsley said. “But it would give the wrong impression to suggest Sidney’s value lies chiefly in his prodigious research achievements. He also is a superb teacher—and the very model of the renowned and successful Notre Dame teacher-scholar who excels in both the lab and classroom, and who is a devoted mentor of undergraduates and graduate students alike.”
D’Mello’s advice to those potential students?
“Don’t be constrained by a very rigid master plan, but seek out as many interesting research opportunities as possible when you get here. You might just find your calling.”
To learn how undergraduate students can get involved with the research at D’Mello’s lab, email him at firstname.lastname@example.org.
Originally published by Fred Bauters at al.nd.edu on January 13, 2016.