News

Sensors at CMU: ClassInSight project scans lectures, analyzes behavior

Credit: Courtesy of ClassInSight

Forty classrooms on the Carnegie Mellon campus have two cameras mounted on white plastic high on the walls: one pointed toward the professor at the front, and one looking back at the students, much as the professor herself might.

Most of the cameras and sensors around Carnegie Mellon have an email address posted nearby, so passersby can ask about the kinds of information that might be collected. The note above these classroom cameras directs onlookers to the organizers of the ClassInSight project.

ClassInSight uses these cameras to generate stick figures of everyone in the class, which are then turned into data about things like where a professor’s gaze is directed, whether they hold a pause after asking a question, or how many students raise their hands.
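The project’s own code is not public, but the general idea of reducing each frame to anonymized skeleton keypoints and then computing simple classroom statistics from them can be sketched roughly as follows. The data structures, joint names, and hand-raise rule below are illustrative assumptions, not ClassInSight’s implementation:

```python
# Illustrative sketch only: the keypoint format, joint names, and the
# "wrist above shoulder" rule are assumptions, not ClassInSight's pipeline.
from dataclasses import dataclass


@dataclass
class Keypoint:
    x: float  # horizontal position in the image, in pixels
    y: float  # vertical position in pixels (smaller y = higher in the frame)


@dataclass
class StickFigure:
    """Anonymized pose for one person: a few named joints, no imagery kept."""
    head: Keypoint
    left_shoulder: Keypoint
    right_shoulder: Keypoint
    left_wrist: Keypoint
    right_wrist: Keypoint


def hand_raised(person: StickFigure) -> bool:
    """Treat a hand as raised if either wrist sits above the shoulder line."""
    shoulder_y = min(person.left_shoulder.y, person.right_shoulder.y)
    return person.left_wrist.y < shoulder_y or person.right_wrist.y < shoulder_y


def count_raised_hands(students: list[StickFigure]) -> int:
    """One of the per-lecture statistics the article describes."""
    return sum(hand_raised(s) for s in students)
```

The point of the sketch is that once the raw images are reduced to stick figures, the stored data is just coordinates and counts, which is what allows the project to discard the identifying imagery.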

A group of Carnegie Mellon human-computer interaction researchers, in collaboration with the school, the registrar, and the university’s Eberly Center for Teaching Excellence & Educational Innovation, began the project in 2015 with the goal of generating automated reports that give professors feedback on their teaching styles.

“One of the issues that college instructors have, that high school teachers often don’t, is that we are expected to teach without having a degree in education,” said Amy Ogan, Thomas and Lydia Moran Assistant Professor of Learning Science, who heads the project. Based on the collected data, ClassInSight prompts professors with strategies to encourage class engagement and discussion, which can be rare in large college lectures.

Motivated teachers can ask the Eberly Center to send someone to sit in on their lectures and provide feedback. But Ogan says that while this can be a helpful resource, it happens at most once a semester, whereas the ClassInSight system can provide daily feedback.

The project team includes Ogan, David Gerritsen, a researcher at the Eberly Center, John Zimmerman and Chris Harrison at the Human-Computer Interaction Institute, and Yuvraj Agarwal from the School of Computer Science. The project received a five-year, $2.5 million research grant in 2018 from the James S. McDonnell Foundation’s Teachers as Learners program.

The McDonnell Foundation grant focuses on improvements to K-12 education, expanding the scope of ClassInSight beyond the college classroom. Unlike in college, there is “pressure in high school for teachers’ performance to be observed and calculated as, for instance, part of whether they are hired again,” which, Ogan says, can make teachers wary of measures that might track classroom behaviors.

Though the cameras were installed in 40 classrooms in 2018, they collect data in around 10 to 15 classrooms each semester, depending on how many instructors want to participate.

A couple of weeks into a class, someone comes in to read an Institutional Review Board (IRB)-approved script, telling the students about the type of data being collected. The sensors are turned on for the duration of the lecture, collecting anonymized information about the position, gestures, and expressions of the people in the classroom.

Most images are deleted after the anonymized gesture data is recorded. However, while the image recognition algorithm is being developed, a portion of the raw images is stored to be reviewed for accuracy. The IRB mandates that the extracted data be kept for a minimum of three years, without identifying information about the professor or the students.

Students are given an opportunity to object, at which point the project will not continue in that classroom, but Ogan says few students ever have.

Grace Bae, a junior in the statistics and data science department, says she “didn’t really think twice” about the cameras being turned on in her Designing Human Centered Software class, since the data would be anonymized. She says Harrison, her professor and a ClassInSight researcher, was “open and communicative about what these cameras would do and their purpose.”

Kalvin Chang, a sophomore in the School of Computer Science, was also unbothered because of the anonymization, though he “suspect[s] some might be creeped out by the notion of AI tracking them.”

Very few people, says Ogan, have reached out to the email listed on the cameras themselves. She says this could be because people have had a discussion about the cameras in their lectures. But also, Ogan continues, “There are people counters on all of the computer labs on campus,” and students might be “getting used to” the presence of sensors.

This is the first article in a series we’re calling “Sensors at CMU,” exploring the projects behind the cameras and sensors around campus.