Intel AI tech can be used to monitor student emotions through Zoom

Intel’s “emotion AI” is designed to detect how students feel, which has drawn criticism.

What you need to know

  • Intel and Classroom Technologies are working on a set of artificial intelligence tools that can identify the emotions of students in virtual classrooms.
  • The AI feature could inform teachers when a student is confused or bored during instruction.
  • The technology has been met with pushback on moral and ethical grounds.

Intel and Classroom Technologies are working on tools that use artificial intelligence (AI) to detect the mood of children in virtual classrooms. The feature could be used to tell a teacher whether a student is bored, confused, or distracted. As reported by Tom’s Hardware (via Protocol), the AI tool has been met with resistance from many due to the ethical and moral ramifications of monitoring students and assessing their emotional states with AI.

The feature uses facial recognition, speech recognition, and other technologies to record people’s expressions. AI is then used to determine how the person in question is feeling.
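To make that flow concrete, here is a minimal, purely illustrative sketch of how a pipeline like this might fuse facial and vocal cues into a single label. The signal names, weights, and thresholds below are assumptions for illustration only and do not come from Intel or Classroom Technologies.

```python
# Illustrative sketch only: a toy pipeline that fuses hypothetical facial and
# vocal scores into one per-student label. It does NOT reflect Intel's actual
# models, features, or thresholds.
from dataclasses import dataclass

@dataclass
class FrameSignals:
    # Hypothetical per-frame scores in [0, 1] that upstream face/speech
    # models might emit; stand-ins for real classifier outputs.
    brow_furrow: float        # proxy for confusion from facial analysis
    gaze_on_screen: float     # proxy for attention from eye tracking
    speech_hesitation: float  # proxy for confusion from audio analysis

def label_emotion(signals: FrameSignals) -> str:
    """Collapse fused signals into one of the coarse labels critics object to."""
    confusion = 0.6 * signals.brow_furrow + 0.4 * signals.speech_hesitation
    if signals.gaze_on_screen < 0.3:
        return "distracted"
    if confusion > 0.5:
        return "confused"
    return "engaged"

if __name__ == "__main__":
    sample = FrameSignals(brow_furrow=0.7, gaze_on_screen=0.8, speech_hesitation=0.5)
    print(label_emotion(sample))  # prints "confused"
```

The point of the sketch is also what critics highlight: however nuanced the inputs, the output is a single word handed to the teacher.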

Some believe that it can be detrimental to label people with a single word or description. Humans experience a myriad of emotions, so simplifying someone’s state to “happy” or “bored” may be counterproductive.

Additionally, people are nuanced, and expressions are not universal. The same facial expression may mean different things for different individuals.

The software, dubbed “emotion AI,” integrates with Zoom through Class, a software product from Classroom Technologies. Since Zoom is used frequently in education, it would be simple to bring the technology into many virtual classrooms.

In addition to questions surrounding the accuracy and helpfulness of the technology, “emotion AI” critics question the morality of student surveillance.

Sinem Aslan, a research scientist at Intel, stated the intention behind the tech was not surveillance. “We did not start this technology as a surveillance system. In fact, we don’t want this technology to be a surveillance system.”

An advocacy group called Fight for the Future called on Zoom to stop using “emotion AI” in an open letter earlier this month.
