Zoom May Design Tool for Detecting Emotions During Video Calls

Zoom has become the industry leader in video conferencing technology since the onset of COVID-19 work-from-home trends. The company is now reportedly considering development of an artificial intelligence product that detects the emotions of users by examining facial expressions and voice tones.

Human rights groups are warning that the technology could easily lead to improper invasions of privacy. The Thomson Reuters Foundation reported that more than 25 organizations sent a joint letter to Zoom CEO Eric Yuan on Wednesday urging the company to forgo the plan.

Caitlin Seeley George, a director at the digital rights organization Fight for the Future, said the proposed feature would discriminate against people with disabilities and people of certain ethnicities. She said the technology would “hardcode stereotypes into millions of devices.”

George added that the tool would allow “mining users for profit” and could easily lead to “far more sinister and punitive uses.”

Zoom reported more than 200 million daily users of its product during the height of the COVID-19 pandemic in 2020.

The company has already developed software that it claims can analyze the sentiment of meetings using text transcripts. It has said it intends to explore the development of new tools that can measure human emotions in real time.

In a company blog post, Zoom said that the emotional reading technology will help salespeople and others do better work by providing instant feedback on the emotional tone of participants.

The joint letter sent this week by the human rights groups said that detecting emotional responses in a commercial setting would constitute a “violation of privacy and human rights.” The letter urged Zoom to abandon its plans to move forward with the technology.

Critics of emotional recognition tools have compared the technology to facial recognition tools now used to verify identities. That software has been shown to have significant error rates on the faces of persons of color, sometimes leading to wrongful identifications and arrests.

Esha Bhandari, deputy director of the ACLU Speech, Privacy, and Technology Project, described the proposed emotion detection artificial intelligence software as “junk science.” She said there is “no good reason” for Zoom or any company to “mine facial expressions, vocal tones, and eye movements” as part of the “creepy technology.”