
Privacy groups slam Zoom’s plans to monitor your emotions

Imagine you’re on a video call with your boss. You’re tired and stressed, but you’re trying to hold things together and be as professional as possible given the circumstances. Now imagine your boss knows all this because they have an app that is actively monitoring your mood and reporting on your emotional state in real time. They know how you’re feeling even if you don’t say it. Their software is scanning and watching.

This might sound dystopian but could soon become a reality. Companies are already working on adding emotional analysis to their apps, and it’s something video-conferencing app Zoom is also considering, although it hasn’t yet put it into action. A group of 28 privacy-focused organizations is trying to make Zoom change course and abandon the project.


The group, including Fight for the Future, the Electronic Privacy Information Center, and the ACLU, has penned an open letter asking Zoom to reconsider its plans to add emotion-tracking software to its app. Not only does the group say that the move is “based on the false idea that AI can track and analyze human emotions,” but that it would be a “violation of privacy and human rights.”

The open letter lays out five reasons Zoom’s emotion-tracking project should be dropped, claiming that the tech:

  • Is pseudoscience that is not supported by evidence, and has been disputed by academics
  • Is discriminatory and assumes all people use the same body language and voice patterns
  • Is manipulative and could be used to “monitor users and mine them for profit,” for example by playing on their emotions to sell a product. Zoom is already using artificial intelligence to monitor sales calls
  • Is punitive and might lead to people being disciplined for “expressing the wrong emotions”
  • Is a data security risk, since the information collected is deeply personal and could be hacked

The open letter also commends Zoom for its past actions, such as when it reversed its decision to block free users from its encrypted service, or when the company canceled its plans to add face-tracking features to its app.

Zoom is no stranger to privacy controversies. Earlier this year a bug caused its Mac app to surreptitiously record users’ audio without permission, and Google even banned its employees from using the app in 2020 over privacy concerns. The good news, though, is there are ways to increase your Zoom privacy and security.

The open letter’s signatories have asked Zoom to publicly commit to not including emotion-tracking software in its app by May 20, 2022. We’ve reached out to Zoom for comment and will include the company’s response when we receive a reply.

Alex Blake