Surveillance: Deployment of Emotion-Recognition Technologies in China Threatens Human Rights

Published 19 February 2021

Emotion recognition is a biometric technology that purports to analyze a person’s inner emotional state. These applications are used in a number of ways: by law enforcement authorities to identify suspicious individuals; by schools to monitor how closely students are paying attention in class; and by private companies to determine people’s access to credit. China is deploying the technology to allow the authorities to better monitor forbidden anti-regime thoughts among citizens who are subject to police interrogation or investigation.

A new report released by ARTICLE 19 shows how these technologies are currently marketed and used in China, and why the international community should take note. ARTICLE 19 argues that the wider rollout of the technology should be resisted through a careful strategy addressing the design, development, sale, and use of emotion recognition technologies.

The report emphasizes that the timing of such resistance – before these technologies become more commonplace – is important for the effective promotion and protection of people’s rights, including free access to information and free speech.

Vidushi Marda, a lawyer and digital researcher for ARTICLE 19, said: “High school students should not fear the collection of data on their concentration levels and emotions in classrooms, just as suspects undergoing police interrogation must not have assessments of their emotional states used against them in an investigation. It is imperative that we unpack how these technologies are being used and assess what impact they are likely to have internationally before they become more widespread.

“While some stakeholders claim that these technologies will improve with time, we believe that their design, development, deployment, sale and transfers should be banned due to the racist foundations and fundamental incompatibility with human rights.”

Some of the main findings from the research on deployment of emotion recognition technologies in China include the following:

·  The design, development, sale, and use of emotion recognition technologies are inconsistent with international human rights standards, particularly given how they are used to surveil, monitor, control access to opportunities, and exert power.

·  The invisible, opaque, and unfettered manner in which emotion recognition is being developed risks depriving people of their rights to freedom of expression, privacy, and dissent through protest, amongst others.

·  Emotion recognition’s pseudoscientific foundations render this technology untenable.

·  Chinese law enforcement and public security bureaus are attracted to using emotion recognition software as an interrogative and investigatory tool.

·  While some emotion recognition technology companies allege they can detect sensitive attributes, such as mental health conditions and race, none have addressed the potentially discriminatory consequences of collecting this information in conjunction with emotion data.