ARGUMENT: A Precautionary-Principle Ban on the Use of Affective Computing in Federal Law Enforcement

Published 9 August 2021

Affective computing uses algorithms to analyze bodies, faces, and voices in order to infer human emotion and state of mind. Although the technology clearly needs more research, law enforcement agencies are already starting to experiment with it to extract information, detect deception, and identify criminal behavior. Alex Engler argues that President Biden should ban affective computing in federal law enforcement before it begins to threaten civil liberties.

Affective computing is an interdisciplinary field that uses a broad set of technologies built on data and algorithms to recognize and analyze bodies, faces, and voices in order to infer human emotion and state of mind. Alex Engler writes for Brookings that many questions remain unanswered, but that affective computing has plausible, valuable applications that warrant further research.

Law enforcement agencies and companies have been experimenting with affective computing to extract personality information, detect deception, and identify criminal behavior. This, Engler writes, should be a cause for worry, because

there is insufficient evidence that these technologies work reliably enough to be used for the high stakes of law enforcement. Even worse, they threaten core American principles of civil liberty in a pluralistic society by presuming that facial movements, physical reactions, and tone of voice can be evidence of criminality. The Biden administration should publicly and unequivocally reject this perspective by banning the use of affective computing in federal law enforcement.


Despite the high stakes and lack of evidence of efficacy, there is reason to be concerned that law enforcement will implement affective computing…. While some local law enforcement has been less hesitant, it seems that federal law enforcement agencies have largely refrained from deploying affective computing systems, despite some testing and experimenting.


A wider ban on affective computing for high-stakes decisions, such as in hiring and college admissions, would be delayed by the need for congressional approval. A complete ban across all federal agencies might also undermine the technology's potential for valuable use. However, nothing prevents President Biden from issuing an executive order banning the use of affective computing in federal law enforcement agencies.

Engler notes that public policy in the United States is typically reluctant to apply the precautionary principle — that is, banning a technology before its harm is widely demonstrated — but the argument for banning affective computing in law enforcement is clear. Because affective computing is concerned specifically with emotional and personality analysis, such a ban would not prevent the use of speech-to-text transcription, weapon detection, or facial recognition.

In the future, if independent research demonstrates the validity and efficacy of a specific affective technology, an exception could easily be made. Until then, by banning affective computing in federal law enforcement, President Biden has an opportunity to set a positive example for the United States and the world on the ethical use of high-stakes artificial intelligence technologies.