Chicago should reject a proposal for private-sector face surveillance
A proposed amendment to the Chicago municipal code would allow businesses to use face surveillance systems that could invade biometric and location privacy, and violate a pioneering state privacy law adopted by Illinois a decade ago. EFF joined a letter with several allied privacy organizations explaining our concerns, which include issues with both the proposed law and the invasive technology it would irresponsibly expand.
At its core, facial recognition technology is an extraordinary menace to our digital liberties. Unchecked, the expanding proliferation of surveillance cameras, coupled with constant improvements in facial recognition technology, can create a surveillance infrastructure that the government and big companies can use to track us everywhere we go in public, including who we are with and what we are doing.
This system will deter law-abiding people from exercising their First Amendment rights in public places. Given continued inaccuracies in facial recognition systems, many people will be falsely identified as dangerous or wanted on warrants, which will subject them to unwanted—and often dangerous—interactions with law enforcement. This system will disparately burden people of color, who suffer a higher “false positive” rate due to additional flaws in these emerging systems.
In short, police should not be using facial recognition technology at all. Nor should businesses that wire their surveillance cameras into police spying networks.
Moreover, the Chicago ordinance would violate the Illinois Biometric Information Privacy Act (BIPA). This state law, adopted in 2008, is a groundbreaking measure that set a national standard. It requires companies to gain informed, opt-in consent from any individual before collecting biometric information from that person, or disclosing it to a third party. It also requires companies to store biometric information securely, sets a three-year limit on retaining information before it must be deleted, and empowers individuals whose rights are violated to enforce its provisions in court.
Having overcome several previous attempts to rescind or water down its requirements at the state level, BIPA now faces a new threat in a recently proposed municipal amendment in Chicago. The proposal to add a section on “Face Geometry Data” to the city’s municipal code would allow businesses to use controversial and discriminatory face surveillance systems pursuant to licensing agreements with the Chicago Police Department.
As the letter we joined makes clear, the proposal suffers from numerous defects.
For example, the proposal does not effectively limit authorized uses. While it prohibits “commercial uses” of biometric information, it authorizes “security purposes.” That distinction is meaningless in the context of predictable commercial security efforts, like for-profit mining and deployment of face recognition data to prevent shoplifting. The attempt to differentiate permissible from impermissible uses also rings hollow because the proposal in no way restricts how biometric data can be shared with other companies, which may not be subject to Chicago’s municipal regulation.
Contradicting the consent required by Illinois BIPA, the Chicago ordinance would allow businesses to collect biometric information from customers and visitors without their consent, by merely posting signs giving patrons notice about some—but not all—of their surveillance practices. In particular, the required notice would not need to address corporate use of biometric information beyond in-store collection. It would also fail to inform customers who are visually impaired.
The Chicago proposal also invites misuse by the police department, which would face no reporting requirements. Transparency is critical, especially given Chicago’s unfortunate history of racial profiling and other police misconduct (which includes unconstitutionally detaining suspects without access to counsel, and torturing hundreds of African-American suspects into false confessions). Even in cities with fewer historical problems, police secrecy would be incompatible with the nationwide trend toward greater transparency and accountability in local policing.
Also, despite the documented susceptibility of face recognition systems to discrimination and bias, the Chicago ordinance would not require any documentation of, for instance, how often biometric information collected from businesses may be used to inaccurately identify a supposed criminal suspect. And it would violate BIPA’s requirements for data retention limits and secure data storage.
We oppose the proposed municipal code amendment in Chicago. We hope you will join us in encouraging the city’s policymakers to reject the proposal. It would violate existing and well-established state law. More importantly, businesses working hand-in-glove with police surveillance centers should not be imposing facial recognition on their patrons—especially under an ordinance as unprotective as the one proposed in Chicago.
Shahid Buttar is Director of Grassroots Advocacy at EFF. This article is published courtesy of the Electronic Frontier Foundation (EFF).