-
Why Adding Client-Side Scanning Breaks End-To-End Encryption
Recent attacks on encryption have diverged. On the one hand, we’ve seen Attorney General William Barr call for “lawful access” to encrypted communications, using arguments that have barely changed since the 1990s. On the other hand, Erica Portnoy writes, we’ve also seen suggestions from a different set of actors for more purportedly “reasonable” interventions, particularly the use of client-side scanning to stop the transmission of contraband files, most often child exploitation imagery (CEI).
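The core technical objection is easy to see in miniature: a client-side scanner inspects message content before it is encrypted, so the “end-to-end” guarantee no longer covers everything the user sends. The Python sketch below is purely illustrative; the blocklist, function names, and reporting step are hypothetical stand-ins, not any vendor’s or proposal’s actual design.

```python
import hashlib

# Hypothetical digest set standing in for a vendor-supplied database of known
# contraband files (real proposals typically use perceptual hashes, not SHA-256).
BLOCKLIST: set = set()

def send_message(plaintext: bytes, encrypt, transmit, report) -> None:
    """Illustrative message path with a client-side scan inserted before encryption.

    encrypt, transmit, and report are stand-ins for a messenger's internals;
    the point is only where the scan sits in the pipeline.
    """
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        # The content is evaluated, and potentially reported, before it is
        # encrypted, so a party other than the two endpoints learns about it.
        report(digest)
        return
    transmit(encrypt(plaintext))
```

Even in this toy form the scan operates on plaintext, which is why critics argue the approach is incompatible with the promise of end-to-end encryption.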
-
-
Why Did Microsoft Fund an Israeli Firm that Surveils West Bank Palestinians?
Microsoft has invested in AnyVision, an Israeli startup that has developed a facial recognition technology used by Israel’s military and intelligence services to surveil Palestinians throughout the West Bank, despite the tech giant’s public pledge to avoid using the technology if it encroaches on democratic freedoms. The surveillance technology lets customers identify individuals and objects in any live camera feed, such as a security camera or a smartphone, and then track targets as they move between different feeds. The Israeli surveillance project is similar to China’s surveillance of its Uighur minority population: China is using artificial intelligence and facial recognition technology for pervasive, intrusive monitoring of the Uighurs, a Muslim group living in western China.
-
-
Why We Must Ban Facial Recognition Software Now
Facial recognition technology, once a darling of Silicon Valley with applications for policing, spying and authenticating identities, is suddenly under fire. Conservative Republicans and liberal Democrats have strongly criticized the technology. San Francisco, Oakland, Berkeley and Somerville, Mass., have barred all of their government agencies, including the police, from using it. And several Democratic candidates for president have raised deep concerns about it, with one, Senator Bernie Sanders, calling for an outright ban for policing.
-
-
AI Could Be a Force for Positive Social Change – but We’re Currently Heading for a Darker Future
Artificial Intelligence (AI) is already reconfiguring the world in conspicuous ways. Data drives our global digital ecosystem, and AI technologies reveal patterns in data. Smartphones, smart homes, and smart cities influence how we live and interact, and AI systems are increasingly involved in recruitment decisions, medical diagnoses, and judicial verdicts. Whether this emerging world is utopian or dystopian depends on your perspective.
-
-
A Safer Way for Police to Test Drug Evidence
Scientists have demonstrated a way for police to quickly and safely test whether a baggie or other package contains illegal drugs without having to handle any suspicious contents directly. The new technique can limit the risk of accidental exposure to fentanyl and other highly potent drugs that can be dangerous if a small amount is accidentally inhaled.
-
-
The FISA Oversight Hearing Confirmed That Things Need to Change
Section 215, the controversial law at the heart of the NSA’s massive telephone records surveillance program, is set to expire in December. Last week the House Committee on the Judiciary held an oversight hearing to investigate how the NSA, FBI, and the rest of the intelligence community are using and interpreting Section 215 and other expiring national security authorities. “If last week’s hearing made anything clear, it’s this: there is no good reason for Congress to renew the CDR authority,” McKinney writes, adding: “Despite repeated requests from the members of the panel to describe some way of measuring how effective these surveillance laws are, none of the witnesses could provide a framework. Congress must be able to determine whether any of the programs have real value and if the agencies are respecting the foundational rights to privacy and civil liberties that protect Americans from government overreach.”
-
-
Border Communities Inundated with Surveillance Technologies
The Electronic Frontier Foundation (EFF) recently published The Atlas of Surveillance: Southwestern Border Communities. The Atlas consists of profiles of six counties along the U.S.-Mexico border, outlining the types of surveillance technologies deployed by local law enforcement, including drones, body-worn cameras, automated license plate readers, and face recognition. The report also includes a set of 225 data points marking surveillance by local, state, and federal agencies in the border region.
-
-
AI and the Coming of the Surveillance State
Artificial Intelligence (AI) used to be the stuff of science fiction, but it is now making its presence felt in both the private and the public domains. In an important new study, The Global Expansion of AI Surveillance, Steve Feldstein of the Carnegie Endowment writes: “Unsurprisingly, AI’s impact extends well beyond individual consumer choices. It is starting to transform basic patterns of governance, not only by providing governments with unprecedented capabilities to monitor their citizens and shape their choices but also by giving them new capacity to disrupt elections, elevate false information, and delegitimize democratic discourse across borders.”
-
-
I Researched Uighur Society in China for 8 Years and Watched How Technology Opened New Opportunities – Then Became a Trap
The Uighurs, a Muslim minority ethnic group of around 12 million in northwest China, are required by the police to carry their smartphones and IDs listing their ethnicity. As they pass through one of the thousands of newly built digital media and face surveillance checkpoints located at jurisdictional boundaries, entrances to religious spaces, and transportation hubs, the image on their ID is matched to their face. If they try to pass without these items, a digital device scanner alerts the police. The Chinese state authorities described the intrusive surveillance as a necessary tool against the “extremification” of the Uighur population. Through this surveillance process, around 1.5 million Uighurs and other Muslims were deemed “untrustworthy” and have been forcibly sent for detention and reeducation in a massive internment camp system. Because more than 10 percent of the adult population has been removed to these camps, hundreds of thousands of children have been separated from their parents. Many children throughout the region are now held in boarding schools or orphanages run by non-Muslim state workers.
-
-
Faster, Smarter Security Screening Systems
By now, attendees at sporting events, visitors to office buildings, and especially frequent fliers are all quite familiar with the technologies used at security checkpoints. You arrive at the security checkpoint, check your bags, show your ID and maybe your ticket or boarding pass, throw away the coffee or water you’ve been chugging, and then wait in a long line until it is your turn to be screened. The security lines can be inconvenient. S&T and its partners are working to help security screening systems, whether at airports, government facilities, border checkpoints, or public spaces like arenas, work faster and smarter.
-
-
We Need to Ban More Emerging Technologies
With more and more innovation, there is less and less time to reflect on the consequences. To tame this onrushing tide, society needs dams and dikes. Just as has begun to happen with facial recognition, it’s time to consider legal bans and moratoriums on other emerging technologies. These need not be permanent or absolute, but innovation is not an unmitigated good. The more powerful a technology is, the more care it requires to safely operate.
-
-
Facial Recognition: Ten Reasons You Should Be Worried About the Technology
Facial recognition technology is spreading fast. Already widespread in China, software that identifies people by comparing images of their faces against a database of records is now being adopted across much of the rest of the world. It’s common among police forces but has also been used at airports, railway stations and shopping centers. The rapid growth of this technology has triggered a much-needed debate. Activists, politicians, academics and even police forces are expressing serious concerns over the impact facial recognition could have on a political culture based on rights and democracy.
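At its core, the matching step these systems perform is a similarity search over face “embeddings” extracted from camera images. The sketch below is a simplified illustration rather than any deployed product; the embedding model is omitted, and the watchlist and threshold are hypothetical stand-ins.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, in [-1, 1]."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_against_watchlist(probe: np.ndarray, watchlist: dict, threshold: float = 0.6):
    """Return the best-scoring watchlist identity if it clears the threshold.

    probe     -- embedding of a face captured from a camera frame
    watchlist -- mapping of identity -> stored embedding
    threshold -- similarity cutoff; a hypothetical value, tuned per system
    """
    best_name, best_score = None, -1.0
    for name, reference in watchlist.items():
        score = cosine_similarity(probe, reference)
        if score > best_score:
            best_name, best_score = name, score
    return (best_name, best_score) if best_score >= threshold else (None, best_score)
```

Much of the policy debate turns on parameters like that threshold: set it low and false matches of innocent passers-by rise; set it high and the system misses the people it is meant to flag.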
-
-
Data Leviathan: China’s Burgeoning Surveillance State
Classical totalitarianism, in which the state controls all institutions and most aspects of public life, largely died with the Soviet Union, apart from a few holdouts such as North Korea. The Chinese Communist Party retained a state monopoly in the political realm but allowed a significant private economy to flourish. Yet today, in Xinjiang, a region in China’s northwest, a new totalitarianism is emerging—one built not on state ownership of enterprises or property but on the state’s intrusive collection and analysis of information about the people there. Xinjiang shows us what a surveillance state looks like under a government that brooks no dissent and seeks to preclude the ability to fight back. And it demonstrates the power of personal information as a tool of social control.
-
-
How to Fight the New Domestic Terrorism
Pittsburgh, Tallahassee, Poway, Jeffersontown, and now El Paso: these American communities have been the scene since 2018 of the most lethal mass shootings connected to white supremacist ideology, but there have been many other lesser attacks and foiled plots. In the U.S., such terrorism has now eclipsed international jihadist terrorism in both frequency and severity. Clint Watts writes in the Wall Street Journal that the formula for responding to America’s white supremacist terrorism emergency is quite clear, in part because of the United States’ hard-won experience fighting jihadists from al Qaeda and its spawn, Islamic State. “We must swiftly and carefully apply the best practices of the two decades since Sept. 11, 2001, to counter this decade’s domestic terrorist threat—by passing new laws, increasing resources and enhancing investigative capabilities,” he writes.
-
-
Shoppers Targeted by Face-Recognition Cameras in “Epidemic” of Surveillance
There is an “epidemic” of facial recognition surveillance technology at privately owned sites in Britain, campaigners say. Big Brother Watch, a civil liberties group, found shopping centers, museums, conference centers and casinos had all used the software that compares faces captured by CCTV to those of people on watch lists, such as suspected terrorists or shoplifters. Privacy campaigners have criticized trials of the technology by police in London and Wales, questioning their legal basis.
-
The long view
Forensic Science Method for Firearm Identification Is Flawed
Like fingerprints, a firearm’s discarded shell casings have unique markings. This allows forensic experts to compare casings from a crime scene with those from a suspect’s gun. Finding and reporting a mismatch can help free the innocent, just as a match can incriminate the guilty. But a new study reveals mismatches are more likely than matches to be reported as “inconclusive” in cartridge-case comparisons.
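The significance of that asymmetry is easiest to see with a small worked example. The numbers below are hypothetical, chosen only to illustrate the arithmetic; they are not figures from the study.

```python
# Hypothetical counts for comparisons where the casings truly came from
# DIFFERENT firearms (ground-truth mismatches); not the study's actual data.
false_matches = 5     # examiner wrongly reported an identification
correct_elims = 60    # examiner correctly eliminated the suspect gun
inconclusives = 35    # examiner declined to reach a conclusion

# If inconclusive calls are simply dropped, the apparent error rate looks low:
rate_dropping = false_matches / (false_matches + correct_elims)

# If an inconclusive call on a true mismatch is counted as a failure to
# eliminate, the reported performance looks very different:
rate_counting = (false_matches + inconclusives) / (
    false_matches + correct_elims + inconclusives
)

print(f"dropping inconclusives: {rate_dropping:.1%}")   # 7.7%
print(f"counting inconclusives: {rate_counting:.1%}")   # 40.0%
```

How laboratories choose to score inconclusive responses therefore has a large effect on the error rates reported for the method.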