• AI Could Be a Force for Positive Social Change – but We’re Currently Heading for a Darker Future

    Artificial Intelligence (AI) is already reconfiguring the world in conspicuous ways. Data drives our global digital ecosystem, and AI technologies reveal patterns in data. Smartphones, smart homes, and smart cities influence how we live and interact, and AI systems are increasingly involved in recruitment decisions, medical diagnoses, and judicial verdicts. Whether this scenario is utopian or dystopian depends on your perspective.

  • A Safer Way for Police to Test Drug Evidence

    Scientists have demonstrated a way for police to quickly and safely test whether a baggie or other package contains illegal drugs without having to handle any suspicious contents directly. The new technique can limit the risk of accidental exposure to fentanyl and other highly potent drugs that can be dangerous if even a small amount is inhaled.

  • The FISA Oversight Hearing Confirmed That Things Need to Change

    Section 215, the controversial law at the heart of the NSA’s massive telephone records surveillance program, is set to expire in December. Last week the House Committee on the Judiciary held an oversight hearing to investigate how the NSA, FBI, and the rest of the intelligence community are using and interpreting 215 and other expiring national security authorities. “If last week’s hearing made anything clear, it’s this: there is no good reason for Congress to renew the CDR authority,” McKinney writes, adding: “Despite repeated requests from the members of the panel to describe some way of measuring how effective these surveillance laws are, none of the witnesses could provide a framework. Congress must be able to determine whether any of the programs have real value and if the agencies are respecting the foundational rights to privacy and civil liberties that protect Americans from government overreach.”

  • Border Communities Inundated with Surveillance Technologies

    The Electronic Frontier Foundation (EFF) recently published The Atlas of Surveillance: Southwestern Border Communities. The Atlas consists of profiles of six counties along the U.S.-Mexico border, outlining the types of surveillance technologies deployed by local law enforcement—including drones, body-worn cameras, automated license plate readers, and face recognition. The report also includes a set of 225 data points marking surveillance by local, state, and federal agencies in the border region.

  • AI and the Coming of the Surveillance State

    Artificial Intelligence (AI) used to be the stuff of science fiction, but is now making its presence felt in both the private and the public domains. In an important new study—The Global Expansion of AI Surveillance—Steve Feldstein of the Carnegie Endowment writes: “Unsurprisingly, AI’s impact extends well beyond individual consumer choices. It is starting to transform basic patterns of governance, not only by providing governments with unprecedented capabilities to monitor their citizens and shape their choices but also by giving them new capacity to disrupt elections, elevate false information, and delegitimize democratic discourse across borders.”

  • I Researched Uighur Society in China for 8 Years and Watched How Technology Opened New Opportunities – Then Became a Trap

    The Uighurs, a Muslim minority ethnic group of around 12 million in northwest China, are required by the police to carry their smartphones and IDs listing their ethnicity. As they pass through one of the thousands of newly built digital media and face surveillance checkpoints located at jurisdictional boundaries, entrances to religious spaces, and transportation hubs, the image on their ID is matched to their face. If they try to pass without these items, a digital device scanner alerts the police. The Chinese state authorities have described the intrusive surveillance as a necessary tool against the “extremification” of the Uighur population. Through this surveillance process, around 1.5 million Uighurs and other Muslims were deemed “untrustworthy” and have been forcibly sent to detention and reeducation in a massive internment camp system. Since more than 10 percent of the adult population has been removed to these camps, hundreds of thousands of children have been separated from their parents. Many children throughout the region are now held in boarding schools or orphanages run by non-Muslim state workers.

  • Faster, Smarter Security Screening Systems

    By now, attendees at sporting events, visitors to office buildings, and especially frequent fliers are all quite familiar with the technologies used at security checkpoints. You arrive at the security checkpoint, check your bags, show your ID and maybe your ticket or boarding pass, throw away the coffee or water you’ve been chugging, and then wait in a long line until it is your turn to be screened. The security lines can be inconvenient. S&T and partners are working to help security screening systems, whether at airports, government facilities, border checkpoints, or public spaces like arenas, work faster and smarter.

  • We Need to Ban More Emerging Technologies

    With more and more innovation, there is less and less time to reflect on the consequences. To tame this onrushing tide, society needs dams and dikes. Just as has begun to happen with facial recognition, it’s time to consider legal bans and moratoriums on other emerging technologies. These need not be permanent or absolute, but innovation is not an unmitigated good. The more powerful a technology is, the more care it requires to safely operate.

  • Facial Recognition: Ten Reasons You Should Be Worried About the Technology

    Facial recognition technology is spreading fast. Already widespread in China, software that identifies people by comparing images of their faces against a database of records is now being adopted across much of the rest of the world. It’s common among police forces but has also been used at airports, railway stations and shopping centers. The rapid growth of this technology has triggered a much-needed debate. Activists, politicians, academics and even police forces are expressing serious concerns over the impact facial recognition could have on a political culture based on rights and democracy.

  • Data Leviathan: China’s Burgeoning Surveillance State

    Classical totalitarianism, in which the state controls all institutions and most aspects of public life, largely died with the Soviet Union, apart from a few holdouts such as North Korea. The Chinese Communist Party retained a state monopoly in the political realm but allowed a significant private economy to flourish. Yet today, in Xinjiang, a region in China’s northwest, a new totalitarianism is emerging—one built not on state ownership of enterprises or property but on the state’s intrusive collection and analysis of information about the people there. Xinjiang shows us what a surveillance state looks like under a government that brooks no dissent and seeks to preclude the ability to fight back. And it demonstrates the power of personal information as a tool of social control.

  • How to Fight the New Domestic Terrorism

    Pittsburgh, Tallahassee, Poway, Jeffersontown, and now El Paso—since 2018, these American communities have been the scenes of the most lethal mass shootings connected to white supremacist ideology, but there have been many other lesser attacks and foiled plots. In the U.S., such terrorism has now eclipsed international jihadist terrorism in both frequency and severity. Clint Watts writes in the Wall Street Journal that the formula for responding to America’s white supremacist terrorism emergency is quite clear—in part because of the hard-won U.S. experience fighting jihadists from al Qaeda and its spawn, Islamic State. “We must swiftly and carefully apply the best practices of the two decades since Sept. 11, 2001, to counter this decade’s domestic terrorist threat—by passing new laws, increasing resources and enhancing investigative capabilities,” he writes.

  • Shoppers Targeted by Face-Recognition Cameras in “Epidemic” of Surveillance

    There is an “epidemic” of facial recognition surveillance technology at privately owned sites in Britain, campaigners say. Big Brother Watch, a civil liberties group, found shopping centers, museums, conference centers and casinos had all used the software that compares faces captured by CCTV to those of people on watch lists, such as suspected terrorists or shoplifters. Privacy campaigners have criticized trials of the technology by police in London and Wales, questioning their legal basis.
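    The matching step described here—comparing a face captured on CCTV against a watch list—typically reduces to a nearest-neighbor search over face "embeddings," numeric vectors produced by a recognition model. A minimal sketch of that final comparison stage, assuming embeddings have already been extracted; the vectors, names, and threshold below are purely illustrative and not drawn from any real deployment:

    ```python
    import math

    def cosine_similarity(a, b):
        """Cosine similarity between two embedding vectors (lists of floats)."""
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(y * y for y in b))
        return dot / (norm_a * norm_b)

    def match_against_watchlist(probe, watchlist, threshold=0.6):
        """Return the best watch-list identity scoring above `threshold`, else None.

        `probe` is the embedding of the face seen on camera; `watchlist`
        maps identity labels to enrolled embeddings.
        """
        best_name, best_score = None, threshold
        for name, enrolled in watchlist.items():
            score = cosine_similarity(probe, enrolled)
            if score > best_score:
                best_name, best_score = name, score
        return best_name

    # Toy 3-dimensional embeddings; real systems use 128- to 512-dim vectors.
    watchlist = {
        "person_a": [0.9, 0.1, 0.0],
        "person_b": [0.0, 0.8, 0.6],
    }
    probe = [0.88, 0.12, 0.05]
    print(match_against_watchlist(probe, watchlist))  # person_a
    ```

    The threshold is the policy-relevant knob: set it low and the system flags many innocent passers-by as watch-list matches; set it high and it misses genuine ones—which is why error rates are central to the legal debates the article describes.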

  • Military-Style Surveillance Technology Is Being Tested in American Cities

    Imagine flying a helicopter over a city at 1,000 feet. With a telescopic camera, you can make out distinctive features of the people in your frame. Surely this isn’t legal, you might say. Surely a bright line exists between snapping a photo with your phone from an airplane window and focusing a telescopic lens a few hundred feet over someone’s backyard. But it doesn’t. This is because airspace over America falls into the same legal category as other public spaces, such as sidewalks, roads, parks, and beaches—and it isn’t illegal to take photographs of private property, or private citizens, from public space. As a result, we have no expectation of privacy from above.

  • Bullet Shape, Velocity Determine Blood Spatter Patterns

    Blood spatters are hydrodynamic signatures of violent crimes, often revealing when an event occurred and where the perpetrator and victim were located at the time of the crime. Gaining a better physical understanding of the fluid dynamical phenomena at play during gunshot spatters could enhance crime scene investigations.
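    One long-established piece of the fluid mechanics behind such analysis is stain geometry: a droplet striking a surface at an angle leaves an elliptical stain whose width-to-length ratio equals the sine of the impact angle. A minimal sketch of that classic calculation, with illustrative measurements (not taken from the study above):

    ```python
    import math

    def impact_angle_degrees(width_mm, length_mm):
        """Estimate a blood droplet's impact angle from its elliptical stain.

        A droplet hitting a surface at angle theta spreads into an ellipse
        with width/length = sin(theta), so theta = asin(width / length).
        """
        if length_mm <= 0 or not (0 < width_mm <= length_mm):
            raise ValueError("need 0 < width <= length")
        return math.degrees(math.asin(width_mm / length_mm))

    # A stain 4 mm wide and 8 mm long: sin(theta) = 0.5, so theta = 30 degrees.
    print(round(impact_angle_degrees(4.0, 8.0), 1))  # 30.0
    ```

    Combining impact angles from several stains lets investigators triangulate back toward the region of origin, which is exactly the kind of reconstruction a better model of gunshot spatter dynamics would sharpen.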

  • Cities Ban Government Use of Facial Recognition

    Oakland, Calif., last week became the third city in America to ban the use of facial recognition technology in local government, following prohibitions enacted earlier this year in San Francisco and Somerville, Mass. Berkeley, Calif., is also weighing a ban. The technology is often inaccurate, especially when identifying people who aren’t white men.