  • I Researched Uighur Society in China for 8 Years and Watched How Technology Opened New Opportunities – Then Became a Trap

    The Uighurs, a Muslim minority ethnic group of around 12 million in northwest China, are required by the police to carry their smartphones and IDs listing their ethnicity. As they pass through one of the thousands of newly built digital media and face surveillance checkpoints located at jurisdictional boundaries, entrances to religious spaces and transportation hubs, the image on their ID is matched to their face. If they try to pass without these items, a digital device scanner alerts the police. The Chinese state authorities describe the intrusive surveillance as a necessary tool against the “extremification” of the Uighur population. Through this surveillance process, around 1.5 million Uighurs and other Muslims have been deemed “untrustworthy” and forcibly sent to detention and reeducation in a massive internment camp system. Because more than 10 percent of the adult population has been removed to these camps, hundreds of thousands of children have been separated from their parents. Many children throughout the region are now held in boarding schools or orphanages run by non-Muslim state workers.

  • Facial Recognition: Ten Reasons You Should Be Worried About the Technology

    Facial recognition technology is spreading fast. Already widespread in China, software that identifies people by comparing images of their faces against a database of records is now being adopted across much of the rest of the world. It’s common among police forces but has also been used at airports, railway stations and shopping centers. The rapid growth of this technology has triggered a much-needed debate. Activists, politicians, academics and even police forces are expressing serious concerns over the impact facial recognition could have on a political culture based on rights and democracy.
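
    The software described here, which identifies people by comparing a captured face against a database of records, typically works by reducing each face image to a numeric "embedding" vector and looking for the closest enrolled vector. A minimal sketch of that matching step, assuming hypothetical 128-dimensional embeddings and an arbitrary similarity threshold (none of these details come from the article):

    ```python
    # Illustrative sketch only: the embedding size, names, and the 0.6
    # threshold are assumptions, not details from the article.
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity between two face embeddings; 1.0 means identical direction."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def identify(probe: np.ndarray, database: dict, threshold: float = 0.6):
        """Return the best-matching identity, or None if nothing clears the threshold."""
        best_name, best_score = None, threshold
        for name, enrolled in database.items():
            score = cosine_similarity(probe, enrolled)
            if score > best_score:
                best_name, best_score = name, score
        return best_name

    # Toy usage: random vectors stand in for embeddings produced by a face model.
    rng = np.random.default_rng(0)
    db = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
    probe = db["person_a"] + 0.05 * rng.normal(size=128)  # noisy re-capture of person_a
    print(identify(probe, db))  # -> "person_a"
    ```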

  • Facial Recognition “Epidemic” in the U.K.

    An investigation by the London-based Big Brother Watch has uncovered what the organization describes as a facial recognition “epidemic” across privately owned sites in the United Kingdom. The civil liberties campaign group has found major property developers, shopping centers, museums, conference centers and casinos using the technology in the United Kingdom.

  • Cities Ban Government Use of Facial Recognition

    Oakland, Calif., last week became the third city in America to ban the use of facial recognition technology in local government, following prohibitions enacted earlier this year in San Francisco and Somerville, Mass. Berkeley, Calif., is also weighing a ban. The technology is often inaccurate, especially when identifying people who aren’t white men.

  • Any single hair from the human body can be used for identification

    Any single hair from anywhere on the human body can be used to identify a person. This conclusion is one of the key findings from a nearly year-long study by a team of researchers. The study could provide an important new avenue of evidence for law enforcement authorities in sexual assault cases.

  • Lawmakers raise alarm over CBP’s use of facial recognition tech on American citizens

    Lawmakers last week sent a letter to the acting DHS secretary, sounding the alarm over reports that U.S. Customs and Border Protection (CBP) is using facial recognition technology to scan American citizens, raising concerns over privacy and the potential misuse of the American people’s biometric data.

  • Finding fake fingerprints

    From a security perspective, what’s to stop a third party from “lifting” your fingerprint, creating a facsimile of its loops, whorls and arches in a piece of skin-like rubbery material, and then presenting this to the biometric device to gain access? The simple answer is nothing!

  • Dystopian Future Watch: Is San Francisco’s facial recognition ban too little, too late?

    Life just keeps creeping along, leading us step by step closer to living in a Philip K. Dick dystopian future in real time. And often, in our surveillance culture, we are willing participants, working alongside Big Brother. “Remember how fun it used to be to see facial recognition and retina scanning in sci-fi movies?” Harmon Leon asks in the Observer. “We loved it in RoboCop and Blade Runner, right? Now, many of these biometric technologies have become a nightmarish reality. Let’s take a look.”

  • Rapid DNA technology ID’ed California wildfire victims

    Amid the chaos and devastation of a mass-casualty event, medical examiners often provide closure by identifying victims in the aftermath, but their ability to do this quickly can vary depending on the size, scope, and type of disaster. Such challenges arose following the Camp Fire, which killed eighty-five people and devastated communities in Paradise, California, in the fall of 2018. S&T’s Rapid DNA technology became the first resort, providing identifying information in under two hours when dental records and fingerprints weren’t available.

  • Abundance of DNA evidence insufficient to prevent wrongful convictions

    As we enter an era in which DNA evidence is routinely used in criminal investigations, errors that led to wrongful convictions—including mistakes later corrected with DNA tests—may seem to be fading into history. This, however, is not true, says an expert.

  • Human brains vulnerable to voice morphing attacks

    A recent research study investigated the neural underpinnings of voice security, analyzing the differences in neural activity when users process different types of voices, including morphed voices. The results? Not pleasing to the ear. Or the brain.

  • Disguises are surprisingly effective

    Superficial but deliberate changes in someone’s facial appearance – such as a new hairstyle or complexion – are surprisingly effective in identity deception, new research suggests.

  • Machine learning masters the fingerprint to fool biometric systems

    Fingerprint authentication systems are a widely trusted, ubiquitous form of biometric authentication, deployed on billions of smartphones and other devices worldwide. Yet a new study reveals a surprising level of vulnerability in these systems.

  • Federal researchers complete second round of problematic tattoo recognition experiments

    Despite igniting controversy over ethical lapses and the threat to civil liberties posed by its tattoo recognition experiments the first time around, the National Institute of Standards and Technology (NIST) recently completed its second major project evaluating software designed to reveal who we are and potentially what we believe based on our body art.

  • The problem with using ‘super recognizers’ to spot criminals in a crowd

    People often say that they never forget a face, but for some people, this claim might actually be true. So-called super recognizers are said to possess exceptional face recognition abilities, often remembering the faces of those they have only briefly encountered or haven’t seen for many years. Their unique skills have even caught the attention of policing and security organizations, who have begun using super recognizers to match photographs of suspects or missing persons to blurry CCTV footage. But recent research shows that the methods used to identify super recognizers are limited, and that the people recruited for this work might not always be as super as initially thought.