• Interactive Police Line-Ups Improve Eyewitness Accuracy

    Lineups are used around the globe to help police identify criminals. Typically, witnesses examine an image of the suspect alongside ‘fillers’ – individuals who look similar but who weren’t involved in the crime. New interactive lineup software enables witnesses to rotate and view lineup faces from different angles. Researchers found that witnesses using it were more likely to accurately pick the criminal out of the lineup.

  • Face Off for Best ID Checkers

    A face matching test has been updated to find super-recognizers who can help prevent errors caused by face recognition software. Professional roles that involve face identification and could benefit from the test include visa processors, passport issuers, border control officers, police officers, and contact tracers, as well as security staff in private industry.

  • Deployment of Emotion-Recognition Technologies in China Threatens Human Rights

    Emotion recognition is a biometric technology that purports to analyze a person’s inner emotional state. These biometric applications are used by law enforcement authorities to identify suspicious individuals, and by schools to monitor how well students are paying attention in class. China is deploying the technology to allow the authorities to better monitor forbidden anti-regime thoughts among citizens who are subject to police interrogation or investigation.

  • Coercive Collection of DNA Is Unethical, Damaging to the Future of Medical Research

    The compulsory collection of DNA being undertaken in some parts of the world is not just unethical, but risks affecting people’s willingness to donate biological samples and thus contribute to the advancement of medical knowledge and the development of new treatments, say experts.

  • Face Surveillance and the Capitol Attack

    After last week’s violent attack on the Capitol, law enforcement is working overtime to identify the perpetrators. This is critical to accountability for the attempted insurrection. Law enforcement has many tools at its disposal to do this, especially given the very public nature of most of the organizing. But the Electronic Frontier Foundation (EFF) says it objects to one method reportedly being used to determine who was involved: law enforcement using facial recognition technologies to compare photos of unidentified individuals from the Capitol attack to databases of photos of known individuals. “There are just too many risks and problems in this approach, both technically and legally, to justify its use,” the EFF says.

  • Screening Masked Faces at Airports: 96% Accuracy in Recent Test

    A controlled scenario test by the DHS S&T shows promising results for facial recognition technologies to accurately identify individuals wearing protective face masks.

  • Face Masks Change the Way We Process Faces

    Ever wanted to walk over and say hello to someone, but you weren’t sure whether the person behind the mask was in fact someone you know? Researchers say you’re not alone.

  • Identity Verification in the Age of COVID-19

    Face masks have become a way of life due to the COVID-19 pandemic. We now wear them nearly everywhere we go—at grocery stores, on public transportation, in schools, at work—any situation that requires us to be around others. But what about at places that require a higher level of security, like airports?

  • Face Recognition Software Improving in Recognizing Masked Faces

    A new study of face recognition technology created after the onset of the COVID-19 pandemic shows that some software developers have made demonstrable progress at recognizing masked faces.

  • Building Pandemic Preparedness and Resilience to Confront Future Pandemics

    With the current COVID-19 pandemic revealing major gaps in national readiness, the Bipartisan Commission on Biodefense brought together members of the legislative and scientific community for a virtual discussion on the need to increase and optimize resource investments to promote changes in US policy and strengthen national pandemic preparedness and response.

  • Finger Veins-Based 3D Biometric Authentication Almost Impossible to Fool

    Biometric authentication, which uses unique anatomical features such as fingerprints or facial features to verify a person’s identity, is increasingly replacing traditional passwords for accessing everything from smartphones to law enforcement systems. A newly developed approach that uses 3D images of finger veins could greatly increase the security of this type of authentication. By combining light and sound, the approach adds depth information that makes the authentication much harder to fool.

  • Protecting Yourself against Facial Recognition Software

    The rapid rise of facial recognition systems has placed the technology into many facets of our daily lives, whether we know it or not. What might seem innocuous when Facebook identifies a friend in an uploaded photo grows more ominous in enterprises such as Clearview AI, a private company that trained its facial recognition system on billions of images scraped without consent from social media and the internet. A new research project from the University of Chicago provides a powerful new protection mechanism.

  • Face Masks’ Effect on Face Recognition Software

    Now that so many of us are covering our faces to help reduce the spread of COVID-19, how well do face recognition algorithms identify people wearing masks? The answer, according to a preliminary NIST study, is not very well. Algorithms created before the pandemic generally perform less accurately with digitally masked faces.

  • No Laughing Matter: Laughter Signature as New Biometrics

    The popular view of biometric security often invokes fingerprint readers, iris or retinal scans, and voice-activated systems. Researchers have now demonstrated how the way a person laughs might be used as a biometric. Initial tests of the approach show that a prototype laughter recognition algorithm can be 90 percent accurate.

  • Large-Scale Facial Recognition Is Incompatible with a Free Society

    In the U.S., tireless opposition to state use of facial recognition algorithms has recently won some victories. Outside the U.S., however, the tide is heading in the other direction. To decide whether to expand or limit the use of facial recognition technology, nations will need to answer fundamental questions about the kind of people, and the kind of society, they want to be. Face surveillance is based on morally compromised research, violates our rights, and exacerbates structural injustice, both when it works and when it fails. Its adoption harms individuals and makes our society as a whole more unjust and less free. A moratorium on its use is the least we should demand.