Federal researchers complete second round of problematic tattoo recognition experiments

Roman"”>A database of images captured from incarcerated people was provided to third parties—including private corporations and academic institutions—with little regard for the privacy implications. After EFF called out NIST, the agency retroactively altered its presentations and reports, including eliminating problematic information and replacing images of inmate tattoos in a “Best Practices” poster with topless photos of a researcher with marker drawn all over his body. The agency also pledged to implement new oversight procedures.

However, transparency is lacking. Last November, EFF filed suit against NIST and the FBI after the agencies failed to provide records in response to our Freedom of Information Act requests. So far, the records we have freed have revealed how the FBI is seeking to develop a mobile app that can recognize the meaning of tattoos, and the absurd methods NIST used to adjust its “Best Practices” documents. Yet our lawsuit continues, as the agencies keep withholding records and have redacted much of the documents they have produced.

Tatt-E was the latest set of experiments conducted by NIST. Unlike Tatt-C, which involved 19 entities, only two entities chose to participate in Tatt-E, each of which has foreign ties. Both the Chinese Academy of Sciences and MorphoTrak submitted six algorithms for testing against a dataset of tattoo images provided by the Michigan State Police and the Pinellas County Sheriff’s Office in Florida.

MorphoTrak’s algorithms significantly outperformed the Chinese Academy of Sciences’, which may not be surprising since the company’s software has been used with the Michigan State Police’s tattoo database for more than eight years. Its best algorithm could return a positive match within the first 10 images 72.1 percent of the time, and that number climbed to 84.8 percent if researchers cropped the source image down to just the tattoo. The accuracy in the first 10 images increased to 95 percent if they used the infrared spectrum. In addition, the Chinese Academy of Sciences’ algorithms performed poorly with tattoos on dark skin, although skin tone did not make much of a difference for MorphoTrak’s software.

One of the more concerning flaws in the research is that NIST did not document “false positives”—cases where the software says it has matched two tattoos, but the match turns out to be in error. Although this kind of misidentification has been a perpetual problem with face recognition, the researchers decided it was not useful to this study. In fact, they suggest that false positives may have “investigative utility in operations.” While they don’t explain exactly what this use case might be, from other documents produced by NIST we can infer they are likely discussing how similar tattoos on different people could establish connections among their wearers.

While Tatt-E was supposed to be limited to images collected by law enforcement, NIST went a step further, also running tests with the Nanyang Technological University Tattoo Database, which was compiled from images taken from Flickr users. With this dataset, the Chinese Academy of Sciences’ algorithms performed better, hitting as high as 99.3 percent accuracy.

No matter the accuracy in identification, tattoo recognition raises serious concerns for our freedoms. As we’ve already seen, improperly interpreted tattoos have been used to brand people as gang members and fast-track them for deportation. EFF urges NIST to make Tatt-E its last experiment with this technology.

Dave Maass is Senior Investigative Researcher at EFF. This article is published courtesy of the Electronic Frontier Foundation (EFF).