DHS’s airport biometric exit program faces budgetary, legal, technical, and privacy questions

Making matters worse, the face scanning technology used by DHS may make frequent mistakes. According to DHS’s own data, its face recognition systems erroneously reject as many as 1 in 25 travelers using valid credentials. At that rate, DHS’s error-prone face scanning system could cause 1,632 passengers to be wrongfully delayed or denied boarding every day at New York’s John F. Kennedy (JFK) International Airport alone. “What’s more, DHS does not appear to have any sense of how effective its system will be at actually catching impostors—the system’s primary goal,” the report says.
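The arithmetic behind the JFK figure can be sketched directly. Note that the daily-departure volume below is an assumption back-derived from the report’s own numbers (1,632 × 25), not an official statistic:

```python
# Illustrative arithmetic only. The JFK daily-departure figure is an
# assumption back-derived from the report's numbers, not official data.

false_reject_rate = 1 / 25        # as many as 1 in 25 valid travelers rejected
daily_departures_jfk = 40_800     # assumed outbound international travelers/day

expected_false_rejects = false_reject_rate * daily_departures_jfk
print(round(expected_false_rejects))  # 1632
```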

The privacy concerns implicated by biometric exit are at least as troubling as the system’s legal and technical problems. As currently envisioned, the program represents a serious escalation of biometric scanning of Americans, and there are no codified rules that constrain it. It may also lead to an even greater and more privacy-invasive government surveillance system. In addition, the program may hasten the development and deployment of privacy-invasive commercial technology by the airlines and technology vendors participating in biometric exit. 

For now, DHS is moving forward with this expensive program, “but Americans should consider whether it would be wiser to abandon DHS’ airport face scan program and invest those funds elsewhere,” the report says. If the program is to proceed, however, then at a minimum:

· DHS should justify its investment in face scans by supplying evidence of the problem it purportedly solves. 

· DHS should stop scanning travelers’ faces until it has completed a federally required rulemaking proceeding. 

· DHS should stop scanning the faces of American citizens as they leave the country. 

· DHS should prove that airport face scans are capable of identifying impostors without inconveniencing everyone else. 

· DHS should adopt a public policy that prohibits secondary uses of the data collected by its airport face scan program. 

· DHS should provide fairness and privacy guarantees to the airlines with which it partners. 

In addition, the report says, airlines acting in the interest of their customers should not partner with DHS in the future to conduct biometric screening of their passengers without first ensuring that DHS does all of the above, and without obtaining transparent and enforceable privacy, accuracy, and anti-bias guarantees from DHS.

What is biometric exit?

“Biometric exit” is a DHS program that uses biometric data—data about a person’s body—to verify travelers’ identities as they leave the country. DHS initially tried to use fingerprint-based verification systems. These, however, disrupted traveler flow through airport terminals and were time-consuming and labor-intensive for the personnel administering them. Under the latest iteration of the program, DHS has turned to face recognition.

Prior to departure of an outbound international flight, DHS prepopulates its “Traveler Verification Service” (TVS) with biometric templates of all travelers expected on the flight. Upon reaching the airport gate to board the plane, each traveler is then asked to stand for a photo in front of a camera. The camera transmits the traveler’s in-person photo to TVS, which compares it against the biometric templates on file. TVS then either confirms that the traveler’s face matches the on-file biometric template(s) for an expected traveler—and creates a “biometrically verified exit record” for her—or rejects the traveler’s face as a “non-match.”

If the traveler is rejected by the system, her credentials will be checked manually by a Customs and Border Protection (CBP) agent, or she will be subjected to another biometric check, such as a fingerprint comparison.
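The match/non-match decision described above can be sketched as a one-to-many template comparison. This is a hypothetical illustration of the general technique, not DHS’s actual TVS implementation: the use of cosine similarity, the threshold value, and the function names are all assumptions.

```python
import math

def cosine_similarity(a, b):
    """Similarity between two biometric template vectors (illustrative)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def verify_traveler(live_template, flight_gallery, threshold=0.9):
    """1:N check: compare the gate photo's template against the templates
    prepopulated for everyone expected on the flight. A 'match' would
    create a biometrically verified exit record; a 'non-match' falls back
    to a manual CBP document check or a fingerprint comparison."""
    best = max((cosine_similarity(live_template, t) for t in flight_gallery),
               default=0.0)
    return "match" if best >= threshold else "non-match"

# Toy templates: the live capture closely resembles the first on-file template.
print(verify_traveler([1.0, 0.0], [[0.99, 0.05], [0.0, 1.0]]))  # match
```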

Understanding face recognition mistakes

Face recognition technology is not perfect. If it were, it would always correctly accept—and clear for boarding—each traveler using her own valid credentials (a “True Accept”) and would always correctly reject any traveler using fraudulent credentials, for example, someone else’s identification documents (a “True Reject”).

In reality, face recognition systems make mistakes on both of those fronts. In the case of a “False Reject,” a system fails to match an airport photo of a traveler’s face to the photo on the traveler’s own valid identification documents. A system may mistakenly reject a traveler flying under his own identity, for example, because his photo on file was taken four years prior and his appearance has changed since then.

In contrast, in the case of a “False Accept,” the system mistakenly matches an airport photo of a traveler’s face to someone else’s photo. For example, a face scanning system may mistakenly accept an impostor fraudulently presenting someone else’s travel credentials as her own.
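The four outcomes described above form the standard confusion matrix for a verification system. A minimal sketch (the function name is illustrative) labels each combination of ground truth and system decision:

```python
def outcome(is_genuine: bool, system_accepts: bool) -> str:
    """Label the four possible results of an identity-verification check.

    is_genuine:     True if the traveler is using her own valid credentials.
    system_accepts: True if the system matched her face to the credentials.
    """
    if is_genuine and system_accepts:
        return "True Accept"      # valid traveler correctly cleared
    if is_genuine and not system_accepts:
        return "False Reject"     # valid traveler wrongly turned away
    if not is_genuine and system_accepts:
        return "False Accept"     # impostor wrongly cleared
    return "True Reject"          # impostor correctly caught

print(outcome(is_genuine=True, system_accepts=False))  # False Reject
```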

— Read more in Harrison Rudolph et al., Not Ready for Takeoff: Face Scans at Airport Departure Gates (Georgetown Law Center on Privacy & Technology, 21 December 2017)