U.S. Customs to use facial recognition software to photograph, identify travelers (and you can't say no)


My passport picture is terrible — the photographer somehow captured me mid-blink and flashing my most rabid-looking smile. So when I'm returning to the United States, getting eyeballed by a Border Control agent, I try to lower one eyelid so I look just as unappealing in real life (it's not hard to do). But apparently getting the once-over by an agent isn't enough for Customs and Border Protection. They are quietly testing out a pilot program that takes pictures of travelers and uses facial recognition software to match them with their passport photographs.

According to documents obtained by Vice's Motherboard, the pilot program, cleverly called the 1:1 Facial Recognition Air Entry Pilot, is currently being tested at Washington, D.C.'s Dulles International Airport and will last 19 months, although the CBP will only be snapping photographs for the first 90 days. Travelers will be selected at random and photographed (those chosen cannot opt out), and the snapshots will be compared with the picture stored on the chip embedded in their passports. If the "match confidence score" is low, meaning the person might be using a false identity or a passport that isn't theirs, officers can "take secondary action."

The CBP says that the photographs will only be shared with the Department of Homeland Security and that the pictures will be deleted after the pilot program ends in 19 months. In addition, the photographs will only be tagged with the time and date that they were snapped, not connected to the travelers' names or other identifying details. A CBP spokesperson told Motherboard:

The technology is a stand-alone system and will not communicate with any other parties, databases or systems. CBP remains committed to protecting the privacy of all travelers.

Despite those claims, civil liberties activists shared their concerns with Motherboard, worried that the program could pose privacy risks or, worse, give the government a way to start collecting information on citizens who are not breaking (and have not broken) the law. Jake Laperruque of the Center for Democracy and Technology said:

Here we have a program where individuals are not suspected of wrongdoing and are engaged in routine behavior and they are being required to submit a piece of biometric data that could identify them later and that’s going to be retained. That’s definitely a dark road to be going down with a lot of potential for abuse.

So say "Cheese," I guess. You don't have a choice.