Passengers leaving the US will eventually have to pass facial recognition scanners at all US international airports

Visa holders looking to board international flights out of the US will soon be required to pass a facial recognition check at all US international airports, The Verge reports. Facial recognition systems have been in use at a handful of airports around the globe since 2015. As part of his first 100-day agenda, Donald Trump has expedited a system that will track every outgoing passenger from the US. The system is currently being tested on a flight from Atlanta to Tokyo, with wider adoption to come in the summer.

US Customs and Border Protection has access to photos of all visa holders and those who have entered the US legally. The new system, called Biometric Exit, will scan each passenger’s face before they board their plane out of the US. If they do not appear in the CBP database, that means they may have entered the country illegally. Since photos and other biometric records are kept for all visa holders entering the country, the Biometric Exit system can also be used to determine if a passenger has overstayed their visa.
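The check described above boils down to two steps: match the departing passenger's face against the database, then compare today's date with the visa on file. The sketch below is a purely illustrative toy, assuming made-up record fields (`template`, `visa_expires`), a hypothetical similarity threshold, and cosine similarity as the face-matching metric; it is not CBP's actual system.

```python
from datetime import date

MATCH_THRESHOLD = 0.6  # assumed cutoff for declaring a face match

# Toy stand-in for the CBP database: a face template (feature vector)
# plus the visa expiry date on file. Records are invented for illustration.
CBP_RECORDS = [
    {"id": "V-1001", "template": (0.1, 0.9, 0.3), "visa_expires": date(2017, 9, 1)},
    {"id": "V-1002", "template": (0.8, 0.2, 0.5), "visa_expires": date(2016, 1, 15)},
]

def similarity(a, b):
    """Cosine similarity between two face feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sum(x * x for x in a) ** 0.5
    norm_b = sum(x * x for x in b) ** 0.5
    return dot / (norm_a * norm_b)

def biometric_exit_check(scan, today):
    """Return a status string for a departing passenger's face scan."""
    best = max(CBP_RECORDS, key=lambda r: similarity(scan, r["template"]))
    if similarity(scan, best["template"]) < MATCH_THRESHOLD:
        # No record on file: the article notes this may flag illegal entry.
        return "no record: possible illegal entry"
    if today > best["visa_expires"]:
        return f"{best['id']}: visa overstay"
    return f"{best['id']}: cleared to depart"
```

For example, a scan matching the first record before its expiry date would return `V-1001: cleared to depart`, while a match against an expired record would flag an overstay.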

Congress mandated a Biometric Exit system back in 1996, but it had gained little attention until now. President Trump’s executive order on immigration from the seven Muslim-majority nations also included a clause meant to speed up the adoption of this Biometric Exit program. The rollout was originally planned for early 2018 but will likely see major installations this year.

The new plan isn’t necessarily being marketed as a security tool, since it doesn’t stop anyone from entering the country in the first place; it only checks a passenger’s status as they leave. Supporters of the plan see it as a simple verification system that does not hinder travel for those following the law. Critics, on the other hand, are worried about potential bias in the algorithms that could create a higher rate of false positives for African Americans and women. Because these algorithms are trained largely on faces of white subjects, their results can be 5-10% less accurate for other races.