From the TSA unnecessarily searching Black women’s hair to its profiling of Muslims, airport security can quickly turn from a simple headache into a reminder that you’re seen as an inherent threat. Now, airports are introducing a new, invasive security measure that is raising alarms.

The US Customs and Border Protection program known as Biometric Exit is now in use at departure gates in 17 airports across the United States, as reported by CNET. The program uses facial recognition technology to photograph travelers and “verify their identity.”

The agency says it holds onto the photos of U.S. citizens only “until their identities have been verified” and everyone else’s for 14 days. The photos of every non-U.S. citizen are also sent to the Department of Homeland Security’s Automated Biometric Identification System (IDENT), which can store information for 75 years.

By 2021, the system will be used to scan 97 percent of all travelers leaving the country, according to CNET. Meanwhile, facial recognition is also being tested on cameras elsewhere throughout airports, not just at departure gates.

Wherever it shows up, the program should raise concerns.

One of the biggest issues with the use of facial recognition technology is that it’s not as accurate as people assume. Facial recognition is notoriously bad at reading Black women’s faces, for example: MIT’s Gender Shades study found error rates of nearly 35 percent for darker-skinned women, compared with less than 1 percent for lighter-skinned men. Plus, taking pictures of people’s faces without their knowledge is simply invasive.

There are some opt-out measures, but, as the Electronic Privacy Information Center (EPIC) notes, CBP continues to change them; there is no formal opt-out procedure in place.

There are also questions about the program’s legality. CBP claims it has the right to collect biometrics, but, according to CNET, the ACLU, EPIC, the Electronic Frontier Foundation, and others say that no law authorizes CBP to collect biometric information on US citizens.

Airports are already difficult for pretty much anyone who isn’t white. Muslims, for example, face continued harassment and profiling when passing through security — and this includes Black Muslims. Implementing facial recognition technology that has historically failed to read Black people’s faces poses a unique threat to communities who exist at multiple, vulnerable intersections.

How facial recognition programs are trained and tested is also sketchy. New research by Os Keyes, Nikki Stevens, and Jacqueline Wernimont, published in Slate, revealed that the National Institute of Standards and Technology (NIST), which maintains the Facial Recognition Verification Testing program, has used pictures of victims of child pornography, immigration records, and photos of dead arrestees to test facial recognition software.

That alone should give pause in any conversation about facial recognition tech, especially when it’s used by the government, and it’s important to note that those images were used without consent.

When it comes to the Biometric Exit program and plans to implement facial recognition tech throughout airports, the solution isn’t to train these programs to better recognize Black people or other vulnerable communities. Instead, the question is whether these government biometric programs should be allowed to exist at all.