Across the United States, local governments have been debating facial recognition. Last week, San Francisco banned government use of the technology, while cities like Oakland, California, and Somerville, Massachusetts, are exploring similar measures.

Each of those cities began looking closely at facial recognition due to the danger it poses to Black and brown communities. San Francisco’s own bill stated, “The propensity for facial recognition technology to endanger civil rights and civil liberties substantially outweighs its purported benefits, and the technology will exacerbate racial injustice and threaten our ability to live free of continuous government monitoring.”

It’s important that local communities are starting conversations about facial recognition tech and the harms that come with it, but the conversation also needs to happen at a higher level. After all, the risks that civil rights and privacy advocates highlight around facial recognition include continuous, mass surveillance and the violation of people’s constitutional rights.

Today, the House Oversight and Reform Committee held what may be the first in a series of hearings on facial recognition technology. The hearing examined not only the use of facial recognition by government and commercial entities, but also the need for oversight. Many of the witnesses present called on Congress to put a moratorium on the use of facial recognition.

Neema Singh Guliani, Senior Legislative Counsel for the American Civil Liberties Union, told the Committee, “The ACLU urges Congress to take action to prevent federal agencies, including the FBI, from using face recognition for criminal and immigration enforcement purposes until Congress fully debates and passes legislation dictating what, if any, uses are permissible.”

Part of the issue is that there are no federal regulations around facial recognition technology. That means there are no standards for these programs, which can lead to improper use. Clare Garvie, a Senior Associate at Georgetown University’s Center on Privacy & Technology, testified that police can insert “garbage data” into algorithms to force matches.

For example, police in Washington County, Oregon, are running artist sketches through Amazon’s Rekognition to get matches, and a Georgetown report authored by Garvie revealed that the New York Police Department uses altered images to do the same.
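To make concrete what “running an image through Rekognition” involves, here is a minimal sketch using AWS’s boto3 SDK for Python. The collection name, file path, and similarity threshold are illustrative assumptions, not details drawn from the Georgetown report or the hearing.

```python
# Illustrative sketch: submitting a probe image to Amazon Rekognition's
# face-search API. All names below are hypothetical placeholders.
import boto3

rekognition = boto3.client("rekognition", region_name="us-west-2")

# Read the probe image (a photograph or, as reported, an artist sketch).
with open("probe_image.jpg", "rb") as f:
    image_bytes = f.read()

# Search an existing face collection for the closest matches to the probe image.
response = rekognition.search_faces_by_image(
    CollectionId="example-mugshot-collection",  # hypothetical collection name
    Image={"Bytes": image_bytes},
    FaceMatchThreshold=80,  # minimum similarity score (0-100) to count as a match
    MaxFaces=5,             # return at most five candidate matches
)

# Each match carries a similarity score; lower thresholds return more candidates.
for match in response["FaceMatches"]:
    print(match["Face"]["FaceId"], match["Similarity"])
```

The key design point critics raise is that nothing in the API stops an agency from lowering the match threshold or feeding in a sketch or composited photo, which is the kind of “garbage data” practice Garvie described.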

In addition, facial recognition has been found to be inaccurate, especially when it comes to women and people of color. Joy Buolamwini, the Founder of the Algorithmic Justice League, testified about her research on facial recognition at the M.I.T. Media Lab.

“In one test, Amazon Rekognition even failed on the face of Oprah Winfrey, labeling her male,” Buolamwini said. “Personally, I’ve had to resort to literally wearing a white mask to have my face detected by some of this technology. Coding in white face is the last thing I expected to be doing at M.I.T., an American epicenter of innovation.”

During the hearing, Rep. Alexandria Ocasio-Cortez asked Buolamwini directly who facial recognition programs are most effective on. Buolamwini’s answer: white men. And when asked who primarily designs the programs, Buolamwini said, “Definitely white men.”

Ocasio-Cortez later tweeted, “When tech companies develop algorithms that automate the assumptions of people from one demo, they start to automate subconscious bias. When those flawed algorithms get sold to law enforcement, it can be disastrous for anyone that doesn’t look like most Silicon Valley engineers.”

However, the core problem extends beyond inaccurate algorithms and their inability to read anyone who isn’t white. Guliani noted that, even if the technology were accurate, “it is more likely to be used against communities of color, which are disproportionately subject to over policing, including increased stops, arrests, and uses of force.”

Guliani also testified that, due to disparities resulting from a history of anti-Black policing, Black people would be overrepresented in the mugshot photos that some facial recognition systems scan for matches.

Despite the lack of regulation, facial recognition has been deployed across the country. The technology has developed to the point where cities like Detroit and Chicago can now access “real-time” face surveillance.

Advocates can push local governments to protect vulnerable communities, but the federal government needs to take a position, as well.