Sometimes, you can easily spot a deepfake: a weird delay in the audio, or something that's just off about the facial movements. But as deepfakes become more realistic and accessible, the potential for harm grows.
Now, members of Congress want to step in before matters get worse. Representative Yvette Clarke (D-NY) has introduced the DEEPFAKES Accountability Act in the House, with the aim of combatting “the spread of disinformation through restrictions on deep-fake video alteration technology.”
The bill arrives as the technology matures and as manipulated internet content shapes the views of millions.
Recently, a deepfake of Facebook CEO Mark Zuckerberg made rounds on Instagram. The video’s caption says, “Mark Zuckerberg reveals the truth about Facebook and who really owns the future.”
The deepfake put Facebook's earlier decision to leave up an altered video of Nancy Pelosi to the test. Many watched to see whether Facebook would remove the Zuckerberg deepfake; ultimately, the company did not.
“We will treat this content the same way we treat all misinformation on Instagram. If third-party fact-checkers mark it as false, we will filter it from Instagram’s recommendation surfaces like Explore and hashtag pages,” a company spokesperson told The Verge.
Facebook has been under fire for its inability to control the spread of fake news on its platforms. Despite its efforts, getting rid of misinformation and fake accounts has been a big game of whack-a-mole.
It makes sense that the House is jumping in to try to put a stop to deepfakes, since the technology could be especially dangerous around election time. However, the House's current proposal may not be enforceable.
With election season quickly approaching, it’s important to limit the ways misinformation can spread. Both Congress and social media platforms will have to find policies that cover deepfakes before they do even more damage.