Misinformation continues to be a huge problem online, especially on social media platforms. Facebook previously tried to tackle it by introducing rules against groups and ads that deliberately spread false information about vaccines and political opponents. Unfortunately, that wasn’t enough.

Now, Facebook has introduced updates that impose harsher penalties on groups and individuals spreading misinformation across its platforms.

In a blog post, Facebook said it plans to reduce the reach of groups that “repeatedly share misinformation”. The company will also add a “Click-Gap” signal to the News Feed to ensure people see less low-quality content.

Lastly, Facebook is expanding its partnership with the Associated Press, which serves as a third-party fact-checker; the AP will take on a larger role in debunking false and misleading content on the platform.

The moves come as Facebook gears up for another turbulent election season, finds new ways to combat vaccine misinformation, and continues to ban pages associated with conspiracy theorist Alex Jones.

Facebook’s election issues have been well documented. Ahead of the November 2018 midterm elections, the company established a “war room” to remove misinformation and counter election interference on the platform.

However, in December 2018, the NAACP led a protest against Facebook and its properties after it was revealed that Russian interference efforts had disproportionately targeted African American users.

To combat anti-vaxxers, Facebook pulled ads containing vaccine misinformation and disabled the accounts of repeat violators. In addition, groups and pages that promote anti-vaccine talking points are subject to reduced rankings and are removed from suggestions and search results.

Facebook has to take these extra steps because the solution isn’t as simple as deleting pages; people will just create duplicates. Back in February, Facebook deleted 22 additional pages associated with Alex Jones, who routinely promotes conspiracy theories, such as calling the Sandy Hook shooting and the 9/11 attacks hoaxes.

Facebook’s critics say the company is moving too slowly to fight misinformation; still, its latest announcement represents a larger stride toward reducing harm and correcting its previous blunders.