The far-right founder of the English Defence League, Tommy Robinson, has officially been banned from both Facebook and Instagram.
In a blog post, Facebook said Tommy Robinson, whose real name is Stephen Yaxley-Lennon, broke community standards by engaging in “organized hate.”
“Tommy Robinson’s Facebook Page has repeatedly broken these standards, posting material that uses dehumanizing language and calls for violence targeted at Muslims,” Facebook wrote. “He has also behaved in ways that violate our policies around organized hate. As a result, in accordance with our policies, we have removed Tommy Robinson’s official Facebook Page and Instagram profile.”
Facebook went on to say, “This is not a decision we take lightly, but individuals and organizations that attack others on the basis of who they are have no place on Facebook or Instagram.”
Robinson’s Facebook ban comes nearly a year after Twitter permanently suspended his account. That leaves YouTube as the only major platform on which Robinson can still reach his audience and maintain an online presence.
Although Robinson’s ban is a win for those who advocated for it, it was a long time coming, and it illustrates how social media platforms have long struggled to control hate speech.
For example, Alex Jones and his InfoWars accounts were banned from major platforms including Twitter, Facebook, and YouTube last fall, but Jones was able to work around his Facebook ban. Even after InfoWars was removed, NewsWars, another site run by Jones’ company, remained online, according to the Washington Post.
Although Facebook cites “organized hate” as the reason for Robinson’s ban, the platform continues to struggle with the problem. The LA Times recently found that Facebook identifies which users are interested in Nazis and lets advertisers target them specifically. According to the LA Times, experts say that practice “runs counter to the company’s stated principles and can help fuel radicalization online.”
Oren Segal, director of the Anti-Defamation League’s center on extremism, told the LA Times, “What you’re describing, where a clear hateful idea or narrative can be amplified to reach more people, is exactly what they said they don’t want to do and what they need to be held accountable for.”
Removing prominent figures such as Tommy Robinson is one step toward ensuring hateful ideas cannot be amplified. At the same time, as its ad-targeting practices demonstrate, Facebook must also be held accountable for the deeper structural problems that allow these messages to spread in the first place.