On April Fools’ Day, r/Games announced it would shut down to shed light on bigotry in the online gaming community.

Given the date, the closure of a subreddit with 1.7 million subscribers may have seemed like a joke. However, the moderating team laid out their decision to “take things a little more seriously and shed some light on a growing, pervasive issue that has affected the community.”

The moderators wrote:

Though certain memes (such as “gamers rise up”) surrounding gaming are largely viewed as a humorous interpretation of a mindset, at the core of the humor is a set of very serious issues that affect all gaming enthusiasts. By showing disdain or outright rejecting minority and marginalized communities, we become more insular. In this, we lose out on the chance to not only show compassion to these people, but also the chance to grow our own community and diversify the demographics of those involved in it. Whether it’s misogyny, transphobia, homophobia, racism or a host of other discriminatory practices, now is the time to stymie the flow of regressive ideas and prevent them from ever becoming the norm.

According to the moderators, such content isn’t an “infrequent occurrence”; it appears on a daily basis. They went on to highlight specific examples of transphobia, homophobia, Islamophobia, racism, and other discriminatory behavior they’d seen in the community.

The moderators closed out their post with a list of charities, including LGBT+, POC-focused, and women’s health organizations, writing, “These folks have made it their mission to represent and benefit those who still face their own challenges, obstacles and prejudices, and any assistance they can get is another step forward for their cause.”

Later on, the moderators made a second post explaining that they chose April Fools’ Day specifically because the subreddit typically sees higher traffic that day. They also doubted any major news would drop, which would give them “more leeway in shutting down the subreddit for the day.”

Bigotry in online gaming communities has been an issue for years. It was further documented after the Christchurch shooting, when Valve, the company behind the Steam gaming platform, had to remove over 100 user profiles praising the shooter.

Of course, it will take more than this to confront the ways online gaming communities foster bigotry, but acknowledging that the problem exists is a start.