Facebook is taking a much firmer stance on supporters of the QAnon conspiracy theory. The platform is now removing pages, accounts, and groups pertaining to QAnon across both Facebook and Instagram.

Facebook Takes Down QAnon

QAnon is a far-right conspiracy theory that first emerged in 2017. According to the theory, US President Donald Trump is fighting against a malevolent "deep state" that controls the US government and society as a whole. The movement is led by an anonymous figure, Q, who claims to have access to classified government information.

Facebook started taking action against QAnon and other militarized social movements in August 2020. In total, Facebook took down over 1,500 QAnon-related pages and groups that promoted violence.

But now, Facebook is tightening its restrictions on content with ties to QAnon. In an About Facebook blog post, the platform announced that it will start removing all representations of QAnon across Facebook and Instagram. Facebook noted that the ban includes any QAnon groups, accounts, and pages, "even if they contain no violent content."

To speed up the removal process, Facebook isn't just relying on user reports to get rid of QAnon---it's using its Dangerous Organizations Operations team to identify and eliminate QAnon-related content.

This update comes after Facebook observed QAnon's negative impact on the network. Facebook has found that the movement doesn't just incite violence; it also spreads false information that causes "real world harm."

As an example, Facebook stated that QAnon groups wrongly blamed certain groups for the California wildfires, claims that allegedly interfered with public safety efforts. QAnon also promotes COVID-19 misinformation across the platform, all while claiming to fight against child-trafficking.

As such, Facebook has begun directing users towards credible child safety resources, and has also removed health groups from recommendation lists.

Facebook also found that QAnon's ideas change rapidly, further contributing to confusion, stating:

Additionally, QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement.

Even still, Facebook might not be able to tackle every new QAnon threat that crops up. Facebook says that it "expects renewed attempts" by users to reestablish QAnon communities, and that it will update its "policy and enforcement as necessary."

But despite banning all kinds of QAnon content, Facebook is still facing backlash for its response. Some people are criticizing the platform for acting too slowly to combat QAnon, as Facebook allowed QAnon-related groups to exist and grow from 2017 onward.

Was Facebook Too Late Dealing With QAnon?

QAnon has had ample time to build its base, and Facebook didn't take meaningful action against it until now. If QAnon can no longer exist on Facebook, its supporters will likely turn to another social media outlet to spread misinformation and harmful content.