Facebook has finally had it with the crazy cracker tinfoil hat brigade. The company announced this morning (AEST) that it will remove any Facebook Pages, Groups and Instagram accounts representing QAnon. The policy applies even if those pages contain no violent content. 

Today’s news follows through on an August update that targeted potential violence and imposed a series of restrictions to limit the reach of other Pages, Groups and Instagram accounts associated with the movement.

In the month after that announcement, Facebook says it removed over 1,500 QAnon Pages and Groups containing discussions of potential violence, and over 6,500 Pages and Groups tied to more than 300 Militarised Social Movements.

Now the company is taking the extra step to silence the conspiracy movement altogether on its platforms. 

“We are starting to enforce this updated policy today and are removing content accordingly, but this work will take time and need to continue in the coming days and weeks. Our Dangerous Organisations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports.”

Facebook describes its Dangerous Organisations Operations team as specialists who study and respond to new evolutions in violating content from this movement, and says its internal detection has provided better leads for identifying that content than sifting through user reports.

“We’ve been vigilant in enforcing our policy and studying its impact on the platform, but we’ve seen several issues that led to today’s update. For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real-world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public.”

Additionally, according to a Facebook post this morning, “QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement.”

Finally, the company also notes that this is not the first update to this policy.

“We began directing people to credible child safety resources when they search for certain child safety hashtags last week — and we continue to work with external experts to address QAnon supporters using the issue of child safety to recruit and organise. We expect renewed attempts to evade our detection, both in behaviour and content shared on our platform, so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary.”
