Facebook executives have downplayed reports that its content review policy has been compromised by commercial interests, in response to the broadcast of a new documentary alleging just that.

An investigation by British broadcaster Channel 4 claims Facebook has intentionally left toxic content on the platform because its popularity generates revenue for the social media giant.

A journalist went undercover in Ireland and worked for CPL Resources, a contractor to whom Facebook outsources some of its content moderation. The investigation reportedly uncovered a practice of “shielded review”, in which content from far-right groups that would ordinarily see a page deleted almost immediately was instead left online at the discretion of Facebook employees, despite being flagged for review.

In the documentary, Inside Facebook: Secrets of a Social Network, a content moderator tells the undercover reporter the pages for right-wing group Britain First were left up despite repeatedly breaking Facebook’s rules because “they have a lot of followers so they’re generating a lot of revenue for Facebook”.

The investigation also alleges that content moderators were trained to ignore visual evidence that a Facebook user was under the age of 13. Facebook’s policy requires users to be at least 13 years old.

Facebook has denied the platform’s revenue system incentivises allowing underage users or delaying the removal of toxic content.

“It has been suggested that turning a blind eye to bad content is in our commercial interests. This is not true,” said Facebook Vice President of Global Policy Management, Monika Bickert.

However, she conceded the allegations do raise questions over the social media giant’s review policies, which were recently updated.

“This week a TV report on Channel 4 in the UK has raised important questions about those policies and processes, including guidance given during training sessions in Dubai,” Bickert said in a company blog post.

“It’s clear that some of what is in the program does not reflect Facebook’s policies or values and falls short of the high standards we expect.

“We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention.”

Facebook’s updated response

Following the airing of the Channel 4 investigation, Bickert updated her post to categorically deny that any extreme group had been given special protection, and to argue that the “cross-checking” which kept offending pages live was a necessary review step.

“We want to make clear that we remove content from Facebook, no matter who posts it, when it violates our standards. There are no special protections for any group — whether on the right or the left.”

“Britain First was a cross-checked Page. But the notion this in any way protected the content is wrong. In fact, we removed Britain First from Facebook in March because their Pages repeatedly violated our Community Standards.”
