Facebook is working to catch child sexual exploitation content it may have missed because of a technical issue introduced by changes the firm made late last year.
Between July and September the tech giant adjusted its detection technology, which allowed it to find and remove a large amount of old violating content, resulting in a spike in enforcement.
In mid-November the company made changes to its media-matching tools and later discovered a technical issue in the implementation.
“When that error was discovered, we fixed it and are in the process of going back to retrospectively remove all of that content that was missed,” said Guy Rosen, Facebook’s vice president of integrity.
Content actioned in this category decreased from 12.4 million pieces in the third quarter, when the changes were first introduced, to 5.4 million in the final three months of 2020.
The latest data for child nudity and sexual exploitation is the lowest count since Facebook began publishing progress in mid-2018 in its regular Community Standards Enforcement Reports.
The vast majority of offending material is removed by the company’s AI systems before users report it.
Fewer than 0.05% of views are estimated to be of content that violated standards against child nudity and sexual exploitation.
Facebook also revealed that it addressed 200,000 fewer pieces of child sexual content on Instagram compared with the previous quarter, saying “fluctuations in manual review capacity” due to the pandemic were to blame.
The social network has previously said its ability to review content has been impacted by Covid-19 and it is prioritising the most harmful content.
However, Facebook reported improvements in other enforcement areas, particularly bullying and harassment: it reviewed 6.3 million pieces between October and December, up from 3.5 million in the previous three months, due in part to updates to the technology used to detect comments.
On Instagram, the number of pieces the firm reviewed almost doubled, from 2.6 million to 5 million.
Prevalence of hate speech on Facebook fell to about seven or eight for every 10,000 views of content, while violent and graphic content dropped from 0.07% to 0.05% and adult nudity from 0.05-0.06% to 0.03-0.04%.
And 6.4 million pieces of organised hate content were inspected, up from 4 million.