
Facebook launches investigation into moderation of disturbing content

An undercover reporter worked at Cpl Resources in Dublin, Facebook’s largest centre for Ireland and UK content (Dominic Lipinski/PA)

Facebook has launched an investigation into a contractor after moderators there were instructed not to remove extreme, abusive or graphic content from the platform.

Company executives said they will review and change its processes and policies to make the social media site safer and more secure.

Niamh Sweeney, Facebook Ireland head of public policy, told a parliamentary committee that suggestions the tech giant turns a blind eye to disturbing content were “categorically untrue”.

It came as Facebook apologised for allowing disturbing content to remain on its site.

Ms Sweeney and Siobhan Cummiskey, head of content policy for Europe, the Middle East and Africa, appeared before the Irish Communications Committee to answer questions about the company's policies for moderating violent and harmful content.

A documentary broadcast on Britain's Channel 4 Dispatches programme, Inside Facebook, used hidden-camera footage to show how content moderation practices are taught and applied within the company's operations in Dublin.

An undercover reporter worked at Cpl Resources in Dublin, Facebook’s largest centre for Ireland and UK content.

It emerged in the investigation that Facebook moderators were instructed not to remove extreme, abusive or graphic content from the platform even when it violated the company’s guidelines.

Facebook executives Niamh Sweeney (left) and Siobhan Cummiskey appeared before the Irish Communications Committee (Oireachtas TV/PA)

This included violent videos involving assaults on children, racially charged hate speech and images of self-harm among underage users.

Ms Sweeney told the committee she and her colleagues were “upset” by what was reported in the programme.

“If our services are not safe, people won’t share content with each other, and, over time, would stop using them,” she said.

“Nor do advertisers want their brands associated with disturbing or problematic content, and advertising is Facebook’s main source of revenue.”

For six weeks the undercover reporter attended training sessions and filmed conversations in the offices.

One video showed a man punching or stamping on a screaming toddler.

Moderators marked it as disturbing, allowed it to remain online and used it as an example of acceptable content.

Ms Sweeney admitted that disturbing content, including violent assaults and racially charged hate speech, which had been allowed to remain on the platform was a betrayal of Facebook's own standards.

She said the social media giant was not aware a video of a young toddler being assaulted by an adult was being used as an example of the type of content that was allowed to remain on its site.

“We understand that what I am saying to you has been undermined by the comments captured on camera by the Dispatches reporter,” Ms Sweeney said.

“We are in the process of an internal investigation to understand why some actions taken by Cpl were not reflective of our policies and the underlying values on which they are based.”

She told the committee that Facebook is investing heavily in new technology to help deal with disturbing content.

Ms Sweeney also said the guidance given by trainers to moderators was incorrect.

The committee heard that while the decision not to remove the video of the three-year-old was a mistake, there are a “narrow set of circumstances” in which Facebook would allow the video to be shared.

She explained this would happen if the child was still at risk and there was a chance the child and perpetrator could be identified to law enforcement.

However, the chair of the committee hearing, Hildegarde Naughton TD, said it was not acceptable that Facebook would be the "sole arbiter" of what can and cannot remain online.

She said: “To say they will leave harmful, abusive and illegal material online to find the perpetrator is not acceptable and not acceptable to the people who see it online.

“It’s up to the law enforcement agencies and up to Facebook to contact the relevant child protection agency, all of those procedures need to be set up.”

After the programme was aired, Facebook said it had made changes to its processes and policies.

These include flagging users suspected of being under the age of 13 and putting their accounts on hold, while its policy on removing child abuse videos is also under review.

Ms Sweeney went on to say Facebook is carrying out an internal investigation with Cpl to establish how the "gaps between our policies and values" and the training provided by Cpl came about.

The staff at the centre of the Dispatches show were also “encouraged” to take time off.

Other steps taken by the tech giant include retraining Cpl trainers, revising their training materials, introducing new quality control measures and holding an audit to identify any repeat failings by Cpl over the last six months.

Ms Sweeney added that while a number of issues were raised in the Channel 4 programme, the company's system was "working".

She continued: “I wouldn’t like to say that the system is broken entirely. We have had some amount of success in regulating areas.”

She said the company's most effective work has been in tackling child sexual exploitation.

"The vast majority of people on Facebook don't use it in the way we are listening to today," she added.

“They don’t encounter the type of people we are discussing.”