Facebook content moderators in SL paid US$ 4 per hour



A year ago, the headline of the Guardian story I co-wrote read “Sri Lanka blocks social media as deadly violence spreads.” Since the block, much has been written on the pros and cons of the move. There has also been far more attention from both governments and Facebook on quelling the use of the platform to spread inflammatory content. There have been at least two high-profile visits by Facebook company representatives to the island since the block, and Sri Lankan civil society members have regularly taken part in Facebook's regional meetings on content moderation.

A country representative was also appointed, as were content moderators for Sinhala content. One of the key criticisms this time last year was Facebook's inability to effectively moderate content in Sinhala.

I spoke with several Sri Lankans who did part-time work for Facebook as moderators. They were doing work that had been outsourced to a company based in Australia, which started sometime in the last quarter of 2018. The moderators were paid US$ 4 per hour, and some of them were informed early this year that their services were no longer needed and that they would be contacted when there was more work.



The moderators were recruited after receiving web-based training on Facebook's guidelines. Successful candidates then sat a test, also web-based, on which they had to score above 75%. They spent around 1.5 minutes on each post they moderated, they told me.

Interestingly, two moderators whom I spoke to said that most of the content they received for moderation was links from gossip sites posted on Facebook. The guidelines said they need not open the link itself but only check whether the preview, including the picture, was within guidelines. The moderation effort is still in its infancy, and it remains unclear whether Facebook has invested resources sufficient to match the scale of the posts.

The Indian elections are going to be a bellwether for Facebook's efforts, in the region and elsewhere, to rein in hate speech. Facebook has ramped up its effort by setting up a task force to stem the use of the platform for election-related abuse in India. Last week it started a series of training sessions in New Delhi for journalists on the effective and safe use of Facebook and linked platforms.


By the first week of March, the BJP and its supporters had spent over Indian Rs. 40 million on Facebook marketing, placing over 16,000 ads, Indian media reported last week. Changes effected in the last year, however, mean that users now get more details on why an advertisement appears on their page and who has placed it.

Facebook has also been serious about setting up an independent content oversight body that can override in-house decisions. Vanity Fair, in an exhaustive piece that came out earlier this month, called it Facebook's version of a Supreme Court. The platform has been holding semi-public discussions on this, one of which took place in New Delhi recently. Other meetings are either planned or have been held in New York, Mexico, Nairobi and Singapore.

The content oversight body is to have 40 members serving three-year terms and would be funded by Facebook. “As we build out the board we want to make sure it is able to render independent judgement, is transparent and respects privacy,” Nick Clegg, Facebook's Head of Global Affairs, said in announcing the meetings.

Facebook has been its usual secretive self in discussing this plan, a major sticking point of which is the body's independence.

We are, in essence, talking about a moderation body that is to serve 2.3 billion users, more than the population of any country, speaking hundreds of different languages about a million different things, spread across the globe.

The author is the Asia-Pacific Coordinator for the DART Centre for Journalism and Trauma, a project of the Columbia Journalism School. Twitter: @amanthap


