Why EC needs to come clean on its working arrangement with FB

FB appeared to be inadequately staffed to carry out effective moderation – PAFFREL/Hashtag Generation



 

In the run-up to last year’s Presidential Election, Facebook visibly increased its engagement in Sri Lanka. Officials from the social media platform were on the island, meeting government officials as well as those working outside of government.


Several officials who attended these meetings said that two traits stood out. There was no denying that Facebook wanted to be more active in Sri Lanka. Equally clear was the desire for control. Facebook representatives sought to control how much of their involvement would be detailed in public.


There was considerable concern about what could and could not be reported outside these meetings about what was discussed. So much so that when Facebook held awareness programmes for journalists, attendees were informed that the contents were off the record.

 



Several Sri Lankan organisations had by then begun monitoring and debunking hate speech and other problematic content online. As the elections approached, Facebook and the Election Commission had devised a modus operandi for some of them. The organisations would monitor content, compile reports and, using reputed election-monitoring organisations like PAFFREL as conduits, submit these reports to the EC. The EC would then submit them to Facebook. Up to this point, we know what is going on.

 
It is after the EC submits these reports to Facebook that we have no idea what happens, or how anything happens at all. There is no written agreement between Facebook and the EC. If there is, I am yet to speak to anyone who has seen it or heard its details. The working arrangement is verbal.


This in turn leads to the question: what are the terms of reference for this monitoring work and the eventual debunking mechanism? Facebook, as a norm, prefers to keep its operational data locked away. But the question mark here is over why the EC is also adopting a similar approach.


The EC has so far remained silent on its working arrangement with Facebook. And no one has questioned either of them. The Sri Lankan media has remained, by and large, accommodating towards Facebook.


An official from a Sri Lankan organisation that was monitoring problematic posts on Facebook during the last elections said that when Facebook officials met him and his colleagues, they did not pay much attention to national media.


They were, however, more concerned about details of hate speech and fakes on Facebook, and the organisation’s efforts to curb them, reaching two other entities: international election monitors, especially those from the European Union who were in Sri Lanka at the time, and the international media.


The official told me that his organisation too had requested details of take-downs of posts and received none.


The same operational arrangement is in place right now. Someone with access to it told me that reports on problematic posts are relayed to Facebook under the supervision of an assistant commissioner.


The official told me that the take-downs were happening rapidly. But we have no idea of the length and depth of this engagement, because neither Facebook nor the EC has deemed it important that the rest of Sri Lanka should know about it. In this context, a letter written jointly by the national election monitoring body PAFFREL and Hashtag Generation gives a bit more perspective.


“We do not believe that there was adequate staffing to ensure that such a review process was conducted in a timely manner. Evidence of this was that some reports took over 60 days to receive a response through Facebook’s trusted partner escalation channel,” the letter said.


Of the 828 posts reported to Facebook between October 7 and November 13 last year, only 90 or 10.96% were taken down.   


Even then, by the time some of the take-downs came into effect, the posts had remained online for days. In one instance, an email that Hashtag Generation sent to Facebook at 8.45 pm on October 22 received a reply 65 days later, stating that the post had been taken down at 10.14 pm on December 27.


By the time the reply came, the elections had been over for well over a month.


The author is a postgraduate researcher at CQ University, Melbourne, focusing on online journalism and trauma
Twitter - @amanthap


