- Facebook is reportedly bringing in firms like HCL Technologies, Wipro and Tech Mahindra to help moderate content on the social networking platform.
- The move will help increase the value of Facebook contracts with local firms to over $400 million in the Indian market.
- After the leak of Facebook’s ‘secret rulebook’, it’s become all the more important that moderators are able to understand the context of a comment or post before removing it.
Last week Facebook’s ‘secret rulebook’ was leaked, and the company faced backlash for not understanding local sensitivities and laws around issues that couldn’t be segregated into simple yes-and-no categories. To correct that bias, the social networking site is bringing in local firms such as HCL Technologies, Wipro and Tech Mahindra to help it moderate local content.
According to the Economic Times, Facebook has been in talks with Genpact and Accenture as well. The combined contracts of local companies with Facebook are valued at $400 million, a figure that will only increase as these new arrangements come into play.
Indian laws, moderation and bias
The pressure on Facebook to walk the line has grown along with its influence over global political speech — especially in light of the Cambridge Analytica scandal that came to light last year.
The ‘secret rulebook’ heightened the concerns of Indian users with guidelines that instructed moderators to pull down posts on the site that were critical of religion or, in some cases, used the term ‘free Kashmir’.
Misinformation is definitely a beast that Facebook has to deal with, but pushing out mistaken bias — even unintentionally — can have repercussions that fuel the same problem. It also makes Facebook an arbiter of global speech, with the power to dilute freedom of expression.
According to the New York Times report, Facebook’s ‘secret rulebook’, which addresses many countries around the world, includes diagrams explaining that certain content is removed only if it could pose a legal challenge or be blocked by particular governments.
Having local firms at the helm of understanding what is actually being shared on Facebook will be a welcome change to address these issues, and will empower the platform to moderate, as impartially as it can, posts that may actually prove to be harmful.
Hopefully, the moderators won’t have to rely on Google Translate to understand the context of what is written, and can hence make an informed decision about what to block and what not to block.