Privacy experts and civil society are calling on the Indian government to seek more algorithmic accountability from Facebook in light of the recent revelations around how the social media giant’s algorithms are designed to fuel hate speech, misinformation and inflammatory posts, particularly anti-Muslim content in India.
They are also urging the government to urgently set up an independent commission that can deliberate on possible legislation around the structure of social media and its regulation.
Experts are of the view that the government needs to demand transparency and accountability over how the algorithms and content moderation policies of large social media companies are designed, and should frame strict rules on what these platforms can and cannot amplify. Since India is the largest market for companies like Facebook, greater corporate governance needs to be demanded over how they build products for India and what safeguards are in place, they said.
“Every time a Facebook scandal breaks, the conversation gets stuck over the need for more government regulation on social media versus the need for free speech over the Internet. Instead, we need to focus on what content is amplified by the platform, and demand greater algorithmic transparency and accountability — algorithms on the platform are optimised to maximise user engagement, not safety, and end up magnifying the worst of humankind,” said Urvashi Aneja, director, Digital Futures Lab.
Facebook, which owns instant messaging platform WhatsApp and photo sharing app Instagram, has been under fire after a whistle-blower made public a series of documents, now dubbed the ‘Facebook Files’, that accuse the technology giant of putting profit ahead of user safety, including that of children, and of fuelling fake news and hate speech through its platform. In the latest exposé, the whistle-blower, who has identified herself as former Facebook product manager Frances Haugen, revealed that in February 2019, Facebook set up a test account in India to determine how its own algorithms work. The test, which shocked even the company’s staffers, showed that within three weeks, the new user’s feed was flooded with fake news and provocative images, including those of beheadings, doctored images of Indian air strikes against Pakistan and bloodied scenes of violence.
Parminder Jeet Singh, executive director of think tank IT for Change, said he is not surprised by the revelations since Facebook is a private company and its model is to maximise revenue through more engagement. “We can’t blame them, they have to be made responsible by common templates, their content moderation policies have to be made public and submitted to a regulator for review. There has to be an oversight board which is composed of outsiders and not just company-appointed officials. It ultimately boils down to algorithmic transparency,” he said.
In a statement, a Facebook spokesperson said the exploratory effort of one hypothetical test account led to deeper analysis of its recommendation systems, and contributed to product changes to improve them. “Product changes from subsequent, more rigorous research included things like the removal of borderline content and civic and political groups from our recommendation systems. Separately, our work on curbing hate speech continues and we have further strengthened our hate classifiers, to include four Indian languages,” the spokesperson said.
Singh said the long-term solution to break the monopoly and power exercised by a few social media companies is to create “interoperable social media” — a common platform where feeds from all major social media platforms can flow in, just as an email service provider accepts emails from all email hosting services, big or small. “If we create a plural set of interoperable media, it will break their structure and make it competitive and will even resolve some of the issues such as those faced by traditional media companies over losing revenue to social media,” he said.
Singh added that the Indian government should set up a commission of three to five people, consisting of eminent retired judges, bureaucrats and civil society members, who can transparently debate the structure of future social media and its regulation.
ET has reported previously that the Indian government is mulling a “rethink” of the safe harbour framework enjoyed by social media platforms, and believes that the “blanket exemption” given to companies has to go in order to address these mounting issues.
To meaningfully address the problem, the focus needs to shift from treating the symptoms to tackling the causes by fixing the business model, Aneja of Digital Futures Lab said. “Two ways to address the issue could be to ban political advertising and personalised behavioural advertising on platforms like Facebook.”
“It is also a corporate governance issue, since Facebook has its largest user base in India and yet most of it is controlled from Silicon Valley. We need better disclosures, since the rot of Facebook is being exposed and there doesn’t seem to be any genuine commitment being shown by the company to solve the issue,” said Aneja.