Social media platforms and digital content publishers are the protagonists of an unexpected regulatory blockbuster that hit the screens in India on February 25. In keeping with the Narendra Modi administration’s penchant for surprise announcements, Union Ministers R S Prasad and Prakash Javadekar announced a regime for India’s internet intermediaries and digital media: The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules 2021. Some changes to our current legal landscape were inevitable, given the negative impact of social media and digital platforms on recent events, ranging from a celebrity’s suicide to a young person’s environmental activism. But the government’s rules in reaction to these and other events raise many questions.
A key issue that is raising eyebrows is the use of government powers to regulate intermediaries under the Information Technology Act 2000 (IT Act) to create rules for publishers of content. Under the new rules, publishers of online “news and current affairs content” and “online curated content” will be subject to a code of ethics, a redress and content-takedown mechanism and an oversight framework. This raises the big question of whether such publishers can be regulated in ways akin to “intermediaries”.
As background, Sections 79 and 87 of the IT Act, under which these new rules have been framed, relate to the exemption of intermediaries from liability — the so-called intermediary “safe harbour” — for unlawful content shared on their platforms by third parties. The logic of this safe harbour is that intermediaries have no control over content — they do not create, modify or curate the information on their platforms — but merely enable its sharing. For instance, a communication app (like WhatsApp or Signal) does not select or modify the messages you send to a friend, and likewise, video-sharing platforms (like YouTube or Vimeo) simply provide a space where people can post videos directly. Since these intermediaries do not create content, they are protected from liability when unlawful content is shared on their platforms, so long as they undertake due diligence to keep the platform safe (such as reviewing and taking down flagged content, where appropriate).
No doubt, serious concerns have been raised in recent months about the ability of intermediaries to selectively highlight or bury content. But it appears hard to justify regulating publishers (who create content like written publications, podcasts, videos or audio content) using the power to regulate intermediaries.
Online digital news sources and content producers have created new spaces in India for creativity and free expression. Recently, however, we have also seen the rise of outfits that generate “alternative facts” and realities that often polarise and vitiate public debate. The questions and tensions these developments raise are manifestations of timeless debates around the freedoms and boundaries that we choose as a country for free speech and censorship. While no one would disagree that some codes of ethics or rules are necessary to combat misinformation, fake news or propaganda online, the regulation of publishers of original content raises questions around policing speech and expression that cannot simply be nailed down, hammered away and brushed under the carpet of intermediary regulations. We need something longer-lasting and more considered to act as a reasonable restriction on creative speech and artistic expression.
Many of the solutions to these questions are potentially already available in the multitude of cases and discussions around Article 19 of our Constitution. For instance, at a minimum, there is agreement that such restrictions must be by way of proper legislation — and not merely executive rules. If we seek to create fair, just and resilient rules for a more civil society online, we will need a deeper engagement with these principles to flesh out the law — rather than bundle them into surprise announcements.
For existing intermediaries, Part II of the new rules creates a large set of fresh due diligence requirements and redress mechanisms that must be complied with to claim safe harbour. Many of these are welcome moves that speak the language of consumer protection, privacy and grievance redress. However, several commentators have already raised concerns about certain risky provisions that could cause harm in the name of accountability.
For instance, social media intermediaries providing messaging services appear to be required to break end-to-end encryption in order to identify the “first originator” of information when required in judicial or investigative proceedings. Messaging apps that promise private communications will likely need to redesign their systems entirely to comply with such a requirement, giving up crucial encryption technology and rendering all of our communications less secure. Other requirements for intermediaries to hand over information within 72 hours of receiving a written order from the government also create new avenues for data harvesting by the authorities.
Who will guard the guards? How will government oversight maintain accountability? Several alternative options to ensure such accountability could have been considered. For instance, the rules require significant social media intermediaries to publish compliance reports every month mentioning the details of complaints received and action taken. Could similar reports be released by monitoring authorities in government, regarding their aggregate requests for information from these intermediaries? Ultimately, the government needs to find different hammers, tools and railings to create a safe space for users. A wider toolkit is necessary for the government to build a framework that respects Indians who use these platforms and the collective online public and private spheres we are building together.
The writer is a lawyer working on inter-disciplinary research focussed on the impacts of digitisation on the lives of low-income individuals in India