Content Moderation Through Co-Regulation : Daily Current Affairs

Date: 09/11/2022

Relevance: GS-3: Internal security: challenges to internal security through communication networks, role of media and social networking sites in internal security challenges.

Key Phrases: Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, Grievance Appellate Committees (GACs), Section 69A of the IT Act, 2000, Challenges of Social media platforms, Advantages of social media, European Union (EU) Digital Services Act (DSA)

Context:

  • There has been considerable debate over the regulation of social media and other digital platforms, and the issue remains live as regulators, social media companies and users clash over content decisions almost daily.

The current state of the online ecosystem

  • Social media platforms routinely moderate user content on their services and suspend users who violate their terms and conditions.
  • When a user’s post is taken down or their account is suspended, they can challenge such a decision.
  • Similarly, when users see harmful/ illegal content online, they can flag the issue with the platform.
  • Some platforms have complaint redressal mechanisms for addressing user grievances. For instance, Facebook set up the Oversight Board, an independent body, which scrutinizes its ‘content moderation’ practices.

What are the regulating laws?

  • Earlier it was voluntary for social media platforms to establish a grievance redressal mechanism through their terms of service.
  • The government of India introduced the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
    • These mandate platforms to establish a grievance redressal mechanism to resolve user complaints within fixed timelines.
  • Recently, the government amended these Rules and established Grievance Appellate Committees (GACs).
    • Comprising government appointees, GACs will now hear appeals against the platforms’ grievance redressal decisions.
    • This signifies the government’s tightening control of online speech, much like Section 69A of the IT Act.

Section 69A of the Information Technology Act, 2000

  • Objective
    • It empowers the Central Government to direct that public access to any information generated, transmitted, received, stored or hosted in any computer resource be blocked. (The related power to intercept, monitor or decrypt information rests with the Central and State governments under Section 69.)
  • The grounds of invoking powers
    • In the interest of the sovereignty and integrity of India, the defence of India, or the security of the State.
    • Friendly relations with foreign states.
    • Public order, or for preventing incitement to the commission of any cognizable offence relating to these.
  • Process of Blocking Internet Websites
    • On the grounds stated above, Section 69A enables the Centre to direct any agency of the government, or any intermediary, to block public access to any information generated, transmitted, received, stored or hosted on any computer resource.
    • The term ‘intermediaries’ includes providers of telecom service, network service, Internet service and web hosting, besides search engines, online payment and auction sites, online marketplaces and cyber cafes.
    • Any such request for blocking access must be based on reasons given in writing.

Existing government control on online speech is unsustainable

  • In today’s online environment, social media platforms host millions of users, which makes direct government control difficult, and for other reasons as well.
    • Platforms have democratised public participation, and shape public discourse.
    • Also, large platforms have a substantial bearing on core democratic freedoms.
  • With the increasing reach of the Internet, its potential harms have also increased.
    • There is more illegal and harmful content online today.
    • Disinformation campaigns on social media during COVID-19 and hate speech against the Rohingya in Myanmar are recent examples.

Ingredients of a modern intermediary law

  • A modern intermediary law should address the following:
    • It should accommodate necessary and proportionate government orders to remove content and must also comply with due process.
    • The law must devolve crucial content moderation decisions to the platform level.
    • Platforms must have the responsibility to regulate content under broad government guidelines.
  • The recent European Union (EU) Digital Services Act (DSA) can be taken as a reference.
    • The DSA regulates intermediary liability in the EU and requires government take-down orders to be proportionate and reasoned.
    • The DSA also gives intermediaries an opportunity to challenge the government’s decision to block content and defend themselves.
    • These processes help secure the free speech of online users.

How will the intermediary law function?

  • Instituting a co-regulatory framework based on modern law will serve three functions.
  • Platforms will retain reasonable autonomy over their terms of service:
    • Co-regulation will give the platforms flexibility to define the evolving standards of harmful content, thereby obviating the need for strict government mandates.
    • This will promote free speech online, since heavy-handed government oversight incentivises platforms to engage in private censorship.
    • Private censorship creates a chilling effect on user speech and also scuttles online innovation, which is the backbone of the digital economy.
  • Co-regulation aligns government and platform interests:
    • Online platforms themselves seek to promote platform speech and security so that their users have a free and safe experience.
    • It will encourage platforms to maintain social harmony and help in building healthy online environments.
    • For instance, during the pandemic, platforms took varied measures to tackle disinformation.
  • State can outsource content regulation to platforms:
    • Instituting co-regulatory mechanisms allows the state to outsource content regulation to platforms, which are better equipped to tackle modern content moderation challenges.

Way forward

  • The modalities of a co-regulatory model for content moderation must be worked out so that it not only preserves platform autonomy but also holds platforms accountable for their content moderation decisions.
  • Whenever platforms remove content, or redress user grievance, their decisions must follow due process and be proportionate because the platforms as content moderators have substantial control over the free speech rights of users.
    • They must adopt processes such as notice, hearing and reasoned orders while addressing user grievances.
  • Algorithmic transparency should be emphasised, because platforms use algorithmic tools to de-prioritise content and reduce its visibility.
    • Users are unaware of and unable to challenge such actions as they take place through platform algorithms that are often confidential.

Conclusion

  • The 2021 Rules are a step in the right direction and could pave the way for a more comprehensive intermediary law.
  • The GACs must be done away with because they concentrate censorship powers in the hands of the government.
  • This is an opportune moment for the government to adopt a co-regulatory model of online speech regulation; a “Digital India Act” for digital platforms, as the successor to the IT Act, 2000, is the need of the hour.

Source: The Hindu

Mains Question:

Q. There is a need for a comprehensive, modern digital intermediary law in place of the IT Act, 2000 to meet the challenges of today’s volatile digital space. Discuss. (150 words)