
News: Supreme Court Puts a Stay on Implementation of IT Rules, 2023 

Sarthak Mishra,

Dharmashastra National Law University, Jabalpur

The Supreme Court of India's recent intervention to stay the implementation of the amended IT Rules[1], particularly the provision creating a Fact Check Unit (FCU), has generated significant discussion and scrutiny. The stay comes amid concerns about arbitrary enforcement, the burden placed on intermediaries, and the possibility that government agencies could misuse the mechanism to suppress dissent and criticism.

Arbitrary Enforcement Concerns: Among the main issues with the amended IT Rules is the creation of the Fact Check Unit (FCU) and its role in determining which material concerning the Central Government is false.[2] Critics worry that such determinations could be arbitrary, leading to the selective targeting of particular viewpoints or individuals.

Further concerns relate to the vagueness and excessive breadth of the definition of "fake news," which may infringe constitutional rights, including the freedoms guaranteed by Articles 14, 19(1)(a) and (g), and 21.[3] The landmark ruling in Shreya Singhal v. Union of India (2015) held that any law restricting speech must be precisely defined. The amendments may fall short of this requirement because their broad definition of "fake news" extends to reports about government actions.

What is the Impact on Intermediaries?

The revised rules place heavy obligations on online intermediaries, such as social networking sites and internet service providers, to identify and remove content that the FCU has marked as false. Complying with this duty may prove burdensome, pushing intermediaries towards excessive caution and over-censorship in order to preserve their legal immunity. Section 79(1) of the Information Technology Act exempts intermediaries from liability for information provided by third parties.[4] This exemption, however, may be withdrawn if an intermediary participates in an unlawful act or fails to expeditiously remove or disable access to unlawful content upon notification by the authorities. Intermediaries are therefore forced to strike a balance between preserving their legal immunity and ensuring compliance with the revised rules.

The Concerns Regarding Misuse and Impact on Democratic Discourse:

There are serious concerns that government agencies may use the new IT rules to suppress dissent and criticism, especially criticism of government policy. The absence of robust safeguards against such misuse calls into question the impact of these rules on democratic discourse and transparency in governance.

What is a Fact Check Unit (FCU)?

The Fact Check Unit (FCU) was constituted as a statutory body under the Press Information Bureau (PIB) by the 2023 amendment. The FCU's role is to identify and flag content on social media platforms that it considers to contain false information about the Central Government or any of its agencies. The amended IT Rules require online intermediaries, such as internet service providers and social media platforms, to stop the spread of material that the FCU has classified as false. If they fail to comply, their legal immunity, which currently shields them from liability for third-party content, may be withdrawn.

Exemptions and Liability for Intermediaries:

Section 2(1)(w) of the Information Technology Act, 2000 defines an intermediary to include entities such as network service providers, search engines, and online marketplaces. Under Section 79(1) of the IT Act, intermediaries are exempt from liability for third-party information so long as their role is limited to providing access to a communication system, without initiating the transmission, selecting its receivers, or modifying its content. Intermediaries may, however, lose this protection if they participate in an unlawful act or fail to expeditiously remove or disable access to unlawful content upon notification by the government.

In conclusion, the Supreme Court's decision to stay the implementation of the revised IT rules underscores the need to examine carefully the concerns about potential abuse, the impact on intermediaries, and arbitrary enforcement. To build effective regulatory frameworks for the digital era, it is crucial to strike a balance between combating misinformation and protecting democratic values and constitutional rights.

What are the Major Concerns Related to the Amended IT Rules, 2023?[5]

The concerns regarding the amended Information Technology (IT) Rules, 2023 revolve around several key points:

1. Potential Arbitrary Enforcement: Critics fear that the FCU's (Fact-Checking Unit's) determination of false information could be arbitrary, leading to the selective targeting of viewpoints or individuals. This concern, arising particularly from the amendment to Rule 3(1)(b)(v), raises constitutional issues under Articles 14, 19(1)(a) and (g), and 21.

2. Legal Precedent: The Supreme Court's ruling in Shreya Singhal vs Union of India (2015) emphasized that laws restricting speech must not be vague or overly broad. However, the expansion of the definition of "fake news" in Rule 3(1)(b)(v) potentially contradicts this precedent, risking arbitrary enforcement.

3. Impact on Intermediaries: Online intermediaries face increased responsibilities to monitor and remove flagged content. This may burden intermediaries and lead to over-censorship to avoid legal consequences.

4. Potential for Misuse: There's apprehension that these rules could be exploited by the government to stifle dissent or criticism, especially against government policies or officials. The lack of robust safeguards against such misuse raises concerns about the rules' impact on democratic discourse and transparency.

Way Forward

To address the concerns raised, several key steps need to be taken:

1. Ensuring Transparency and Accountability: The government must establish transparent operations for the FCU, including clearly defined criteria and procedures for identifying false information. Oversight and accountability mechanisms are crucial to prevent misuse or arbitrary enforcement.

2. Clear Guidelines and Due Process: Clear guidelines and due process mechanisms should be developed for intermediaries dealing with FCU-flagged content. This includes establishing avenues for content creators to appeal decisions and ensuring removals are based on objective criteria and evidence.

3. Legal Safeguards: Regulatory measures must adhere to constitutional principles and international human rights standards, particularly concerning freedom of speech and expression. Legal safeguards should prevent overreach and protect individuals' rights to express diverse opinions.

References

[1] Kunal Kamra v. Union of India

[3] Articles 14, 19(1)(a) and 19(1)(g) of the Constitution of India

[5] Challenge that the IT Rules, 2023 violate the petitioner's right to carry on his profession as a political satirist under Article 19(1)(g)
