31 Dec 2025 · Source: The Hindu
Polity & Governance | Social Issues | News

Government to Mandate Social Media Platforms to Remove Obscene Content

New government rules will compel social media platforms to remove obscene content, impacting digital governance.



The government is set to issue new rules that will mandate social media platforms to take down obscene content. This move aims to enhance online safety and regulate digital spaces more effectively.

The rules will likely empower the government to direct platforms to remove content deemed objectionable, potentially impacting freedom of speech and the operational autonomy of social media companies. This development is crucial for understanding the evolving landscape of digital governance in India, balancing user rights with the need for a safe online environment.

Key Facts

1. The government will issue new rules for social media platforms.
2. The rules will mandate the removal of obscene content.
3. The move aims to enhance online safety.

UPSC Exam Perspectives

1. Constitutional Law: Article 19 (Freedom of Speech and Expression) and its reasonable restrictions (Article 19(2): decency, morality, public order).
2. Information Technology Act, 2000: Section 79 (intermediary liability) and Section 69A (power to issue directions for blocking public access).
3. IT Rules, 2021: Obligations on social media intermediaries and the grievance redressal mechanism.
4. Digital Governance: Balancing user rights, platform responsibilities, and government oversight.
5. Cyber Security and Online Safety: Protection of vulnerable groups, particularly women and children, from harmful content.
6. Judicial Precedents: Shreya Singhal case, Ranjit D. Udeshi case (on obscenity).

Visual Content

Evolution of Digital Content Regulation in India (2000-2025)

This timeline illustrates the key legislative and policy milestones in India's journey to regulate digital content and ensure online safety, leading up to the government's latest mandate on obscene content removal.

India's digital governance framework has evolved significantly from its inception in 2000, adapting to technological advancements and emerging online challenges. The journey reflects a continuous effort to balance innovation, user rights, and online safety, leading to the current push for stricter content moderation.

  • 2000: Information Technology Act, 2000 enacted; the primary law for cybercrime and e-commerce, including Section 67 on obscene material.
  • 2008: IT Act Amendment introduced Sections 67A (sexually explicit acts) and 67B (child sexual abuse material), strengthened cybercrime provisions, and refined intermediary liability (Section 79).
  • 2014: Supreme Court's Aveek Sarkar judgment shifted the obscenity test from the 'Hicklin test' to 'contemporary community standards', impacting content moderation.
  • 2021: Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 introduced; mandated due diligence for intermediaries, grievance redressal, and specific timelines for content removal (e.g., 24 hours for sexually explicit content).
  • 2023: Increased focus on deepfakes and misinformation; the government and judiciary expressed concerns over AI-generated harmful content, pushing for stricter platform accountability.
  • 2024: Ongoing legal challenges to the IT Rules, 2021; various provisions, including traceability and content moderation powers, face scrutiny in High Courts.
  • 2025: Government to mandate social media platforms to remove obscene content; new rules expected to empower the government to direct platforms, enhancing online safety but raising free speech concerns (current news).
  • 2025: Discussions on the Digital India Act (DIA), the proposed successor to the IT Act, aiming for a comprehensive digital regulatory framework, including updated intermediary liability and data protection.

Digital Governance: Balancing Online Safety & Freedom of Speech

This mind map illustrates the core tension and interconnected aspects involved in the government's move to mandate content removal, highlighting the stakeholders and their concerns.

Digital Content Regulation (2025 Mandate)

  • Online Safety (Govt. Priority)
  • Freedom of Speech (User/Platform Concern)
  • Intermediary Liability (Platforms)
  • Legal Framework (Government Tool)
More Information

Background

The regulation of online content in India has evolved significantly, primarily governed by the Information Technology (IT) Act, 2000. Landmark judgments like Shreya Singhal v. Union of India (2015) struck down Section 66A of the IT Act, emphasizing the importance of freedom of speech online.

However, the need for a safe online environment, especially concerning obscene and harmful content, has led to subsequent amendments and rules, notably the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules, 2021). These rules placed greater obligations on social media intermediaries regarding content moderation and grievance redressal.

Latest Developments

The government is poised to introduce new rules that will explicitly mandate social media platforms to remove obscene content. This initiative is framed as a measure to enhance online safety and regulate digital spaces more effectively.

The proposed rules are expected to empower the government to issue direct instructions to platforms for content removal, potentially intensifying the debate around freedom of speech, intermediary liability, and the operational autonomy of tech companies. This move reflects a global trend of governments seeking greater control over digital content.

Multiple Choice Questions (MCQs)

1. Consider the following statements regarding the regulation of online content in India:

1. Section 79 of the Information Technology Act, 2000, grants 'safe harbour' protection to intermediaries, making them immune from liability for third-party content under certain conditions.
2. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, mandate significant social media intermediaries to appoint a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer.
3. The power to block public access to information through any computer resource is exclusively vested with the Supreme Court of India.

Which of the statements given above is/are correct?


Correct Answer: B

Statement 1 is correct. Section 79 of the IT Act provides 'safe harbour' to intermediaries if they observe due diligence. Statement 2 is correct. The IT Rules, 2021, introduced these mandatory appointments for significant social media intermediaries. Statement 3 is incorrect. The power to block public access to information is vested with the Central Government under Section 69A of the IT Act, 2000, not exclusively with the Supreme Court.

2. In the context of freedom of speech and expression in India, which of the following statements is/are correct?

1. Article 19(1)(a) of the Constitution guarantees absolute freedom of speech and expression to all citizens.
2. 'Obscenity' is explicitly listed as a ground for imposing reasonable restrictions on freedom of speech and expression under Article 19(2).
3. The Supreme Court, in the Ranjit D. Udeshi case (1965), adopted the 'Hicklin Test' to determine obscenity in India, focusing on content that tends to deprave and corrupt those whose minds are open to such immoral influences.

Select the correct answer using the code given below:


Correct Answer: B

Statement 1 is incorrect. Freedom of speech and expression under Article 19(1)(a) is not absolute and is subject to reasonable restrictions under Article 19(2). Statement 2 is incorrect. Article 19(2) lists 'decency or morality' as a ground for restriction, which encompasses obscenity, but 'obscenity' itself is not explicitly listed as a separate ground. Statement 3 is correct. The Ranjit D. Udeshi case is a landmark judgment where the Supreme Court adopted the Hicklin Test for obscenity, though later judgments have nuanced this approach.

3. Which of the following is NOT a direct implication or concern associated with the government mandating social media platforms to remove 'obscene content'?


Correct Answer: C

Options A, B, and D are all valid implications or concerns. A) Over-regulation or vague definitions can lead platforms to err on the side of caution, removing content that might not be truly obscene, thus chilling legitimate expression. B) Moderating vast amounts of content, especially against subjective definitions, requires significant resources and poses technical challenges for platforms. D) The definition of 'obscene' can be subjective and culturally sensitive, leading to arbitrary decisions by platforms or government agencies. Option C is NOT a direct implication. While content removal relates to online safety, it does not inherently enhance user data privacy. Data privacy is a separate aspect of digital governance, addressed by laws such as the Digital Personal Data Protection Act, 2023, and is not directly improved by content removal mandates.
