31 Dec 2025 · Source: The Hindu · 3 min read

Polity & Governance · Science & Technology · News

Government Directs Social Media Platforms to Proactively Remove Obscene Content

Government mandates social media platforms to proactively remove obscene content, citing the IT Rules, 2021, and the Supreme Court's concerns.


The Ministry of Electronics and Information Technology (MeitY) has issued an advisory directing social media platforms, especially large ones with over 50 lakh users, to proactively detect and remove "obscene" and "pornographic" content. This directive emphasizes compliance with the IT Rules, 2021, which prohibit such objectionable content.

The move follows the Supreme Court urging the Centre to act on "obscenity" on the Internet, and comes after the government blocked nearly 25 home-grown OTT platforms specializing in erotic content. Non-compliance could invite action against platforms and intermediaries, reflecting the government's intensified effort to regulate online content and ensure a safer digital environment.

Key Facts

1. MeitY issued an advisory to social media platforms.
2. Platforms with over 50 lakh users must deploy technology to detect and remove such content.
3. The directive cites the IT Rules, 2021.
4. It follows the Supreme Court's urging on 'obscenity'.
5. Nearly 25 home-grown OTT platforms have been blocked for erotic content.

UPSC Exam Angles

1. Constitutional provisions related to freedom of speech and its restrictions.
2. Legal framework for IT and digital content regulation (IT Act, IT Rules 2021).
3. Role and responsibilities of social media intermediaries.
4. Government's power to block content and platforms.
5. Judicial intervention and its impact on policy-making.
6. Challenges of content moderation in the digital age.
7. Balancing fundamental rights with public interest and safety.

Visual Insights

Online Content Moderation & Compliance under IT Rules, 2021

Illustrates the process of content moderation by social media intermediaries as mandated by the IT Rules, 2021, and the recent government directive to proactively remove obscene content.

1. MeitY Advisory / Supreme Court Urging
2. Social Media Platforms (SSMIs > 50 Lakh Users)
3. Proactive Detection & Removal of Obscene/Pornographic Content
4. User Complaint / Government Notification / Court Order
5. Grievance Officer (Acknowledge in 24 hrs, Resolve in 15 days)
6. Remove/Disable Access to Unlawful Content (within 36 hours)
7. Compliance with IT Rules, 2021?
8. YES: Retain 'Safe Harbour' Protection (Sec 79, IT Act)
9. NO: Loss of 'Safe Harbour' & Action against Platform/Intermediary

Key Metrics of Online Content Regulation (IT Rules, 2021)

Highlights critical numerical thresholds and timelines from the IT Rules, 2021, relevant to the government's directive on content moderation.

Significant Social Media Intermediary (SSMI) User Threshold
50 Lakh+

Platforms with over 50 lakh (5 million) registered users face enhanced due diligence obligations, including appointing key compliance officers. This threshold is central to the current advisory.

Grievance Acknowledgment Timeline
24 Hours

Time limit for Grievance Officers to acknowledge receipt of a user complaint. Ensures prompt initial response to user concerns.

Grievance Resolution Timeline
15 Days

Maximum time for Grievance Officers to resolve a user complaint. Aims for speedy resolution of online issues.

Unlawful Content Removal Timeline (Court/Govt Order)
36 Hours

Time limit for intermediaries to remove or disable access to unlawful content (e.g., sexually explicit material) upon receiving a court order or government notification. This is directly relevant to the current advisory.

More Information

Background

The regulation of online content, particularly 'obscene' and 'pornographic' material, has been a long-standing challenge for governments worldwide. In India, the legal framework primarily stems from the Information Technology Act, 2000, and its subsequent amendments, notably the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

These rules aim to make social media platforms and other intermediaries more accountable for the content hosted on their platforms. The issue often involves a delicate balance between freedom of speech and expression (Article 19(1)(a)) and reasonable restrictions (Article 19(2)) related to public order, decency, morality, and incitement to an offence.

Latest Developments

The Ministry of Electronics and Information Technology (MeitY) has issued a fresh advisory directing social media platforms, especially those with over 50 lakh users, to proactively detect and remove 'obscene' and 'pornographic' content. The move reinforces compliance with the IT Rules, 2021, and follows the Supreme Court's call for government action on online obscenity.

The government has also previously blocked several OTT platforms specializing in erotic content, indicating an intensified regulatory approach. Non-compliance by platforms could lead to legal action, underscoring the government's commitment to fostering a safer digital environment.

Practice Questions (MCQs)

1. Consider the following statements regarding the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021:

   1. These rules mandate social media intermediaries to appoint a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer.
   2. Significant social media intermediaries are required to enable the identification of the 'first originator' of information for specific purposes.
   3. The rules provide for a three-tier grievance redressal mechanism for digital media publishers, with the Ministry of Information and Broadcasting as the final appellate body.

   Which of the statements given above is/are correct?

  • A. 1 and 2 only
  • B. 2 and 3 only
  • C. 1 and 3 only
  • D. 1, 2 and 3

Answer: D

Statement 1 is correct: Rule 4(1) of the IT Rules, 2021, mandates significant social media intermediaries to appoint a Chief Compliance Officer, a Nodal Contact Person, and a Resident Grievance Officer. Statement 2 is correct: Rule 4(2) requires significant social media intermediaries providing services primarily in the nature of messaging to enable the identification of the first originator of information for specific purposes like prevention, detection, investigation, prosecution or punishment of an offence related to the sovereignty and integrity of India, public order, etc. Statement 3 is correct: The rules establish a three-tier grievance redressal mechanism for digital media publishers, including self-regulation by publishers, self-regulatory bodies of publishers, and an oversight mechanism by the Ministry of Information and Broadcasting.

2. In the context of content regulation on the internet in India, which of the following statements is NOT correct?

  • A. The power to block public access to information from any computer resource is primarily derived from Section 69A of the Information Technology Act, 2000.
  • B. The Supreme Court of India, in the Shreya Singhal v. Union of India case, struck down Section 66A of the IT Act, 2000, for being vague and overbroad.
  • C. The recent MeitY advisory directs platforms to proactively remove content, which implies a shift from a 'notice and takedown' regime to a 'proactive monitoring' requirement for intermediaries.
  • D. The concept of 'safe harbour' for intermediaries under Indian law provides absolute immunity from liability for third-party content, irrespective of their knowledge or action.

Answer: D

Statement D is NOT correct. The 'safe harbour' protection for intermediaries under Section 79 of the IT Act, 2000, is not absolute: it is conditional upon the intermediary observing 'due diligence' and not conspiring in or abetting the commission of the unlawful act. The IT Rules, 2021, further elaborate on these due diligence requirements, including the removal of unlawful content upon notice, or proactive removal in certain cases. Intermediaries are therefore not immune if they have knowledge of unlawful content and fail to act. Statements A, B, and C are correct: Section 69A grants blocking powers; the Shreya Singhal case struck down Section 66A; and the MeitY advisory indeed pushes for proactive removal, moving beyond mere 'notice and takedown' for certain types of content.

3. Which of the following fundamental rights is most directly impacted by government directives requiring social media platforms to proactively remove 'obscene' and 'pornographic' content?

  • A. Right to Equality (Article 14)
  • B. Right to Freedom of Speech and Expression (Article 19(1)(a))
  • C. Right to Life and Personal Liberty (Article 21)
  • D. Right to Constitutional Remedies (Article 32)

Answer: B

The directive to remove 'obscene' and 'pornographic' content directly impacts the Right to Freedom of Speech and Expression (Article 19(1)(a)). While this right is not absolute and is subject to reasonable restrictions under Article 19(2) (including in the interests of decency or morality), any government action to regulate content on platforms necessarily involves a potential restriction on this freedom. The challenge lies in balancing this freedom with the need to maintain public order and morality. The other rights, while important, are not as directly or primarily impacted by content moderation directives.

4. Consider the following statements regarding the legal definition and regulation of 'obscenity' in India:

   1. The Indian Penal Code, 1860, specifically Section 292, criminalizes the sale, distribution, or public exhibition of obscene books, pamphlets, or figures.
   2. The Supreme Court of India, in the Ranjit Udeshi v. State of Maharashtra case, adopted the 'Hicklin Test' to determine obscenity, focusing on the tendency to deprave and corrupt those whose minds are open to such immoral influences.
   3. The Information Technology Act, 2000, defines 'pornographic material' and provides for stricter penalties for its publication or transmission in electronic form, especially involving children.

   Which of the statements given above is/are correct?

  • A. 1 and 2 only
  • B. 2 and 3 only
  • C. 1 and 3 only
  • D. 1, 2 and 3

Answer: D

Statement 1 is correct: Section 292 of the IPC deals with obscenity, criminalizing the sale, distribution, public exhibition, etc., of obscene material. Statement 2 is correct: The Supreme Court in Ranjit Udeshi v. State of Maharashtra (1965) indeed applied the 'Hicklin Test' (from a 19th-century English case) to determine obscenity, which considers whether the material has a tendency to deprave and corrupt those whose minds are open to such immoral influences. Statement 3 is correct: The IT Act, 2000, particularly Sections 67, 67A, and 67B, specifically addresses the publication and transmission of obscene and sexually explicit material in electronic form, with enhanced penalties for child pornography. The recent MeitY advisory specifically mentions 'obscene' and 'pornographic' content, making these legal provisions highly relevant.
