
30 Dec 2025 · Source: The Indian Express
2 min read
Polity & Governance · Social Issues · News

Government Directs Tech Giants to Curb Obscene Content on Online Platforms

Centre warns tech companies against hosting obscene content, emphasizing user safety and legal compliance.



The Ministry of Electronics and Information Technology (MeitY) has issued a strong advisory to social media and other online platforms, urging them to strictly comply with existing laws against hosting obscene, vulgar, or sexually explicit content. The government emphasized that platforms must ensure user safety and trust, especially for women and children, and remove such content within 24 hours of a complaint. This move underscores the government's commitment to regulating digital content and holding intermediaries accountable under the IT Act, 2000, and its subsequent rules, particularly concerning online safety and ethical content standards.

Key Facts

1. Advisory issued by MeitY
2. Target: Social media and other online platforms
3. Requirement: Remove obscene content within 24 hours of a complaint
4. Legal basis: IT Act, 2000 and its rules

UPSC Exam Perspectives

1. Constitutional Law: Article 19 (Freedom of Speech and Expression vs. Reasonable Restrictions), Article 21 (Right to Life and Personal Liberty, including privacy).
2. Statutory Law: Information Technology Act, 2000; IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021; Indian Penal Code (IPC) sections related to obscenity; Protection of Children from Sexual Offences (POCSO) Act, 2012.
3. Governance: Role of MeitY, intermediary liability, digital regulation, cyber safety, ethical content standards, grievance redressal mechanisms.
4. Social Justice: Protection of women and children online, combating online harassment and exploitation.

Visual Content

Evolution of Digital Content Regulation in India

This timeline illustrates the key legislative and policy milestones that have shaped digital content regulation and intermediary accountability in India, leading up to the current government advisory.

The regulatory landscape for digital content in India has evolved significantly, moving from initial recognition of e-commerce to comprehensive frameworks for cybersecurity, data protection, and intermediary accountability, driven by rapid technological advancements and increasing online harms.

  • 1970: Department of Electronics (DoE) established, marking India's initial focus on technology.
  • 2000: Information Technology Act, 2000 enacted, providing a legal framework for e-commerce and cybercrime. Introduced Section 79 on intermediary liability.
  • 2008: IT Act amended to strengthen cyber security provisions and address new forms of cybercrime.
  • 2015: Supreme Court's Shreya Singhal v. Union of India judgment struck down Section 66A of the IT Act and clarified Section 79 (intermediary liability).
  • 2016: Ministry of Electronics and Information Technology (MeitY) carved out, consolidating focus on digital governance.
  • 2021: Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 notified, significantly expanding intermediary obligations.
  • 2023: Digital Personal Data Protection Act, 2023 enacted, strengthening the data privacy and protection framework.
  • 2024: Discussions and drafts for the proposed Digital India Act (DIA) gain momentum, aiming to replace the IT Act, 2000.
  • 2025: MeitY issues strong advisory to tech giants to curb obscene content, emphasizing 24-hour removal and intermediary accountability.

Online Content Grievance Redressal & Removal Process (IT Rules, 2021)

This flowchart illustrates the mandated process for handling user complaints regarding unlawful content, particularly obscene material, on online platforms as per the IT Rules, 2021, and highlighted by the recent MeitY advisory.

  1. User encounters unlawful content (e.g., obscene or explicit material)
  2. User files a complaint with the intermediary's Grievance Officer
  3. Grievance Officer acknowledges the complaint within 24 hours
  4. Decision: Is the content sexually explicit material or child sexual abuse material?
  5. If yes: Intermediary removes the content within 24 hours of the complaint
  6. If no (other unlawful content): Grievance Officer resolves the complaint within 15 days
  7. Intermediary complies with due diligence requirements (IT Rules, 2021)
  8. Intermediary retains 'safe harbour' protection (Section 79, IT Act)
  9. Alternatively, the intermediary fails to comply with due diligence or removal timelines
  10. Intermediary loses 'safe harbour' protection and becomes liable for third-party content

More Information

Background

India has a long history of regulating content, from print media and cinema to broadcasting, and now digital platforms. The Information Technology Act, 2000, was a landmark legislation to govern cyber activities.

With the rapid proliferation of social media and user-generated content, challenges related to harmful content (hate speech, misinformation, obscenity) have grown significantly. This led to the formulation of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which aimed to enhance platform accountability and user safety.

Latest Developments

The Ministry of Electronics and Information Technology (MeitY) has issued a strong advisory to social media and other online platforms, reiterating the need for strict compliance with existing laws against hosting obscene, vulgar, or sexually explicit content. The advisory emphasizes the 24-hour content removal timeline upon receiving a complaint and underscores the government's commitment to holding intermediaries accountable for ensuring user safety and trust, particularly for women and children. This move highlights the ongoing efforts to regulate digital content and enforce ethical standards in the online space.

Multiple Choice Questions (MCQs)

1. Consider the following statements regarding the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021:
   1. They mandate social media intermediaries to remove content depicting nudity or sexual acts within 24 hours of receiving a complaint.
   2. The rules define 'significant social media intermediary' based on the number of registered users in India.
   3. Intermediaries are granted absolute immunity from liability for third-party content hosted on their platforms, provided they comply with due diligence.
   Which of the statements given above is/are correct?

View Answer

Correct Answer: B

Statement 1 is correct. The IT Rules, 2021, mandate intermediaries to remove content depicting nudity, sexual acts, or morphed images within 24 hours of receiving a complaint.
Statement 2 is correct. The rules define 'significant social media intermediary' based on a threshold of registered users in India, currently 50 lakh users.
Statement 3 is incorrect. Intermediaries are granted conditional immunity ('safe harbour') from liability for third-party content, not absolute immunity. This immunity is contingent upon their compliance with due diligence requirements and the rules prescribed by the government; failure to comply can lead to loss of safe harbour protection.

2. In the context of regulating online content in India, which of the following statements correctly reflects the legal position concerning freedom of speech and expression?

View Answer

Correct Answer: A

Option A is correct. The Supreme Court in Shreya Singhal v. Union of India (2015) struck down Section 66A of the IT Act, 2000, for being vague and overbroad, thus violating freedom of speech. However, it upheld the constitutional validity of Section 79, which provides for intermediary liability with 'safe harbour' provisions, provided intermediaries observe due diligence.
Option B is incorrect. Article 19(2) lists 'decency or morality' as a ground for reasonable restriction, under which obscenity is generally covered, but 'obscenity' itself is not explicitly listed as a separate ground.
Option C is incorrect. Government directives for content removal, especially for illegal content, are permissible under the reasonable restrictions outlined in Article 19(2) and relevant laws, and do not automatically amount to a violation of Article 19(1)(a) if they meet the test of reasonableness.
Option D is incorrect. The IT Act, 2000, and its subsequent rules (such as the IT Rules, 2021) provide conditional immunity ('safe harbour') to online platforms, not complete immunity. This immunity is contingent upon their compliance with due diligence and other prescribed rules.

3. With reference to the protection of children and women from online harmful content in India, consider the following provisions:
   1. The Protection of Children from Sexual Offences (POCSO) Act, 2012, specifically addresses child pornography and related online offences.
   2. Section 67 of the Information Technology Act, 2000, penalizes the publication or transmission of obscene material in electronic form.
   3. The National Commission for Protection of Child Rights (NCPCR) is a statutory body empowered to inquire into complaints regarding violation of child rights, including online safety.
   How many of the statements given above are correct?

View Answer

Correct Answer: C

Statement 1 is correct. The POCSO Act, 2012, is a comprehensive law that specifically deals with child sexual abuse, including offences related to child pornography (Sections 14 and 15) and its online dissemination.
Statement 2 is correct. Section 67 of the IT Act, 2000, provides for punishment for publishing or transmitting obscene material in electronic form. Subsequent sections (67A and 67B) deal with sexually explicit acts and child pornography, respectively.
Statement 3 is correct. The NCPCR is a statutory body established under the Commissions for Protection of Child Rights Act, 2005. It is mandated to protect, promote, and defend child rights, and its functions include inquiring into complaints regarding violation of child rights, which extends to online safety and protection from harmful content.
