Regulation of Online Content
Freedom of Speech and Expression vs. Content Regulation
Shreya Singhal v. Union of India (Struck down Section 66A)
The Indian Constitution guarantees freedom of speech and expression under Article 19(1)(a). However, this freedom is subject to reasonable restrictions under Article 19(2), such as public order, decency, and morality. In the digital context, this balance becomes crucial.
Section 66A of the IT Act, 2000 criminalized sending offensive messages through communication services. It was criticized as vague and overbroad, leading to misuse and arrests for mere online expression.
In the landmark judgment Shreya Singhal v. Union of India (2015), the Supreme Court struck down Section 66A as unconstitutional, holding that it violated Article 19(1)(a) and was not saved by the reasonable restrictions under Article 19(2).
Impact of the Judgment
- Strengthened digital free speech protections
- Prevented arbitrary arrests for online posts
- Set a precedent for evaluating online speech laws
However, online speech remains subject to other laws, especially when it concerns national security, defamation, or obscenity.
Obscene Publications and Pornography
Sections 67, 67A and 67B of the IT Act
The Information Technology Act, 2000 provides detailed provisions to curb the circulation of obscene content online.
- Section 67: Punishes publishing or transmitting obscene material in electronic form. First offence: up to 3 years imprisonment + ₹5 lakh fine; subsequent: up to 5 years + ₹10 lakh fine.
- Section 67A: Specifically deals with sexually explicit material. Stricter penalties: up to 5 years imprisonment + ₹10 lakh fine for the first offence; up to 7 years + ₹10 lakh fine for subsequent offences.
- Section 67B: Addresses child pornography, i.e. material depicting children in sexually explicit acts. Publishing, transmitting, browsing, or downloading such content is punishable with imprisonment up to 5 years + ₹10 lakh fine for the first offence, and up to 7 years + ₹10 lakh fine for subsequent offences.
Indecent Representation of Women (Prohibition) Act, 1986
This Act prohibits the indecent representation of women in any form (including digital media). It defines "indecent representation of women" as the depiction of the figure of a woman, her form or body, in a manner that is indecent or derogatory to women, or likely to deprave, corrupt, or injure public morality or morals.
It complements the IT Act by:
- Enabling action against portrayal of women in obscene or derogatory ways
- Holding advertisers, publishers, and platforms accountable
Defamation Online
Criminal and Civil Liability
Online defamation involves publishing defamatory statements on digital platforms like social media, blogs, or websites. It is covered under both:
- Sections 499–500 of IPC: Criminal defamation – Section 499 defines the offence and Section 500 prescribes punishment of simple imprisonment up to 2 years, or fine, or both.
- Tort Law: Civil suits for monetary compensation.
Ingredients of Defamation (Online or Offline)
- The statement must be false
- It must lower the reputation of the person in the eyes of others
- It must be published or communicated to a third person
Examples: Posting morphed photos, false allegations on Facebook, defamatory YouTube videos, fake news blogs.
Platform Liability
Under Section 79 of the IT Act, intermediaries like Twitter or Instagram enjoy safe harbour protection only if they observe due diligence and act expeditiously to remove or disable access to unlawful content upon receiving actual knowledge, which Shreya Singhal read to mean a court order or a government notification.
Hate Speech and Incitement to Violence
Legal Provisions to Regulate Hate Speech
Hate speech online includes posts or videos that promote enmity, hatred, or violence on grounds of religion, caste, gender, or region. It is a major concern in India, given the country's diverse and sensitive social fabric.
Relevant legal provisions:
- Section 153A of IPC: Promoting enmity between groups
- Section 295A of IPC: Deliberate insult to religion
- Section 505(1)(b) of IPC: Making statements causing fear or alarm
- Section 66F of IT Act: Cyber terrorism, i.e. acts intended to threaten the unity, integrity, security, or sovereignty of India, or to strike terror in the people, by means of computer resources
Challenges in Enforcement
- Difficulty in determining intent and context
- Use of anonymous accounts and VPNs
- Conflicts between free speech and regulation
- Viral nature of content before takedown
Conclusion
While regulating online content is essential for public order, security, and decency, it must be balanced with constitutional freedoms. Clear laws, judicial oversight, and platform responsibility are crucial to ensure fairness and effectiveness in content moderation.