Speaking on the occasion of National Press Day, the minister emphasised the need for greater accountability, highlighting the challenges posed by fake news and disinformation. Vaishnaw said, “Since the social media companies don’t take responsibility for what is posted on the platform, who will take responsibility?”
He cited how, even in developed countries, many instances of riots, terrorism, and interference occurred because social media companies shied away from their responsibilities. He added that in India’s diverse culture, we need to be “extra careful.”
In India, Section 79 of the IT Act exempts social media companies from liability for content posted by users.
He took aim at social media companies, slamming algorithms configured to prioritise content that evokes strong reactions, irrespective of factual accuracy. He called this approach “irresponsible” and “dangerous for society,” warning that misinformation combined with algorithmic bias can have serious consequences for a culturally diverse country like India.
Ashwini Vaishnaw also proposed that social media companies pay compensation to conventional media for content created by the latter. The minister argued that conventional media is losing out financially.
He reasoned that conventional news outlets invest heavily in newsgathering and in training journalists, yet news consumption is shifting to social media, eroding the return on those investments. This, he said, gives social media companies an “unequal edge” in bargaining power.
Let’s understand the safe harbour provisions
The safe harbour provisions refer to legal protections that shield certain entities from liability for actions or content produced by third parties. In the context of social media platforms, these provisions are designed to protect companies like Facebook, Twitter, or YouTube from being held responsible for content that users post on their platforms.
In India, the safe harbour provisions are outlined in Section 79 of the Information Technology (IT) Act, 2000. This section provides that intermediaries (such as social media platforms, internet service providers, and online marketplaces) will not be held liable for user-generated content unless they have knowledge of or control over the illegal content. The key condition is that the intermediary must act in a “neutral” capacity—i.e., they cannot be seen as the creator or publisher of the content.
Key points of safe harbour provisions:
Exemption from Liability: Social media platforms and other intermediaries are not liable for user-generated content as long as they act as neutral intermediaries.
Notice and Takedown: If a platform is notified of illegal content, it must take down or disable access to that content. This is typically the process for removing defamatory, illegal, or harmful content once the platform is made aware of it.
Due Diligence: Intermediaries must follow a due diligence framework set by the government to maintain the safe harbour protection.
The purpose of the safe harbour provisions is to promote the growth of the internet and online platforms without overburdening them with the responsibility of monitoring every piece of content uploaded by users. However, there is ongoing debate, especially in the context of disinformation and harmful content, about whether these provisions are too lenient and need to be revisited.