‘Remove child sexual abuse material’: India warns X, YouTube, Telegram


The ministry of electronics and information technology (MeitY) on Friday issued a notice to social media platforms, including Telegram, X (formerly Twitter), and YouTube, to remove child sexual abuse material (CSAM) from their platforms in India.

The notice also calls for proactive measures to take down such material and emphasises the importance of its prompt and permanent removal.

Minister of state for electronics and IT Rajeev Chandrasekhar has been a vocal advocate of removing such harmful content from the Indian internet, an approach that has shaped the ministry’s policy stance.

MeitY also said that “delay in complying with the notices will result in the withdrawal of their safe harbour protection under Section 79 of the IT Act”.

“There will be ZERO tolerance for criminal & harmful content on Indian Internet. ITRules under the ITAct clearly lays down the expectation from Intermediaries: They cannot host criminal & harmful content like CSAM,” Chandrasekhar posted on X.

It was not immediately clear what prompted the government to issue a warning to the three platforms.

Non-compliance with these requirements will be deemed a breach of Rule 3(1)(b) and Rule 4(4) of the IT Rules, 2021, the notices state.

The Information Technology (IT) Act, 2000, provides the legal framework for addressing pornographic content, including CSAM.

Sections 66E, 67, 67A, and 67B of the IT Act impose stringent penalties and fines for the online transmission of obscene or pornographic content.

However, safe harbour protection does not automatically disappear. The loss of safe harbour is determined by the courts, not by the executive (such as the IT ministry). And for a court to assess whether safe harbour has been lost, someone first has to file a case.

Rule 3(1)(b) requires all intermediaries, irrespective of their size, to make “reasonable efforts” to ensure that their platform does not contain content that is “obscene, pornographic, paedophilic” or “harmful to child” amongst other things.

Rule 4(4) requires significant social media intermediaries, that is, social media platforms with more than 50 lakh users in India, to “endeavour to deploy” technology-based solutions to proactively identify and take down CSAM.

When these rules were notified in 2021, multiple legal experts pointed out that phrases such as “reasonable efforts” and “shall endeavour to” create ambiguity: it is unclear who determines whether a platform has made reasonable efforts, or has endeavoured, to take down CSAM.

X uses PhotoDNA to detect and take down CSAM. PhotoDNA, developed by Microsoft, computes a perceptual hash (a digital fingerprint that survives resizing and minor edits) of each image and matches it against a database of hashes of known CSAM.
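
PhotoDNA itself is proprietary, but the general technique, computing a compact perceptual fingerprint of an image and comparing it against a database of known-bad fingerprints, can be sketched briefly. The Python snippet below uses a simple average hash as a stand-in; the function names, the 8x8 thumbnail size, and the Hamming-distance threshold are illustrative assumptions, not PhotoDNA’s actual algorithm or API.

    # Minimal sketch of perceptual-hash matching (illustrative only;
    # PhotoDNA's real algorithm and hash database are proprietary).
    from PIL import Image

    def average_hash(path, size=8):
        # Downscale to a grayscale thumbnail and threshold each pixel
        # at the mean, producing a 64-bit fingerprint for an 8x8 grid.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def matches_known(candidate, known_hashes, max_distance=5):
        # Flag a match if the fingerprint is within a small Hamming
        # distance of any hash in the known-bad database.
        return any(bin(candidate ^ h).count("1") <= max_distance
                   for h in known_hashes)

Unlike a cryptographic hash, such a fingerprint changes only slightly when an image is resized or re-encoded, which is what lets platforms catch near-duplicates of previously identified material.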

However, a New York Times report published in February said that CSAM proliferated on the platform after Elon Musk took over Twitter (now X) and fired teams experienced in dealing with the problem. In July, Musk reinstated the far-right account “Dom Lucre”, which had earlier been suspended for posting CSAM.

YouTube uses its in-house technology, CSAI Match, which applies a similar fingerprint-matching approach to video. HT reached out to Google and Telegram for comment.

“We have a zero-tolerance policy on child sexual abuse material. No form of content that endangers minors is acceptable to us. We have heavily invested in the technology and teams to fight child sexual abuse and exploitation online and take swift action to remove it as quickly as possible. In Q2 2023, we removed over 94,000 channels and over 2.5 million videos for violations of our child safety policies. We will continue to work with experts inside and outside of YouTube to provide minors and families with the best protections possible,” a YouTube spokesperson said in response to the ministry’s notice.

Telegram spokesperson Remi Vaughn responded saying, “Child abuse materials are explicitly forbidden by Telegram’s terms of service. Telegram’s moderators actively patrol public parts of the platform and accept user reports in order to remove content that breaches our terms. In the case of child abuse content, we publish daily reports about our efforts here: t.me/stopca. More than 48,000 groups and channels were removed over the month of September.”

According to the channel Vaughn cited, 10,312 groups and channels were banned in the first six days of October. This is notable because, despite the requirement under the IT Rules, 2021, Telegram does not publish a monthly transparency report; the channel is one of the few ways in which the platform offers any transparency.
