According to a report by The Economic Times, the government may ask social media companies to provide detailed information on the steps they have taken to prevent obscene and child sexual abuse content on their platforms. The government had issued a notice to these platforms on 6 October. The report states that in the notice sent by MeitY, the ministry asked these companies to permanently block content related to these sensitive issues.
The government wants social media companies to implement technical measures on their platforms, including automated tools that can identify such content and block it permanently.
The report says the government has warned that, in case of non-compliance, these companies risk losing the safe harbour protection granted to them under the Information Technology Rules, 2021. The rules state that all social media intermediaries must not only deploy "advanced measures, including automated tools" to block "obscene, pedophilic" content, but also proactively identify any information that "depicts acts like rape, child sexual abuse in any form".
According to the report, YouTube and Telegram responded to the notice, saying that they have a "zero tolerance" policy for obscene and child sexual exploitation content on their platforms and that they have invested heavily in technology and teams to fight online child sexual exploitation.
The platforms say they removed more than 94,000 channels and more than 2.5 million videos for violating their child safety policies in the second quarter of 2023 alone.