NHRC issues advisory regarding Child Sexual Abuse Material on internet
The National Human Rights Commission (NHRC) has issued a new advisory to protect the rights of children by combating the production, distribution, and consumption of Child Sexual Abuse Material (CSAM) on the internet.
The tremendous increase in the production, distribution and consumption of CSAM must be urgently addressed, as it may have a lasting psychological impact on the child and further disrupt his/her overall development, the human rights body said.
According to the advisory, there has been a ‘colossal’ rise in CSAM on the internet across the globe, with more than 1,500 instances of publishing, storing, and transmitting CSAM being reported in 2021.
According to the NHRC, 450,207 cases of CSAM have been reported in 2023 so far, up from 204,056 cases in 2022 and 163,633 cases in 2021.
The four-part advisory, released by the NHRC on Friday, aims to address gaps in laws concerning CSAM and offers suggestions on training officials and supporting survivors of sexual abuse, creating specialised law enforcement units to investigate CSAM crimes, and regulating internet platforms to monitor and block CSAM content online.
Legal changes, addressing gaps
NHRC recommended terminology changes and suggested that the term ‘child pornography’ in the Protection of Children from Sexual Offences (POCSO) Act, 2012 should be changed to ‘Child Sexual Abuse Material (CSAM)’.
“Terms like ‘use of children in pornographic performances and materials’, ‘child sexual abuse material’ and ‘child sexual exploitation material’ to be preferred over ‘Child Pornography’”, the advisory read.
It also urged the government to redefine the term ‘sexually explicit’ in the IT Act, 2000 to ensure timely identification and removal of online CSAM.
The Commission also urged the harmonisation of laws across jurisdictions in India for arrests and asked the government to enhance punishments by making appropriate legislative changes, considering the gravity of the offences.
Detection and investigation of CSAM
The advisory recommends that States and Union Territories set up a Specialised State Police Unit in each state, dedicated solely to the detection and investigation of CSAM-related cases and the apprehension of offenders, and that a Specialised Central Police Unit be set up at the central level to deal with CSAM-related matters.
“It (Specialized Central Police Unit) should consist of experts in identification and investigation of CSAM in order to focus on identifying and apprehending CSAM offenders both in the dark web and open web and developing a comprehensive and coordinated response of investigation and law enforcement agencies towards monitoring, detection, and investigation of CSAM,” the advisory read.
The human rights body has also asked the government to create and maintain a national database of CSAM to collect data on trends, prevalence, patterns, names, and other socio-economic parameters in order to better inform interventions. It also suggested that CSAM offenders convicted under the IT Act, 2000, and the POCSO Act, 2012 be included in the National Database of Sex Offenders in India.
“The proposed Specialized Central Police Unit must ensure collection of disaggregated data pertaining to prevalence, trends, and patterns of CSAM, involving gender, age, caste, ethnicity, or other socio-economic parameters to better understand the issue and inform policy-based interventions,” it said.
The NHRC also recommended that the government use technologies such as hotspot mapping and predictive policing to identify repeat offenders, and incentivise the development of technological tools to detect CSAM through hackathons and grants.
Sensitisation, awareness, and victim support
The advisory further recommended training courses and sensitisation of prosecutors, judges, police officials, and all those directly involved in the handling of CSAM cases.
“Police officials dealing with cases pertaining to CSAM to be imparted sensitisation training on rights of children in the digital environment, their specific vulnerabilities on the Internet, the extent and emerging manifestations of CSAM and the use of child-friendly procedures in investigation,” it said.
The advisory also encouraged awareness drives and sensitisation of parents and children in schools, colleges, and other institutions so that they can recognise the signs of online child abuse.
Psycho-social care centres for survivors of CSAM have also been recommended.
Regulations regarding social media, OTT platforms
The NHRC has recommended that all internet intermediaries develop a CSAM-specific policy outlining an in-house reporting mechanism and how they will use technology to detect and remove CSAM from their platforms.
“Intermediaries, including Social Media Platforms, Over-The-Top (OTT) applications and Cloud Service Providers, must deploy technology, including content moderation algorithms, to proactively detect CSAM on their platforms and remove the same,” it said.
Similarly, the platforms were advised to reduce the time taken to remove CSAM content from their platforms and to explore partnerships among themselves and with the government to ensure real-time sharing of information concerning CSAM content on the internet.
Further, the NHRC has asked all concerned authorities of the Union/State Government(s)/UT Administration(s) to implement the recommendations contained in the advisory and to send an Action Taken Report (ATR) within two months for the Commission's information.