NHRC Issues Advisory Against Online Child Sexual Abuse Material


The National Human Rights Commission (NHRC) has released a new advisory to safeguard children’s rights by preventing the creation, dissemination, and use of Child Sexual Abuse Material (CSAM) online.

According to the Commission, there is an urgent need to address the massive increase in the creation, distribution, and use of CSAM, which can have long-lasting psychological effects on children and further damage their overall development.

The advisory notes a “colossal” increase in CSAM on the internet worldwide, with over 1,500 reports of CSAM publication, storage, and transmission made in 2021.

According to the NHRC, approximately 450,207 cases of CSAM have been documented in 2023 so far, up from 163,633 cases reported in 2021 and 204,056 in 2022.

The four-part advisory, released by the NHRC on Friday, aims to address legal gaps in laws concerning CSAM. It also recommends regulating internet websites to monitor and block CSAM content online, forming specialized law enforcement units to investigate CSAM crimes, training officials, and providing support to survivors of sexual abuse.

Changing The Law And Filling In The Gaps

To improve the terminology, the NHRC proposed replacing the term “child pornography” in the Protection of Children from Sexual Offences (POCSO) Act, 2012 with “Child Sexual Abuse Material (CSAM)”.

The advisory stated that phrases such as “child sexual abuse material,” “child sexual exploitation material,” and “use of children in pornographic performances and materials” should be used instead of “child pornography.”

Additionally, it asked that the definition of “sexually explicit” under the IT Act, 2000 be revised to ensure the prompt detection and removal of online CSAM.

The Commission also urged that rules pertaining to arrests be harmonized across all Indian jurisdictions, and that the government strengthen penalties by amending the relevant legislation in light of the seriousness of the offences.

Identification And Analysis Of CSAM

The advisory recommends establishing a Specialized Central Police Unit within the central government to handle CSAM-related issues, as well as a Specialized State Police Unit in each State and Union Territory dedicated to detecting and investigating CSAM-related cases and apprehending offenders.

“The Specialized Central Police Unit should be composed of professionals who specialize in identifying and investigating CSAM, in order to concentrate on finding and capturing CSAM offenders on both the open and dark web, as well as creating a thorough and well-coordinated response from law enforcement and investigation agencies to the monitoring, detection, and investigation of CSAM,” the advisory stated.

To better inform interventions, the Commission has also asked the government to create and maintain a national database of CSAM, gathering information on names, patterns, trends, and other socio-economic indicators. It further suggested that CSAM offenders convicted under the POCSO Act, 2012 and the IT Act, 2000 be added to India’s National Database of Sex Offenders.

It stated that in order to better understand the problem and guide policy-based interventions, “the proposed Specialized Central Police Unit must ensure collection of disaggregated data pertaining to prevalence, trends, and patterns of CSAM, involving gender, age, caste, ethnicity, or other socio-economic parameters.”

The NHRC advised the government to use technologies such as hotspot mapping and predictive policing to identify repeat offenders, and to encourage the development of technical solutions for detecting CSAM through grants and hackathons.

Victim Support, Awareness, And Sensitization

The advisory also recommended training and sensitization programs for judges, prosecutors, police officers, and others directly involved in handling CSAM cases.

It stated that police officials handling CSAM cases should be provided “sensitization training on the rights of children in the digital environment, their specific vulnerabilities on the Internet, the extent and emerging manifestations of CSAM and the use of child-friendly procedures in investigation”.

The advisory also called for awareness and sensitization campaigns in schools, universities, and other institutions, so that parents and children can recognize the warning signs of online child abuse.

Additionally, it suggested establishing psycho-social care centers for CSAM survivors.

Regulations Pertaining To OTT Platforms And Social Media

The NHRC recommended that all internet intermediaries adopt a CSAM-specific policy describing their internal reporting mechanisms and how they plan to use technology to identify and remove CSAM from their platforms.

“Technology, including content moderation algorithms, must be deployed by intermediaries, such as social media platforms, over-the-top (OTT) applications, and cloud service providers, to proactively detect and remove child sexual abuse material (CSAM) from their platforms,” the advisory stated.

Similarly, the platforms were advised to expedite the removal of CSAM content and to explore partnerships with the government to ensure prompt information sharing about CSAM content on the internet.

In addition, the NHRC has asked the relevant Union, State, and UT authorities to implement the advisory’s recommendations and submit an Action Taken Report (ATR) to the Commission for review within two months.
