At Risk: Online Child Sexual Abuse Grows in India
'We are producing just as much as we are consuming'
“WhatsApp, Snapchat and Discord are where we had a rush of grooming cases reported to us,” says Siddharth P, co-director of Aarambh India, a Mumbai-based non-profit that facilitates support and rehabilitation for survivors of child sexual abuse. “Clearly, adults are pretending to be underage, and interacting on these platforms which have a largely young population.”
Online grooming is the process by which a child, and sometimes even the family, is emotionally manipulated into trusting an online persona whose intent is sexual abuse.
“What we have observed is a very typical sort of grooming process: where initially they are just trying to explore hobbies or confessing to having similar interests. Then you can see an increasing sexualisation: pornography is shared, there is a request made for intimate photos in return.
“But in a couple of cases, the minors go into self-investigative mode, where they have figured out that a person is not just doing this to one girl, but his behaviour is that of a serial stalker.”
In his experience with the Aarambh India hotline and fieldwork across states, Siddharth says that many of the cases involve self-generated child sexual abuse material: content the victims generated or captured themselves, which was then used by somebody else for malicious purposes.
“The content is self-generated at the point of production, and then there is a leak. We all know that the internet is full of scams, but occasionally, we will also fall into a scam. It’s always a weaker moment in which you fall for that trap.”
He goes on to explain how “during Covid, a lot of kids experienced isolation: they had very high exposure to the internet, but socially they were still isolated. This created these moments of vulnerability. Even the most empowered child may experience a slip to danger.”
He says there has been an increase in child sexual abuse material posted online from India, “since around 2020, when it started being reported.”
“This is an issue that grows exponentially - suddenly you’ll see a surge,” he explains, adding that other structural changes also contribute to the peddling of CSAM. For instance, “the simple factor of India providing hosting services will cause a rise in the number of such platforms.”
Additionally, following the pandemic, “a large amount of the ‘at-risk’ population (children, women) came onto the internet. So when an at-risk population intersects with simple factors of continuous internet, or access to internet in a private space, then also you tend to see an uptick.”
Further, self-identification of the problem is increasing. “Kids understand that there are some activities that are risky, and some that are not. But in the case of adolescents, online and offline, children do indulge in certain risk-taking behaviours at that stage. This is typically a risk-taking age,” says Siddharth.
In India, with an increased understanding of the internet and its potential dangers, “even the perpetrators have been threading the needle. Now even they know that if they are too overt, they will be deplatformed or reported. There are perpetrators who have possibly also been deplatformed before. They have also learned and evolved.”
As a result, he says, perpetrators increasingly attempt to shift the conversation to direct messages, or DMs, and then request a further move to Discord or Snapchat. And in cases of cyberbullying and online harassment, he says the perpetrators no longer reveal the names of their victims, since they know that most platforms have a policy of taking such posts down.
Instead, they reveal other telling details about the person, which make the victim very easily identifiable. “This sort of dog-whistling also makes it very difficult to argue why this content needs to come down.” A moderator based elsewhere often cannot recognise these details for what they are.
Siddharth also stresses that often, even after an FIR has been filed with the police and the investigation is underway, the images are still online and being shared, with the police unequipped to track down and report the content so it may be blocked or taken down.
“These loops are missing sometimes in smaller towns,” and in rural areas. “I don’t think it’s about resources, rather I think it is about awareness. That once you file an FIR, you have to make the platform aware that the FIR is filed.”
Self-generated CSAM makes the problem murkier still. “In many cases, the abuse might not be so black and white,” says Siddharth. “It’s not always that there is a clear perpetrator.”
He takes the example of a case in which a website called Voiyr promised money in exchange for uploading nude selfies. A young boy uploaded his pictures to the site under his real name, and only then realised that they were going to be shared publicly. He then struggled to get the pictures taken down.
Voiyr is a public website, and is still active. “In these types of cases, who really is the perpetrator?” asks Siddharth. “If this boy were to go to his locality thana, who would he complain against? And this website had no information attached to it, it was just like an anonymous entity floating around in cyberspace.”
He elaborates: “Awareness is different for different people. It’s really a vast space. And whenever we approach a group of children, we have to approach it from what their material reality is. And that makes a huge difference: for example, there is a difference between the awareness that boys need and the awareness that girls need. Just because their internet experience is so diverse.”
Janice Verghese, a cyberlawyer and policy specialist, agrees with this view. “We have some beautiful laws in our country, but the people have to know about them to make use of them! To be honest, people are still pretty blank about how the internet works.”
Contrary to popular perception, says Verghese, the perpetrators’ anonymity is not the biggest challenge in tackling child pornography. “Anonymity is definitely a reason - but I don’t think anonymity has always been the biggest problem.”
She takes the example of WhatsApp and other messaging services. “There were around 200-300 groups on WhatsApp, there were people who were sharing content - it was really ridiculous - and with some groups they had a rule that if you want to be a part of the group, you will have to share original content first - kind of to prove that you are serious about CSAM. This is happening on WhatsApp, Telegram, everywhere.”
To join a group on such apps, one has to first get in touch with the admin. Then share content with them, “in order to be vetted by them,” Verghese explains. “To ensure that you are legit, and you enjoy this content, you kind of have to prove that you are one of them. You have to create original content and share it.”
Such gatekeeping not only increases the distribution and consumption of CSAM, it also drives the production of new content and paves the way for further criminal activity.
Verghese says that currently India is the largest producer and consumer of CSAM - “We are producing just as much as we are consuming.” She says the messaging apps know that some of their policies make it easier to distribute CSAM. “Before we even think of tackling the anonymity problem, they are not even doing anything about the things they know to fix!”
While spreading awareness is the more achievable goal, says Verghese, the more important concern is investigation.
“The minute the law enforcement or any authority is alerted, one has to be very quick with the investigation. Because by the time you contact WhatsApp and get the data, maybe the group is deleted… which is why time is of the essence. Moreover, the data requisition is not sent in time,” she explains, recalling her past work on the CyberPeace helpline. “And the data also takes a while to arrive.”
Therefore, she says that digital literacy and getting people up to speed on how the internet works is of paramount importance.
“The POCSO, a beautiful law, has become even more stringent now,” she says, speaking of the amendments made to the Protection of Children from Sexual Offences Act in 2019. She points out that the minimum penalty of 10 years’ imprisonment has now been raised to 20 years, extendable to life imprisonment, with the death penalty as the maximum punishment.
She asks, “But even if you make the laws stronger, have we seen more convictions?”
We as a society are getting desensitised to the issue of child sexual abuse, says Verghese, online or otherwise. “The first time you see such content it will disgust you. The second or third time you will be less disgusted. Whatever role we are playing, we are also becoming desensitised to this.”
She says a new term has been added to the sexuality spectrum: MAPs, or Minor-Attracted Persons, who are sexually attracted to minors.
While Verghese herself is quite appalled by what she calls the attempted legitimisation of paedophilia, the public reaction has been mixed. Some are disgusted by the idea; others have gladly embraced the new terminology, saying that finally they do not feel like criminals for having a sexual preference.
Many who identify as MAPs say they do not deserve the shame and scrutiny they get, because although sexually attracted to minors, they at least have never acted on those urges.
Verghese, however, is sceptical. “The ‘at least’ argument - it’s only a matter of time before the technicality becomes a widely accepted truth.”
Interpol data reveal that India reported 24 lakh (2.4 million) cases of child sexual abuse between 2017 and 2020.
On the effect of spreading awareness on reducing child sexual abuse, Verghese shares that in the training modules she has conducted on CSAM, the tone has changed from one that purely cautions potential victims to one that cautions potential offenders as well.
“The training not only teaches how children can be safeguarded against this kind of abuse, but they are also taught about the legal repercussions of engaging with CSAM content at a producer or distributor level.”
In her experience, image-based abuse online increased over 2020, and in about 20 percent of these cases the victims were minors themselves. She speaks of getting frantic phone calls in the middle of the night; in one, a teenage girl said that her boyfriend had shared videos of her on multiple platforms. In the images shared, only the young girl was visible.
Verghese is of the opinion that awareness and training on this issue have to evolve constantly with the changing landscape of the internet, and of the crimes committed through the online space.
“Every day I had to view six to eight pieces of sexual content that was filmed without consent! It does things to you. In constantly watching this content, you do not want to slip to the other side, and become desensitised,” she recalls.
A report by Aarambh India on the ideal usage and treatment of the internet by minors says that “their material realities, the socio-cultural context they live in, their caste-class conditions as well as their internet infrastructure, technology, platforms, laws, policies, online communities, etc. actively determine where the child is located on the risk-danger spectrum.”
Children are encouraged to screen-lock their phones for their safety, but surveys found that only 18% of them actively lock their phones, and about 54% of the respondents, adults and children, admitted to having shared their phone passwords with close friends and family.
16% of the 155 respondents from across states (mostly girls) revealed that they had been coerced into sharing their passwords with their fathers or boyfriends.
On CSAM groups online, a report by CyberPeace found that “some of the groups get full the moment they are added on the app… Interesting to note is that most of these groups have group icons that are obscene and sexual, and don’t just show adults but children in sexually explicit activity directly…
“Many of these groups lay emphasis on the fact that only child pornography videos are to be shared and no other links or material. The group descriptions say that if someone doesn’t send a specified number of videos daily, they will be removed from the group.”
“Child sexual abuse has to become a part of our regular conversations,” says Verghese. “We have to talk about how anyone and everyone can be a perpetrator.
“I’m not saying to plaster the pictures everywhere, but I am saying at least have the conversation. The more you speak about this issue, the less the stigma will be. Once you start talking about it, you will not give a damn about log kya kahenge [what will people say]. Because you are rising above these problems.”
Report Child Sexual Abuse or CSA Material Online:
MHA Cybercrime Portal: https://cybercrime.gov.in/Webform/Helpline.aspx
Aarambh India IWF Hotline English: https://report.iwf.org.uk/in
Aarambh India IWF Hotline Hindi: https://report.iwf.org.uk/in_hi