LAHORE: Pakistan is witnessing a surge in online child sexual abuse material (CSAM), having ranked among the top three countries in the world in recent years for the number of reports of such material online, a trend that is contributing to a rise in child abuse cases in the country.
According to data from the National Center for Missing & Exploited Children (NCMEC) of the US, which collects CSAM reports from American companies such as Facebook, Instagram, Snapchat and Twitter, Pakistan had a total of 5.4m online child exploitation reports in the three years from 2020 to 2022, behind only India and the Philippines, which had 16m and 7.1m reports, respectively. Pakistan is followed by Indonesia and Bangladesh, with 4.7m reports each. In 2022, Pakistan had over 2m reports; in 2021, the figure was over 2.03m; and in 2020, it was 1.2m.
The NCMEC was able to collect these reports because US-based companies, including Meta, which runs Facebook and Instagram, are legally bound to share report data with it. These companies have users worldwide, and incidents are reported to the NCMEC from across the globe, with each report carrying geographic indicators of the location from which the CSAM was uploaded.
The NCMEC receives CSAM reports from over 200 electronic service providers, with Meta the biggest reporter at over 74 million reports filed, accounting for 90pc of all reports received by the NCMEC. Among Meta's apps, Facebook filed five times more reports than Instagram and 18 times more than WhatsApp, suggesting that children using Facebook are the most vulnerable to online sexual abuse.
Country stands third with 5.4m child exploitation reports in three years; Facebook reported the most cases
The NCMEC report has been shared with Dawn by Surfshark, a cybersecurity company focused on developing humanised privacy and security solutions.
Social media and the internet have posed new challenges to countries like Pakistan, where much of society is semi-literate or illiterate and people have learnt to use technology without enough knowledge of how to navigate it safely, while minors under the age of 18 find it hard to cope if they come across online predators.
The problematic online activity also translates into a rise in child abuse cases. According to Sahil, an NGO working on child protection in Pakistan, a total of 2,227 cases of child abuse were reported in the country in the first six months of 2023. Strikingly, the victims in the 6-15 age bracket included more boys (593) than girls (457).
Its report for the year 2022 showed a total of 4,253 cases reported in newspapers, meaning more than 12 children were abused per day during the year. This may be just the tip of the iceberg, as most victims still do not report the crime to the police, which is what usually leads to a case being reported in the media.
Regarding online CSAM, Asia appears to lead the world, as two-thirds of the reports attribute the crime location to Asian countries. Most often, the CSAM content location is attributed to India, which accounts for almost 16pc of CSAM reports, followed by the Philippines and Pakistan.
According to Unicef, the rise in child abuse might be the "combined product of many factors, such as poverty, social norms condoning them, lack of decent work opportunities for adults and adolescents, migration and emergencies."

"The child exploitation situation in Asia is nothing short of devastating. Until more is done to prevent CSAM from being released online in Asia, these countries must work on ways to detect and investigate CSAM and hold the criminals accountable."
This alarming trend is not only a stark reminder of the scale of online child exploitation in Pakistan but also raises questions about protecting users' privacy online.
Hyra Basit, programme manager of the Digital Rights Foundation's Cyber Harassment Helpline, says her organisation is approached by children and teens under the age of 18, or their parents, with complaints of sexual harassment.
“In 2023, we had 81 such complaints from girls and boys aged eight to 17. In 2022, the number of complaints was 99, while the total number of complaints over the last three years was 184.”
She says the complaints include intimate image abuse, in which perpetrators engage victims in sexualised conversations and somehow obtain intimate images, which are then used for harassment and blackmail. In some appalling cases, predators manipulate children and teens into obtaining pictures of women in their family, such as mothers and sisters. Sometimes doctored and edited images are also used to exploit underage minds, and most victims fall into the trap of sexual predators on social media sites like Facebook and Instagram, she says.
On how such cases are handled, Hyra says the underage victims get terrified and do not share their ordeal with anybody.
“If reported to us, we guide them, telling them that it’s a serious crime under Pakistani laws and suggest the legal course that they can take and that our legal team can help them.”
The organisation helps victims get their images removed from the internet if sexual predators post them anywhere. The Digital Rights Foundation assists such victims in three ways: legal guidance, digital security services, including removal of their images from the internet, and psychosocial support, says Hyra.
Regarding solutions to CSAM, web-scraping experts from Oxylabs, a Lithuania-based web data gathering company, say there is an AI-powered way to identify CSAM content, as data suggests there may be hundreds or thousands of unreported websites hosting such material. If such estimates can be made, countries could perhaps invest more in detecting CSAM in this privacy-preserving way.
“In the last few years, artificial intelligence has significantly advanced its capability to recognise vast amounts of visual data. Therefore, web-scraping experts from Oxylabs decided to use the technology for their pro-bono initiative in order to find a more effective, proactive approach to tackling CSAM online rather than the manual and reactive one that’s usually been practised by official authorities,” suggests the report by Surfshark.
Regarding the report on CSAM, Lina Survila, the Surfshark spokeswoman, says it aims to shed light on the unimaginable scope of child sexual abuse material worldwide, and that by analysing Oxylabs’ AI web scraping initiative, the company wanted to raise awareness that a lot of abusive material exists online on public websites but goes undetected.
In Pakistan, the Prevention of Electronic Crimes Act, 2016 deals with online harassment and abuse of minors, prescribing up to seven years in jail and a Rs5m fine for perpetrators. However, what is needed is proper implementation of the laws pertaining to online abuse of children, along with a separate body to deal with this social ill in order to protect children.
Published in Dawn, February 8th, 2024