
RIGHTS: DEEPFAKES AND THE PAKISTANI WOMAN

Laila* first met Adeel* at a fine-dining restaurant along Karachi’s coastline, after eight months of dating online via Instagram. She was then 20, he 21. The two talked for hours as the golden sun set over the sea.

Laila described Adeel as the kindest, most caring man she had ever met, and wanted to settle down with him as soon as they both reached their mid-twenties and became financially stable. She felt bliss knowing she had found a man who listened to her and was genuinely into her. However, that would change abruptly in the third year of their relationship, in 2021.

“The initial time in any relationship is so flowery, until you start noticing red flags,” she says, adding that she met Adeel when he was studying biosciences and was soon going to graduate. “It didn’t take much time for him to turn into a controlling man, who gaslighted me and emotionally abused me for his own pleasure. Then, all of a sudden, he asked for my nude pictures. I refused immediately and couldn’t process what was actually happening to me.”

Two days after she refused to send nude photos, Adeel messaged her “a series of deepfake and doctored images of mine, along with humiliating slurs like ‘slut’, ‘bitch’ and ‘whore’,” she says. “I was numb and so shocked [I couldn’t] even react. The images looked unbelievably real, even though I knew that they were not me. This all happened because I said no to sending nude pictures to him,” she says.

With the advent of Artificial Intelligence, it has become even easier to digitally alter images of women and more difficult to identify them as fake. Given the potential of such manipulated images to harass, intimidate, blackmail and cause severe distress, Pakistan’s laws need to catch up with changing dynamics.

She says deepfakes are a threat to women’s existence online, causing an unthinkable amount of emotional abuse for women.

“Whenever such things happen, people point fingers at women first,” she says. “And because I am a Christian, other men think that Christian girls are ‘fast’ and liberal, so we wouldn’t mind any abuse from their side, which is so wrong and hateful.”

Laila did not take any legal action against Adeel out of fear of her family’s reaction. Facing no obstacles, her abuser went on to spread her deepfakes across social media platforms and online groups.

THE NEW NORMALISED MISOGYNY

Artificial Intelligence has taken over the world at dizzying speed, with rapid machine learning and simulation replacing tasks that once required manual labour. But it has also raised concerns among various stakeholders, especially when it comes to the ethical and security threats posed by deepfakes.

Deepfakes and Non-Consensual Intimate Images (NCII) digitally alter visual and auditory content to depict people in fabricated scenes and scenarios, blurring the line between reality and fabrication. Deepfakes are used to spread false information and are often malicious.

Deepfakes especially impact women, whose online safety remains a neglected area in policy-making. Even the most privileged women, such as American pop star Taylor Swift, are not safe from the threat of deepfakes. This technology has also been used to target actresses and celebrities across the border, such as Rashmika Mandanna, Katrina Kaif, Kajol and Priyanka Chopra.

While these women had the support of the authorities and their fans, a large number of girls and women are not only silenced but also fear for their lives, blaming themselves for disgracing their family’s honour. There is little in terms of help available for women in Pakistan.

Laila, now 25, says she was aware of the legal helplines available in 2021 but was hesitant to contact them.

“Society needs to understand that it is not easy for every girl or woman to report and proceed with legal action,” she says. “While the onus of the crime itself is put on the survivors, there is also a self-troubling feeling of regret, for not reporting the abuser, and I was deeply burdened by that,” she adds.


MENTAL TRAUMA

She opens up about the psychological toll and trauma she experienced, pointing out that people may not be able to empathise unless they have themselves been subjected to a crime of this sort.

“The shame is not the whole part,” she says. “I was genuinely not in a state to continue my routine life anymore. I didn’t want to believe that it had actually happened to me. I couldn’t come to terms with the fact that I wholeheartedly trusted my abuser. I had never imagined that he would do something like that.

“My privacy and boundaries had been trespassed. I felt severe discomfort and disgust and was violated, as if I didn’t have any power over my body and my mind, and that I was an object with no feelings or emotions.”

She adds that the moment she first saw the pornographic images still haunts her.

“It ruined my peace of mind, knowing that so many vile men had access to those pictures, which were a source of pleasure for them. I stopped eating, lost weight, would not talk to people and never wanted to form any sort of intimate or romantic relationships. I was a very lively person before that, and it took me a long time to overcome that phase of my life,” she says.

LEGISLATION AND COMPLAINTS

In Pakistan, nearly 11,000 complaints were registered with the Federal Investigation Agency (FIA) Cybercrime Wing in 2023. Out of these, close to 1,200 complaints were related to deepfakes and NCII, with women making up the stark majority of the complainants.

So far this year, the FIA says it has received 1,020 complaints of the same nature.

Meanwhile, the Digital Rights Foundation (DRF), a nonprofit focused on protecting women from cyber-harassment, says it received 155 deepfake cases out of the 1,371 complaints made to it in 2023.

Deepfakes have ushered in a new, silent era of misogyny and prejudice against women in Pakistan, posing new challenges for the FIA Cybercrime Wing on how to proceed against such crimes.

According to the FIA handbook: “Statistics indicate that a significant majority of those targeted by these malicious acts are women. Studies suggest that upwards of 80 percent of NCII victims are women, underscoring the gendered nature of these violations and the disproportionate burden they place on women’s autonomy and safety.”

Pakistan’s laws may be able to tackle some issues related to digitally altered images but not all.

Section 21 of the Prevention of Electronic Crimes Act (Peca), for example, addresses offences committed against the dignity and modesty of individuals, with a specific subsection on protecting minors from online abuse.

Subsection 1 “specifically targets individuals who intentionally and publicly exhibit, display or transmit information through an information system with the following intention: superimposing a person’s face on to a sexually explicit image, including an explicit image or video, intimidating an adult with sexually explicit images or cultivating, enticing or inducing an adult to engage in a sexual act.”

“While Section 21 of Peca is relevant regarding NCII, with deepfakes, applicability hinges on the content,” says Minahil Farooq, a legal associate specialising in criminal and digital law.

Under Section 21, the punishment includes imprisonment for up to five years, a fine of up to five million rupees, or both. However, if the survivor or victim is a minor, the imprisonment increases to seven years.

SOCIETAL SHAME AND VICTIM BLAMING

While Pakistani women have increased their online presence and more people are aware of protecting their digital data, there is not much awareness of deepfakes and the threats they pose. Due to patriarchal norms and a lack of digital literacy, victim-blaming often takes precedence over holding perpetrators accountable.

“Pakistani society is patriarchal, it won’t believe the woman, it will believe the external sources,” says Nighat Dad, a lawyer and founder of DRF. She believes this is why women are hesitant to come forward and report.

“Deepfakes support the perpetrators,” adds Dad. “They provide them with more techniques and tactics to make [the images] so sophisticated, that the women, who were not even believed in the era of Photoshopped images, now face an even more difficult time verifying whether an image is true or false.”

In addition to not being believed, women are also questioned about why they shared their intimate images with their partners, which lays the blame squarely on the victims. Unmarried women are not the only victims; there have been instances where husbands have used their wives’ intimate images to threaten them.

However, by and large, the majority of men who create deepfakes to blackmail and humiliate girls and women in the online realm are not related to the victims. Men also create deepfakes to sell on illicit and pornographic sites.

All this has serious consequences for women’s emotional and mental health.

“I think there is hardly any conversation around the mental aspect of it,” says Dad. “Women get so mentally stressed that they say they are going to commit suicide because they don’t see any other way. The shame and stigma that is attached to it is something that we haven’t worked on a lot. If this happens, there is no fault or mistake of victims or survivors.”

CONVICTION RATE AND REMOVAL MECHANISMS

Out of the 1,180 complaints of deepfakes and NCII registered by FIA last year, only 12 complainants proceeded with a First Information Report (FIR) against their alleged abusers. Only seven out of these 12 FIRs resulted in the convictions of the perpetrators.

The FIA Cybercrime Wing says two factors are responsible for this low conviction rate: a lack of evidence, and victims compromising with their abusers.

However, the few Pakistani women who gathered the courage and decided to take legal action through the FIA faced significant procedural hurdles.

“FIA requires survivors and victims to report in person,” says Hira Basit, a senior project manager at DRF. “Let’s say there are 15 offices of FIA cybercrime in 15 different cities and the woman doesn’t live in any of them. So, she has to travel all the way to another city to register the case.

“Not every woman has the privilege to travel — there are financial constraints like affording cabs and accommodation. Moreover, they find it difficult to give excuses to their families about their travel. Even if the FIA office is present in their own city, most of them are not permitted to go out of their homes.”

She also explains the hurdle posed by the lack of evidence. “If the woman reaching out to our helpline doesn’t have strong evidence, then we don’t advise them to go to the FIA, because they can reject their case. The third hurdle is if the perpetrator is not in Pakistan, as there are not strong [legal] mechanisms to catch the perpetrators outside the country,” she adds.

When the FIA finally identifies and catches the perpetrator, most women complainants don’t proceed with the FIR, because their major concern is getting their deepfakes and NCII removed from the perpetrator’s end, so that he cannot threaten or blackmail them in the future.

According to DRF, if the perpetrator asks the survivor for forgiveness and promises not to commit this crime again in front of the FIA’s team, the case is considered “compensated”.

Survivors often don’t opt for the second step of taking the perpetrator to court, as most of them don’t want their families to know. They also fear the long hassle of repeated court visits, given the male-centred nature of these spaces.

Those women who didn’t register their cases with the FIA but asked DRF for help were also mainly concerned with having their deepfakes removed from adult sites and social media platforms. DRF advises people that, in case their intimate images have been uploaded on a website, they should use stopncii.org.

It is a website that creates a “digital fingerprint”, called a hash, from a person’s explicit images, so that they can be blocked and prevented from being shared. It collaborates with participating companies, which detect and remove matching images from their platforms.
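The underlying idea can be sketched in a few lines of code. The Python below is a simplified illustration, not StopNCII’s actual implementation: it uses an exact cryptographic hash (SHA-256) for clarity, whereas StopNCII reportedly relies on perceptual hashing, so that resized or lightly edited copies of an image still match. The file names and the blocklist here are hypothetical.

import hashlib

def fingerprint(image_path: str) -> str:
    """Compute a digital fingerprint (hash) of an image file.

    Only this short fingerprint -- never the image itself -- is
    shared with participating platforms.
    """
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# Hypothetical flow: the hash is generated on the person's own
# device and added to a shared blocklist.
blocklist = {fingerprint("private_photo.jpg")}

def is_blocked(upload_path: str) -> bool:
    # A participating platform checks each incoming upload's
    # fingerprint against the blocklist before publishing it.
    return fingerprint(upload_path) in blocklist

Because only the fingerprint, and never the image itself, leaves the victim’s device, the approach preserves privacy while still letting platforms detect and block re-uploads.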

“While stopncii.org only works for websites, we use escalation channels in the case of deepfakes and intimate images being spread on social media apps such as WhatsApp, Facebook, Instagram, X [formerly Twitter] etc,” says Basit. “Through this we reach out to these social media companies and get those images removed.”

There is a separate procedure for removing explicit images that are uploaded on adult and porn sites.

Despite all these challenges, there have been some survivors who have received justice.

One such success story emerged when a woman in Daska, Punjab, reached out to DRF in 2021, after her inappropriate images were forwarded in WhatsApp groups and shared with her family through a fake Facebook account. DRF took the case to the FIA and the abuser was arrested. Both the lower and higher courts played their role, denying the perpetrator bail.

Laila emphasises that everyone’s emotional capacity is different, which is why one can’t expect every survivor or victim to report.

“When I look back at it now, I realise that I could have reached out for legal help. I did not because of the sudden shock and trauma, along with concerns regarding ‘what will people, especially my family and relatives, say?’

“Of course, my family was not going to kill me,” she concedes. “They might have been angry at me or might not have talked to me for some time, but I am sure they would have supported me. But I simply was not able to think straight at the time.

“Taking legal action or making a complaint might not have resulted in my abuser’s conviction, but it would have still assured me that I stood against him,” she says ruefully.

The writer is a journalist, and a gender rights and climate justice activist based in Karachi

*Names have been changed to protect privacy.

Published in Dawn, EOS, March 10th, 2024
