KARACHI: Social media giant Facebook has announced new measures — in partnership with experts from five countries, including Pakistan — to block users from uploading or sharing non-consensual intimate images.
“When someone’s intimate images are shared without their permission it can be devastating. We want to do more to help them,” a Facebook representative told Dawn.
According to a recent report released by the Digital Rights Foundation (DRF), Facebook and WhatsApp had the worst track record when it came to cases of online harassment and misuse of data in Pakistan.
To help victims respond when this abuse occurs, the official said, Facebook had decided, after consultations with victim-support organisations and experts, to improve its tools for reviewing and removing non-consensual intimate images.
Reporting violation
First, the platform has announced new technology that will detect near-nude images or videos shared without permission on Facebook and Instagram.
Currently, users on Facebook and Instagram have to report intimate images themselves.
“This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution, so they are reluctant to report the content themselves, or are unaware that the content has been shared,” said the Facebook representative.
Explaining how the process works, the Facebook representative said a specially trained member of their team would review the content found by the technology, and if the image or video violated the platform’s community standards, it would be removed.
“In most cases, we will also disable an account for sharing intimate content without permission. We offer an appeals process if someone believes we’ve made a mistake,” the official added.
Facebook has faced harsh criticism for allowing offensive posts to stay up too long, for failing to remove posts that violate its standards, and at times for not taking into consideration the cultural context of abuse.
Responding to the criticism, the official said: “The technology is used to determine two main things: whether an image or video contains nudity or near nudity and whether the image or video was shared in a vengeful manner. It will recognise language patterns and keywords [in captions] that would indicate whether an image or video was shared without consent.”
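Facebook has not disclosed how its classifier works; a toy sketch of the caption-screening step the official describes, using an entirely hypothetical phrase list and scoring rule (not Facebook's actual vocabulary or logic), might look like:

```python
import re

# Hypothetical illustration of caption screening. The real system is a
# trained classifier; these patterns and the threshold are made up here
# purely to show the idea of flagging language that suggests an image
# was shared without consent.
VENGEFUL_PATTERNS = [
    r"\brevenge\b",
    r"\bleaked\b",
    r"\bexposed\b",
    r"\bshe (doesn'?t|didn'?t) know\b",
]

def caption_flags(caption):
    """Return the hypothetical patterns a caption matches, case-insensitively."""
    text = caption.lower()
    return [p for p in VENGEFUL_PATTERNS if re.search(p, text)]

def needs_review(caption, threshold=1):
    """Flag a post for human review if enough suspicious patterns match."""
    return len(caption_flags(caption)) >= threshold

print(needs_review("Leaked pics of my ex, she doesn't know"))
print(needs_review("Sunset at the beach"))
```

In practice, flagged posts would go to a human reviewer rather than being removed automatically, consistent with the review process the official describes below.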
In Pakistan’s case, the official claimed, the platform had already added a list of slurs and phrases in Urdu, which content reviewers use to take down any content that violates community standards.
The platform has also introduced an “emergency option” to let users send images and videos they suspect will be shared online to Facebook first.
The initiative will allow users to securely submit a particular photo to Facebook that they do not want shared, following which Facebook would create a digital fingerprint of that image and proactively ensure that it never gets shared on its platform. A digital fingerprint acts like the image’s DNA: it lets the platform detect matching or similar images if someone later tries to upload them.
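Facebook has not published its exact fingerprinting algorithm. A minimal sketch of one common technique, perceptual "average hashing" (aHash), illustrates the general idea: reduce the image to a small grayscale grid, record which pixels are brighter than average, and compare fingerprints by counting differing bits.

```python
# Minimal sketch of perceptual average hashing (aHash), one common way to
# fingerprint an image. This is an illustration only, not Facebook's actual
# method. A real pipeline would first decode and downscale the image to an
# 8x8 grayscale grid; here the grid is supplied directly.

def average_hash(pixels):
    """Turn an 8x8 grid of grayscale values (0-255) into a 64-bit fingerprint.

    Each bit records whether a pixel is brighter than the grid's average,
    so the hash survives re-encoding, resizing and small edits.
    """
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Count differing bits; a small distance means visually similar images."""
    return bin(h1 ^ h2).count("1")

# Two nearly identical grids produce fingerprints that differ by only a few
# bits, so a small-distance match can flag a likely re-upload.
grid_a = [[10 * r + c for c in range(8)] for r in range(8)]
grid_b = [row[:] for row in grid_a]
grid_b[0][0] += 200  # simulate a tiny edit to one pixel
print(hamming_distance(average_hash(grid_a), average_hash(grid_b)))
```

Because the fingerprint is compared by distance rather than exact equality, re-encoded or slightly cropped copies of a blocked image can still be caught at upload time.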
Safe space for victims
Along with the new technology tools, Facebook said it would also launch a new hub named ‘Not Without My Consent’, where victims can find organisations and resources to support them, including steps they can take to remove the content from the platform and prevent it from being shared further.
“Over the coming months, we will build a victim support toolkit to provide more meaningful information to victims around the world with locally and culturally relevant support,” the representative said.
In order to make reporting and resources more culturally relevant to victims, Facebook is partnering with victim-support organisations — the Revenge Porn Helpline (UK), Cyber Civil Rights Initiative (US), Digital Rights Foundation (Pakistan), SaferNet (Brazil) and Professor Lee Ji-yeon (South Korea).
Safety work in Pakistan
Given Facebook’s “long history” of working with organisations in Pakistan on women’s and children’s safety, the official said, that work had prompted the introduction of new tools giving Pakistani users more control over their content.
“Last year, we introduced tools that were designed specifically to help keep women safe on our platform, after we heard from them that some women choose not to share profile pictures that include their faces anywhere on the internet because they’re concerned about what may happen to their photos.”
Talking about the importance of the initiatives to ensure online safety of women in Pakistan, DRF founder Nighat Dad said sharing of non-consensual images had serious consequences in the country such as honour killings. “Facebook is very popular in Pakistan but the online space is shrinking for women,” she said.
Published in Dawn, March 23rd, 2019