Digital rights activist Nighat Dad part of Facebook's 'supreme court' for content

Published May 7, 2020
In this picture taken on February 7, 2020, human rights award winner and founder of Pakistan's first cyber-harassment helpline, Nighat Dad, poses for pictures at her office in Lahore. — AFP Photo

Digital rights activist and human rights lawyer Nighat Dad on Wednesday announced that she is one of the 20 members of Facebook's newly formed "supreme court", which will oversee decisions regarding content published on the social media network and on Instagram, another platform owned by Facebook.

Taking to Twitter, Dad — who founded the Digital Rights Foundation — said: "I joined because I truly believe in the power, influence & independence this board will have on complex decisions that will be essential for safeguarding & promoting human rights, women’s rights and freedom of expression."

She added that she was looking forward to working with her fellow board members who hail from across the world and "to start hearing cases later this year".

Earlier on Wednesday, Facebook had announced the first members of its independent board empowered to make binding decisions about what content should be allowed or removed on the social network and Instagram.

The oversight board is to make final decisions regarding the kinds of posts known to embroil Facebook in controversy about censorship, misinformation or free speech.

Facebook public policy director Brent Harris described the creation of the board as the "beginning of a fundamental change in the way some of the most difficult content decisions on Facebook will be made".

The 20 announced members of the panel come from various countries and include jurists, human rights activists, journalists, a Nobel peace laureate and a former Danish prime minister.

"This is a group that has a diverse set of insights, backgrounds, and beliefs but share a deep commitment to advancing human rights and freedom of expression," board director Thomas Hughes said during a phone briefing.

The board is to be expanded to 40 members. It remained unclear when the board would start hearing cases due to restrictions on gathering or traveling caused by the deadly coronavirus pandemic.

Board members have met virtually and training has started, according to Hughes.

The board was first proposed by Facebook co-founder and chief Mark Zuckerberg in 2018, and the California-based internet giant has set up a foundation to fund it as an independent entity, Harris said.

"As the world lives through a global health crisis, social media has become a lifeline for helping people and communities to stay connected," the board said in a blog post.

"At the same time, we know that social media can spread speech that is hateful, harmful and deceitful. In recent years, the question of what content should stay up or come down, and who should decide this, has become increasingly urgent for society."

Hughes said he was open to the board serving as an arbiter of disputes for other social media firms such as Twitter but that, for now, the focus is on filling its roster and getting into action on cases about Facebook or Instagram posts.

Not the 'internet police'

Former Danish prime minister Helle Thorning-Schmidt is one of the chairs of Facebook's independent oversight panel, sometimes referred to as a "supreme court" for difficult content decisions. — AFP/File

Facebook will implement the board's decisions, unless they violate law, and "respond" to guidance on policies, according to Harris.

The board said it will decide whether disputed posts comply with Facebook and Instagram policies and "values", as well as with freedom of expression within the framework of international human rights norms, regardless of the social network's corporate interests. The board will make its decisions public and report on how well Facebook complies with its rulings.

Reputational costs

Zuckerberg has personally assured the board the social network will abide by its decisions, according to co-chair Helle Thorning-Schmidt, a former prime minister of Denmark.

"This board is not designed to be an echo chamber," said co-chair Catalina Botero-Marino of the Universidad de los Andes Faculty of Law in Colombia.

"Facebook would have a very high reputational cost if it doesn't carry out decisions by a body it created to resolve its thorniest problems."

Facebook cannot remove members or staff of the board, which is supported by a $130 million irrevocable trust fund.

"For the first time, an independent body will make final and binding decisions on what stays up and what is removed," Thorning-Schmidt said. "This is a big deal; we are basically building a new model for platform governance."

Facebook CEO Mark Zuckerberg outlined his idea in 2018 for a "supreme court" that would be able to consider the difficult decisions on what to allow and remove on the leading social network. — AFP/File

Board co-chair Michael McConnell, a university law professor and former US federal judge, said the expected volume of cases would make it impossible to consider them all.

Instead, like the US Supreme Court, the board will prioritise content removal cases that can set precedents for how Facebook should handle similar material, according to McConnell.

"We are going to have to select maybe a few flowers, or maybe they are weeds, from a field of possibilities," McConnell said.

The board plans to focus first on cases affecting large numbers of users, then on cases that have a major effect on public discourse, and then on those that affect policy at the platform, he explained.

"We are not the internet police," McConnell said. "Don't think of us as a fast action team that is going to swoop in. Our job is to consider appeals, provide an after-the-fact, deliberative second look."
