Caught in the web: Surveillance, data protection and AI in Pakistan
Some weeks ago, a LinkedIn user shared a video about a man in Lahore whose CNIC had been used to post bail for different people in separate cases. The man told a vlogger that he had gone to a photocopy shop to get his CNIC copied. Sometime later, he discovered that his ID was being used by various persons. “I have been tracing where my ID has been used for a year, and moved court too,” he said.
Cases like these are not uncommon in Pakistan. Some days, you receive marketing messages from businesses you have never shopped from; other times, you get calls from criminals pretending to be bank employees, trying to scam you using your bank details, which they somehow have access to.
There is an entire website that allows users to enter a phone number and reveal someone’s CNIC details. After it was reportedly brought to the attention of Dr Umar Saif — then caretaker minister for Information Technology and Telecom — the website was blocked in Pakistan, but it can still be accessed using a VPN. Such is the response of a country whose IT minister constantly justifies regressive legislation and policies with arguments about “cyber-attacks” and internet safety, while billions are spent on internet monitoring and blocking mechanisms.
Where’s the law?
At times, even the most harmless tweet you post will be reported by the Pakistan Telecommunication Authority (PTA), and you’ll receive a notification from X. However, the authorities appear to have neither the time nor the interest to approach Facebook or track down criminals who openly sell Pakistan’s Nadra and SIM data in public groups and pages, many using their real profiles.
What’s even more troubling is that, despite widespread data theft and blackmail due to data misuse or alteration, Pakistan does not have a data protection law. Farhatullah Babar, a former PPP senator, told this author in 2019 that when Peca was being passed, they had asked the PML-N to legislate on data protection concurrently. “How can a cybercrime law be passed without a data protection bill?” he had said. “The PML-N promised us they would pass the bill soon after Peca.”
The party never did. A data protection bill is crucial, not just as a legal framework but for safeguarding an individual’s data. It would require law enforcement to take responsibility for ensuring our private information is never tampered with or leaked.
More than three drafts of the data protection bill have been prepared over the years. Back in 2016, the draft on the IT ministry’s website covered private companies but left out government departments. Fast forward to 2022: under the PTI government, the cabinet approved another draft — one that still excluded government agencies and, experts noted, contained vague definitions of data holders, processors, and their responsibilities.
The lack of concrete progress on the data protection bill — an essential cornerstone of cyber legislation — has raised many eyebrows, especially given a barrage of other overreaching cyber laws. Many have questioned why these drafts repeatedly exclude government entities, even though departments like Nadra hold massive troves of citizen data, which have already been leaked online multiple times.
Their doubts were confirmed when the 2023 draft, prepared during the Pakistan Democratic Movement (PDM) government’s tenure, covered this gap but cleverly included data localisation clauses. The reason for the constant delays, it turns out, was that the state seeks unfettered access to the data of everyone living in Pakistan, as well as Pakistanis abroad. The social media rules, established in 2020, also called for data localisation. The focus appeared to be more on gaining access to data than on ensuring its protection.
Rights activists have repeatedly raised alarm over the murkiness in these drafts and rules, mirroring the ambiguity found in other cyber legislation. They offer little detail on how implementation will take place. Notably, the last two drafts include clauses requiring data holders and processors to hand over information to the government upon request — without outlining any formal process, such as a warrant or court order.
Telecom companies operating in Pakistan are running a mass surveillance system which “enables interception of data and records of telecom customers” without any regulatory mechanism or legal procedures, on the orders of the PTA, a July 2 Dawn report stated.
The details came to the fore following a petition on the surveillance of citizens, after several audio clips and private conversations of political figures were shared on X. According to the judgement, authored by Justice Babar Sattar, the court was told that telecom companies had been asked to “finance, import, and install” the Lawful Intercept Management System (LIMS) at a designated place (referred to as a ‘surveillance centre’) for the use of designated agencies. The identity of these agencies, however, was not revealed to the court.
It is not just politicians that our state listens to through digital means. The first real debate over privacy and data protection was sparked in 2012, when the PTA ordered telecom companies to end late-night call packages and reportedly used transcripts of a private phone conversation between two people as the justification for its decision in court.
The PTA faced severe backlash as experts questioned its authority to intercept private phone calls between ordinary citizens, let alone use them as evidence in court.
Although no political party has introduced legislation in line with international best practices, the PML-N increasingly co-opts arguments made by rights groups and experts. Recent laws, particularly since Peca, have prioritised surveillance over the fundamental digital rights and safety of Pakistani citizens.
While ordinary people around the world are more likely to encounter scams and phishing attempts than sophisticated hacking, these issues can be effectively addressed through legislative, policing, and judicial reforms. But because Pakistan wants unfettered access to surveil its residents, it won’t even pass a bill that ensures protection against scams and crimes.
Even on a purely technical level, the surveillance policies and technologies in place defy basic internet security rules and best practices. The notion that surveillance can operate as it did in the analog era of the ‘90s — when tapping a phone was straightforward — is dangerously outdated. In modern internet infrastructure, anything digital is connected to the wider network, and if you poke holes in one department’s network security, you open the door to attacks on everyone.
AI use by Punjab, Sindh governments
This year, the Sindh government said it would start using AI technology at toll plazas to capture and verify number plates and faces in real time. “Officials have said that the system would enable effective monitoring of entry and exit points and ensure security through the integration of advanced technologies such as facial and number plate recognition at 40 toll plazas, including 18 in Karachi. They said that the project also aimed at enhancing security responses and seamlessly integrating with the existing command and control centre at the Central Police Office (CPO),” a report in Dawn from June stated.
In a press release issued in February, the Punjab government also announced that it was using an “Artificial Intelligence based Facial Recognition System [that] automatically captures pictures and compares them with the data compiled including 16 million records and pictures from the driving licences branch, 1.8 million records from the Crime Record Branch, 1.3 million from the Punjab Khidmat Marakaz, and 300,000 records of accused individuals and criminals from Punjab prisons”. Various recent press releases by the Punjab police show they continue to use this technology.
The announcement came months after the Punjab IT Board, along with the police IT wing, said it had created an “AI-powered ‘Face Trace System’ … to track criminals”. “The system is aimed at enhancing accountability, reliability, and efficiency in tracing and apprehending suspects and wanted criminals,” it added.
But AI companies offering facial recognition and predictive policing algorithms have come under increasing criticism in the West. From algorithms locking out captains working for an international ride-hailing company in India because the facial recognition was trained primarily on white faces and features, to an algorithm declaring an old man dead, the use of AI technologies, particularly for facial recognition and biometrics, has drawn severe criticism. Ample research shows that facial recognition simply does not work properly on black, brown, or other non-white faces.
In Western countries, this has resulted in the targeting of oppressed communities, as predictive algorithms used for criminal arrests are often trained on biased historical data, such as anti-black bias within the US policing system. This has led to black people being disproportionately targeted in the US and UK. In Buenos Aires, clerical errors led to wrongful identification and arrests by facial recognition software, a stark reminder that even the most sophisticated technology is just one human mistake away from ruining someone’s life.
“Human rights experts are increasingly questioning whether some of these technologies, notably live facial recognition in public spaces, can ever be deployed in ways that do not violate the right to privacy and other human rights, such as freedom of peaceful assembly,” a Privacy International report stated.
Deploying facial recognition technologies on such a large scale without a comprehensive data protection law — one that clearly defines how this data is captured, verified, and used in policing and judicial processes — poses a grave threat to fundamental human rights. Government departments in Pakistan routinely face cyber attacks and hacking, despite the PTA’s claims of increased cybersecurity. In May this year, the Islamabad Safe City Authority’s online system was knocked offline by hackers, leading to a shutdown. Such data, if accessed by international hackers, can be, and has been, used to steal financial information and rob people of their money.
There have also been reports of intimate images of individuals in their cars being leaked, allegedly from the Safe City project. Others have received traffic challans based on faulty data from CCTV and similar technologies installed at traffic checkpoints.
A week ago, an AI regulation bill was introduced in the Senate, but like other cyber laws, it seems more focused on controlling the public’s use of AI tools than on addressing how the state itself employs this technology. According to an analysis of the bill by the Digital Rights Foundation, it fails to distinguish between user-centric AI models, such as ChatGPT, and large-scale systems used by the governments in Punjab and Sindh.
Going back to basics
Technology and social media have empowered individuals to earn a living and champion free speech. However, they have simultaneously amplified hate speech, revenge porn, and disinformation. While the spotlight often falls on the more visible consequences — like AI chatbots mimicking human speech or manipulated videos and audio — the deeper issue lies in the structural and political challenges that underpin human rights violations in the digital realm. These systemic problems are often overlooked in favour of sensationalised, surface-level concerns.
Laws targeting the “output” of technology or social media may be within reach for Western democracies, which already have functioning accountability systems. However, countries like Pakistan must address the underlying, on-the-ground issues that enable cybercrime in the first place.
Before introducing any new legislation, Pakistan must first enact a data protection law that aligns with international standards, striking a balance between an individual’s right over their data and the ease of doing business. According to various privacy-related advocacy groups around the world, the root of most cyber crimes, threats, and blackmail ultimately traces back to the exploitation of personal data. There can be no advocacy against surveillance in the digital age without advocating for a data protection law. But not the kind proposed by the PML-N government, which focuses on forcing organisations to store Pakistanis’ data locally.
“Government hacking can be far more privacy intrusive than any other surveillance technique, permitting to remotely and secretly access personal devices and the data stored on them as well as to conduct novel forms of real-time surveillance, for example, by turning on microphones, cameras, or GPS-based locator technology. Hacking also allows governments to manipulate data on devices, including corrupting, planting or deleting data, or recovering data that has been deleted, all while erasing any trace of the intrusion,” Privacy International stated in its 2022 response to a UN High Commissioner report on human rights.
Pakistan needs a law that first defines personal data in line with international best practices such as the EU’s General Data Protection Regulation (GDPR). Such a bill should guarantee individuals the right to their own data and transparency over how private companies, marketing agencies, and government departments use it. It must also protect users from the unauthorised sale of metadata and sensitive digital details, such as IP addresses, phone numbers, and banking information, without explicit consent.
One of the most critical rights is the right to be forgotten. Under the GDPR, individuals can request their data be deleted by data holders, like telecom companies, after discontinuing services or under specific conditions. This isn’t just a matter of user privacy — it helps companies manage storage capacities and prevents the exploitation of personal information. Without such protections, businesses could profit from selling user data to third parties at steep prices. Indeed, there have been reports of dozens of lists of phone numbers and data of subscribers allegedly leaked by employees of telcos in Pakistan.
Like free speech, data privacy also has limitations. The state might want to use this data for nabbing criminals and other security-related reasons. However, any country with a functioning democracy cannot allow its state or law enforcement unfettered access without a bill that ensures a person’s basic human rights. Passing cybercrime and social media legislation without a data protection law is similar to passing policing bills without giving a person the right to a just trial.
The 2023 draft largely addressed these concerns, but then included clauses allowing the state to demand data from any private organisation without a warrant or court order, and requiring international and local companies to store all data relating to Pakistanis within the country, defeating the entire purpose and concept of data privacy and security.
Across the globe, the concept of “techno-solutionism” is facing growing scrutiny. This ideology proposes technology as a quick fix for complex governance challenges, only to introduce further complications when these tech solutions backfire — often due to their failure to scale effectively. For Pakistan, it is time to go back to the basics: focus on democracy, accountability, and fundamental rights of its citizens, then craft regulations that prioritise individual privacy, not mass surveillance of the populace.