Rehashing GDPR

Published August 7, 2023

DESPITE numerous attempts to introduce a robust data protection law that values the privacy of individuals and encourages growth, the proposed law on data protection fails to align the expectations of persons providing data with those of persons processing/controlling it.

There are many significant issues with the proposed law: for example, the requirement for data localisation, the powers of the Commission for Data Protection, and government access to sensitive personal data on the pretext of national security. But another problem is that it is modelled on the European data protection law, the General Data Protection Regulation (GDPR).

Pakistan is not the only country that has used GDPR as the blueprint for its data protection law, but it may well be among those adversely affected by it, especially because of the impact on start-ups. In the EU, for example, research shows that GDPR has severely affected small companies through surging surveillance and compliance costs, and through limitations that reduce the scalability of data-driven businesses.

The proposed law also offers rights similar to GDPR's: the rights to access data, correction, withdrawal of consent, etc. The problem lies not in what rights have been granted, but in their extent and the consequences of non-compliance. The compliance process is long and burdensome, and makes transactions between data subjects and data controllers unnecessarily expensive.


Take the example of the right to correction. It empowers the person to whom the data relates (the data subject) to require the data processor/controller to rectify personal data that is inaccurate, incomplete, misleading, or outdated. The proposed law does not distinguish between the types of activity for which the data is required, nor does it specify what constitutes personal data. Though the definition requires that the data be capable of identifying a person, it is silent on how much data is needed to identify the data subject, or what threshold data must meet to be categorised as personal. Nor does it answer what counts as complete or misleading data. If you see an advertisement for chocolates based on your search from the previous night, and you begin a diet the next morning, is that data incomplete or misleading insofar as it no longer reflects the preferences expressed through your choices on a search engine? These questions, though basic, make the cracks in the law obvious.

Another example is the processing of children's data. The law requires the data controller/processor to obtain the permission of the parent or authorised guardian of any person below the age of 18. The debate about the age limit aside, how would data controllers be expected to reach out to parents or confirm guardianship? In what cases would the costs outweigh the benefits of paternalistic surveillance? For instance, would it be feasible in the case of a 17-year-old signing up for an Instagram account?

These are only some issues with the structure of the law. The primary issue is that it fails to keep up with developments in the sector. The law should have regulated the mechanism for obtaining consent, given how the concept has evolved in the tech space due to the use of dark patterns. Instead, it continues with archaic ways of taking consent and allows data subjects to opt out of rights provided their consent is “free, specific, informed, and unambiguous”. This was always the case under the laws of contract, and the proposed law has neither added anything new nor clarified how consent is to be obtained and considered ‘informed’. Would a lengthy document of legalese meet the requirement, or would a short, concise statement at every step of the way be sufficient?

Dark patterns are user interfaces that manipulate users into selecting options the data processor may want them to take, instead of what they would actually want, by exploiting individuals’ heuristics (rules of thumb) or biases. An example of a dark pattern is obtaining consent to track a person by repeatedly showing the same screen until the person tires into giving consent, or by pre-selecting the option the controller would want the user to choose.
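
To make the pre-selection and nagging examples concrete, here is a minimal, hypothetical sketch in TypeScript; the interface, option names, and function are illustrative only and are not drawn from any real platform or the proposed law.

```typescript
// Hypothetical consent dialog; all names are illustrative.
interface ConsentOptions {
  allowTracking: boolean;      // the outcome the controller prefers
  allowEssentialOnly: boolean; // the outcome many users would actually prefer
}

// Dark-pattern default: the tracking option is pre-selected, so a user who
// clicks through without reading "consents" to tracking.
const defaultSelection: ConsentOptions = {
  allowTracking: true,
  allowEssentialOnly: false,
};

// The same dialog is surfaced on every visit until consent is given,
// tiring the user into accepting the pre-selected choice.
function showConsentDialog(hasConsented: boolean): ConsentOptions | null {
  if (hasConsented) {
    return null; // stop showing the dialog once consent has been obtained
  }
  return defaultSelection; // otherwise, present the skewed default again
}
```

On paper, consent gathered this way can look “free, specific, informed, and unambiguous”, which is precisely why the mechanism of obtaining it, and not just the wording of the right, needs regulation.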

Many such practices have been seen as tricks to manipulate the way consent is obtained. While this consent may satisfy the provisions of the law, it creates a divergence between the actual preferences and the manipulated preferences of users, and initiates a cycle in which the more data the controllers collect, the more exploitative they become.

The ethical implications of data exploitation aside, these practical implications should also be considered before a proposal that does not address concerns about the collection, storage, and processing of data by companies, and that imposes huge costs for needless compliance, becomes an act of parliament.

The writer is a lawyer.

samar.masood2@gmail.com

Published in Dawn, August 7th, 2023
