
After years of delays, Pakistan's Ministry of Information Technology and Telecommunication (MoITT) has shared a new draft of the Personal Data Protection Bill (PDPB) for feedback, promising long-overdue privacy protections and consent requirements. But behind the legalese, the bill yet again reveals the state's intention to gain control over personal data, legitimise unchecked surveillance, and reinforce the government's growing digital authoritarianism.
The draft legislation, titled the Personal Data Protection Act, 2025, introduces new compliance obligations for local and international businesses while keeping state agencies largely exempt. It claims to align with global data protection laws, improving some aspects through key changes while making others even more draconian than before.
This bill is not about protecting personal data – it’s about consolidating state power over data. It mirrors global authoritarian trends, where governments use data protection laws as tools for control over civil liberties.
Protection of (Some) Privacy
The Personal Data Protection Bill 2025 introduces several provisions that ostensibly strengthen data protection standards in Pakistan, though their effectiveness remains contingent on robust enforcement. The legislation's enhanced definition of “sensitive data” to include caste and ethnicity, as well as provisions on safeguards for children's data, including mandatory age verification and parental consent requirements, represent a noteworthy attempt to protect oppressed groups and younger users from exploitation in an increasingly digitised landscape.
The bill's strengthened provisions regarding consent withdrawal merit particular attention. By establishing consent as an ongoing right rather than a singular transaction, the legislation theoretically empowers users with greater control over their personal information. Similarly, the expanded rights to challenge harmful data processing could offer vital protections against discriminatory profiling and data misuse. A critical vulnerability emerges in the practical application of these consent provisions when confronted with the reality of modern digital platforms. When major social media services like Facebook, Instagram, and Twitter present users with a stark 'take-it-or-leave-it' choice regarding their terms of service, the bill's theoretical framework of consent faces a fundamental challenge. The absence of granular consent options, coupled with these platforms' increasingly essential role in social and professional engagement, creates a troubling paradigm where users must choose between digital participation and meaningful data privacy – a choice that undermines the very notion of freely given consent that the legislation claims to protect.
Furthermore, the legislation overlooks the crucial role that power dynamics play in shaping consent, particularly in contexts such as employment contracts or situations where individuals lack the freedom to refuse or withdraw data processing requests. When consent is tied to economic or professional dependencies, the choice becomes an illusion at best, raising serious concerns about whether such consent can ever be considered truly voluntary under the framework the bill seeks to establish.
The enhanced right to erasure or “right to be forgotten" represents another significant development, with its explicit requirement for third-party data controllers to comply with deletion requests. This provision, coupled with the right to data portability, aims to give individuals greater autonomy over their digital footprint and reduce dependency on specific service providers.
Additionally, the streamlined response time for data access requests suggests a commitment to operational efficiency in data management. However, while these provisions broadly align with international standards, their practical impact remains heavily dependent on the enforcement mechanism's independence and effectiveness – a consideration that takes on particular significance given Pakistan's complex history with data governance and regulatory oversight.
These apparent improvements must be viewed within the broader context of Pakistan's digital governance framework, where the gap between legislative intent and practical implementation often raises concerns about genuine data protection reform.
Structural Flaws and Risks
The stark contrast between Pakistan's data protection ambitions and its track record raises serious concerns. At its core lies the extensive biometric database of the National Database and Registration Authority (NADRA) – one of the world's largest centralised systems, containing sensitive data of 220 million citizens. Despite this massive data collection, the new bill conspicuously exempts state agencies' handling of personal information under the guise of national security (Section 34(2)).
This regulatory gap becomes particularly alarming given Pakistan's history of significant data breaches. The NADRA database itself has suffered multiple security compromises, resulting in citizens' sensitive information being available in plain text on public websites. Similarly, the Federal Board of Revenue's 2021 breach compromised millions of taxpayers' records and vast amounts of financial data.
The severity of data breaches extends far beyond abstract privacy concerns, manifesting in tangible, often devastating consequences for affected individuals. A particularly striking example emerged during Pakistan's initial COVID-19 outbreak, when the country's first documented patient became a victim of privacy violations. The non-consensual disclosure of his healthcare information went beyond a medical privacy breach, leading to harassment in an environment already made volatile by panic over the unfolding pandemic.
This case shows the dangerous intersection of privacy violations and public hysteria. The unauthorised dissemination of sensitive health data had immediate and severe repercussions: the patient was doxxed, and mainstream media outlets, abandoning ethical considerations, actively participated in the harassment by conducting aggressive surveillance outside his residence. This incident not only traumatised him and his family but also demonstrated how the absence of robust data protection frameworks can transform personal information into an instrument of public persecution.
Yet rather than addressing these vulnerabilities, the bill paradoxically exempts government bodies from compliance requirements. The combination of extensive data collection, demonstrated security failures, and lack of oversight creates a perfect storm: state agencies can continue to gather, store, and process personal data with minimal safeguards. Under broad "national security" and "public interest" exemptions, systematic surveillance and political control can continue unchecked – the very outcomes that robust data protection legislation should prevent.
It is crucial to note that international data protection frameworks, particularly the European Union's General Data Protection Regulation (GDPR) – widely recognised as the gold standard – mandate equal obligations for both public and private entities. Pakistan's PDPB diverges from these established international standards.
The government's expanding digital control agenda is unfolding before our eyes. The recent Prevention of Electronic Crimes (Amendment) Act, 2025 has broadened state control over civil liberties in digital spaces. This amendment, passed without meaningful public consultation, introduces harsh penalties for alleged 'fake news', including three-year imprisonment and fines. It represents yet another step towards blanket criminalisation of online expression, further constricting an already restrictive digital landscape. The government has implemented additional measures to curtail digital freedoms, including:
- An aggressive crackdown on VPNs, limiting safe communications and access to information amidst heavy censorship;
- Implementation of the Lawful Intercept Management System (LIMS), which requires telecom service providers to grant authorities real-time access to users' activities.
Within this context, where VPNs are banned, mass surveillance is normalised, and free speech is criminalised, the draft data protection bill's exemption of state authorities from compliance adds to a digital panopticon where rights are suppressed and control is enforced through criminalisation.
This approach will inevitably produce a chilling effect, exacerbated by the bill's cross-border data transfer restrictions, which exert pressure to keep data stored within the country's borders. Civil society organisations have consistently opposed these data localisation attempts, warning that localised data storage makes information more accessible to the government – a particular concern for activists, journalists, human rights defenders, political opposition, and dissidents in the current political climate. Technology companies similarly resist these requirements, citing compliance costs and infrastructure demands.
In addition, the provisions on data retention and purpose limitation present a troubling contradiction within Pakistan's legal framework. While the bill ostensibly mandates that personal data should not be retained beyond its intended purpose, this principle stands in direct conflict with existing legislation, notably the Prevention of Electronic Crimes Act (PECA) 2016. PECA's requirement for service providers to retain data for up to a year creates a fundamental regulatory inconsistency that undermines the effectiveness of both laws.
This legislative overlap does more than create procedural confusion – it establishes a dangerous regulatory grey area that could enable systematic data abuse. In the context of Pakistan's broader data localisation requirements, this ambiguity opens avenues for misuse, particularly by entities seeking to exploit these conflicting obligations for prolonged data retention and processing.
While one might suggest that rigorous independent implementation could safeguard citizens' data, this optimism falters when considering authoritarian tendencies in regulatory oversight. The proposed Personal Data Protection Commission's (PDPC) independence remains questionable, as it operates under federal government control. Despite international standards mandating autonomous oversight bodies, this dependence enables selective enforcement that could target critics and allow state-aligned institutions to operate with impunity – a pattern not unknown in Pakistan's legislative landscape.
Consequently, the bill lacks robust safeguards against state-driven profiling. In an already heavily securitised digital environment, this raises legitimate concerns about political repression and enhanced surveillance of journalists, activists, and marginalised communities.
A Dangerous Gap
At the Cyber Threat Intelligence Conference in November 2024, the Pakistani government shared its plans for an Artificial Intelligence (AI) Policy, scheduled for introduction in early 2025. While the Ministry of IT positions this policy as a catalyst for digital economic growth with an emphasis on cybersecurity, it reveals concerning gaps in citizen protection.
The current AI regulatory discourse in Pakistan demonstrates a troubling imbalance. It heavily emphasises economic opportunities and cybersecurity for the government and its bodies under the usual broad umbrella of national security. However, it notably lacks comprehensive consideration of user rights, protections, and data security – critical oversights that many provisions of the Personal Data Protection Bill 2025 also fail to address.
Effective legislation must establish explicit prohibitions on actions that threaten fundamental rights and risk discriminating against marginalised communities. Legal frameworks should serve to dismantle, not reinforce, existing inequalities.
Article 5 of the European Union's AI Act, whose prohibitions took effect on February 2, 2025, offers a striking contrast to Pakistan's approach by establishing clear, enforceable boundaries around AI deployment. While Pakistan's framework prioritises economic growth and state security, the EU legislation explicitly prohibits AI applications that threaten fundamental rights and civil liberties.
By banning social scoring systems, emotion recognition in workplaces and educational settings, and predictive criminal profiling, the EU legislation acknowledges how AI can enable systematic discrimination and social control. Similarly, its restrictions on biometric categorisation and on the untargeted scraping of facial images to build facial recognition databases directly address privacy concerns that Pakistan's proposed framework largely overlooks.
These prohibitions reflect a regulatory philosophy that places citizen protection at its core – a significant difference from Pakistan's economic-centric approach. Where the EU Act creates specific, actionable boundaries around AI deployment, Pakistan's proposed framework leaves critical questions of individual rights and algorithmic harm largely unaddressed. This divergence in regulatory priorities raises serious questions about the adequacy of Pakistan's AI governance strategy in protecting its citizens from emerging technological threats.
While the draft data protection legislation does grant individuals the right to challenge automated decision-making and request human intervention, it falls short in crucial areas. The absence of clear safeguards regarding personal data usage in AI systems, automated surveillance, or algorithmic decision-making creates dangerous potential for unregulated AI abuse.
By neglecting to regulate how these systems process personal data, Pakistan is enabling the unchecked expansion of powerful, intrusive technologies without accountability measures. Rather than taking advantage of the opportunity to introduce necessary safeguards, this bill has left one of the most critical digital rights challenges of today unaddressed.
Violation of International Data Protection Standards
The bill not only falls short of international best practices but directly violates key data protection principles, including the fundamental principles outlined under GDPR:
- Lawfulness, Fairness & Transparency: Exemptions for government agencies allow unchecked data collection, undermining transparency and fairness;
- Purpose Limitation: Vague national security and public interest exceptions allow data to be repurposed without user consent, violating GDPR’s strict processing limits;
- Data Minimisation: No clear restrictions on excessive data collection, especially for biometric and surveillance data;
- Storage Limitation: No safeguards against indefinite retention of user data by government agencies;
- Integrity & Confidentiality: Weak cybersecurity accountability for the state, leaving biometric and personal data at risk of misuse or leaks;
- Accountability: The PDPC lacks independence, violating GDPR’s requirement for a neutral, independent enforcement body.
Implications on Citizens' Rights
The government's exemption from key compliance obligations in the bill legitimises unrestricted state-led data collection, creating an increasingly hostile digital ecosystem for privacy and dissent. Citizens' online protections are being rapidly eroded through VPN restrictions, internet traffic monitoring, and growing legal barriers against anonymity. Pakistan's history of digital crackdowns demonstrates the weaponisation of data, from NADRA's leaks of sensitive citizen information to the targeting of journalists and activists through surveillance tools, yet this bill offers no safeguards against such misuse. Furthermore, by failing to introduce protections against AI-driven processing of data, the legislation enables unchecked algorithmic harm that could disproportionately impact marginalised communities, activists, and vulnerable populations, especially as AI becomes increasingly advanced and commercialised with every update.
Pakistan’s civil society, digital rights advocates, and legal experts must continue to push back against these provisions to ensure that data protection means real privacy, not state surveillance in disguise. The urgent need for AI regulation must also be addressed before unchecked, state-driven AI systems entrench discrimination, deepen digital authoritarianism, and erase accountability altogether.