How do we feel about conspiracy theories? As a nation, we seem quite fond of them. They have rescued many a weary family gathering by dividing drawing rooms into passionate camps. Remember the COVID-era favourite? The ‘Amreekan’ government injecting Pakistanis with microchips under the guise of vaccines.
Ready for a new one?
Our legislature recently passed the Digital Nation Act 2025. Most commentators view it through the crypto lens: legalising trading and exchanges, perhaps issuing a digital currency, possibly even holding crypto reserves to hedge the dollar. Fair enough; these possibilities feel necessary given the economic chaos of the last half-decade. But the Act itself is far more ambitious. It aims to ‘leverage the transformative power of digital technologies, responsible use of data… to accelerate sustainable economic development, improve citizen well-being, and modernise governance frameworks.’
Crypto disrupted finance. AI, however, is reshaping everything: decision-making, automation, even human interaction. If the Act is serious about digital transformation, then AI, not merely crypto, must be central. To drive this, the Act establishes three institutions: the National Digital Commission for strategy, the Pakistan Digital Authority for execution, and an Oversight Committee for accountability. It also leans heavily on ‘data’: a National Data Strategy, data governance, data exchange layers. The message is clear: data will be regulated to realise the Act’s lofty goals. And here comes the trouble.
Despite all the talk of data, Pakistan still lacks a comprehensive data protection law. The Personal Data Protection Bill 2020 came and went. The 2023 Bill remains stalled. The 2020 draft was more rights-centric, the 2023 more state-centric. Both drew inspiration from Europe’s GDPR, often called the global gold standard. Yet, without enactment, Pakistan relies on a patchwork: the Prevention of Electronic Crimes Act 2016, the Telecom Consumers Protection Regulations 2009, and the Payment Systems and Electronic Fund Transfers Act 2007. We are all familiar with data privacy scandals: Cambridge Analytica, Meta, Google’s settlement in the United States. Closer to home, NADRA’s leaks between 2019 and 2023. These underscore the importance of treating personal data as the digital equivalent of the home: inviolable, private, integral to one’s dignity.
But here is the paradox: AI thrives on data. Big, messy, diverse datasets fuel learning and refinement. The larger the dataset, the better the model. Training modern AI has essentially meant scraping oceans of the internet: text, images, social media posts, usually without individual consent. Regulate too early and one throttles the very thing one hopes to grow.
Consider the global chronology. In the United States and China, large-scale AI development surged before serious privacy laws came into force. Europe, meanwhile, embraced the GDPR in 2018 and championed rights to consent, erasure, and portability. The GDPR approach was principled, yes, but heavy-handed. Since then, the United States has attracted billions in AI investment and trained multiple foundational models such as GPT, PaLM, and Gemini. China, too, forged ahead with its own. Europe? It has world-class researchers and papers, but far fewer foundational models to show for it. The link is hard to miss: regulate early, innovate late.
This is not an argument against privacy. It is an argument about timing. Overregulation, especially before building robust digital infrastructure, can stifle experimentation and tilt the playing field towards incumbents with lawyers and cloud budgets. Pakistan’s digital ecosystem is fragile. Registries are patchy, datasets uneven, APIs scarce. Heavy data laws here would impose compliance burdens without delivering the promised benefits. The likely outcome is predictable: incumbents survive, while startups die before reaching product-market fit.
Ronald Coase, the economist, helps clarify this. His famous theorem holds that where property rights are well defined and transaction costs are low, parties bargain their way to efficient outcomes without heavy intervention. Overregulation, in contrast, distorts incentives and raises costs disproportionately for small players. Contrast Pakistan with the United States. The US built AI in a light-regulation era, and is only now tightening the screws. Europe regulated early, and lags. If Pakistan mimics Europe prematurely, it risks bearing the costs without reaping the protections.
So why push Pakistan to regulate now? Perhaps because the global North discovered the sanctity of personal data only after feasting on it for decades. Having mined oceans of unregulated content to build AI, they now export glossy white papers urging the developing world to adopt tight regulation, effectively pulling up the ladder. The parallel with environmental regulation is instructive. The West industrialised first, polluting and profiting freely, only later discovering green principles. Now developing countries are told to cap emissions from day one. China’s rebuttal has long been simple: development first, regulation after.
Is this a conspiracy?
Is this the ‘Amreeka inserting microchips’ moment, meant to stall us and distract us from leveraging the full potential of digital technology? Not in the shadowy sense, but the optics are familiar. Those who broke the rules are now writing them. We are being told, only now, that data privacy is the mark of a liberal, progressive, democratic society. Meanwhile, world leaders are growing ever more transactional. The US flagged the sale of NVIDIA chips to China as a national security risk, fearing they might boost China’s AI. Yet now, by paying the US a 15% levy on those sales, NVIDIA and AMD are free to conduct business. In such a transactional world, should we, and could we afford to, grow a conscience?
None of this is to argue for regulatory nihilism. Privacy matters for autonomy, dignity, and democracy. But Pakistan must be context-sensitive and innovation-aware. The prudent path is sequence, not symmetry. Begin with minimal baseline protections: breach notifications, purpose limitation, security-by-default. Create sandboxes for AI and emerging technologies. Encourage public–private data partnerships that build trust without blocking access. Invest urgently in digitising priority datasets and building consent and exchange layers. Tighten to GDPR-like obligations only after the infrastructure—the “rails”—exists. In short: regulation should follow innovation, not precede it.
And so, we return to conspiracy theories. Imagine this: the real conspiracy is that regulation, dressed as privacy and responsibility, is the global North’s way of ensuring Pakistan never gains its full economic potential. Say that at your next Sunday brunch, and watch the drawing room erupt. For the record: I do not actually believe it is a conspiracy. But if calling it one gets us to think harder about sequencing, innovation, and sovereignty, then perhaps it is worth the drama.
Daraab W. Furqan
The writer is a Lawyer at Crown 1207 LLP.