The recently passed Digital Personal Data Protection Act, 2023, has set the cat among the pigeons in India. The Act has received strong opposition on the grounds of violating the right to privacy, creating an opaque, toothless Data Protection Board, and concentrating overweening powers in the hands of the executive through the dubious amendment to the Right to Information Act, among other transparency concerns. The Internet Freedom Foundation and the National Campaign for People’s Right to Information have been at the forefront of analysing these concerns in great depth.
However, there is another aspect, anchoring the very basis of the Act, which has not received the kind of attention it should – the use of the idea of ‘consent’ as a smokescreen. ‘Consent’ sounds like an altruistic term, and one associates it with transparency, accountability and openness. It is indeed the theory of consent that enabled John Locke to revolutionise liberalism by arguing that governments exist at the mercy of the governed. That perception is what the government seems to be counting on by eulogising the current Act as a consent-based law that respects the privacy of citizens. After all, what could be wrong when citizens are willingly giving consent to the State to take a particular action? Surely, that is the very basis of democratic functioning and the right principle on which to base a data protection law?
Alas, in the case of data and safeguarding the right to privacy, that is not how things work. The biggest example of how misleading consent can be comes, interestingly, in the form of OpenAI CEO Sam Altman’s latest offering – WorldCoin. WorldCoin aims to create a “globally inclusive identity and financial network, owned by the majority of humanity,” according to a white paper describing the project. In other words, it is a digital identifier for each person on the planet, which they are terming a World ID, tied to a cryptocurrency called WLD. This is the purported solution to authenticating human beings in an age of rapidly evolving AI bots, while also addressing the contingency of a universal basic income in a global economy disrupted by AI. That is, roughly, what they seek to provide through creating a “proof of personhood”.
The concerns stem from the way it operates. WorldCoin creates its unique identifier by scanning a person’s iris using an orb-shaped webcam. The iris images are then mapped to a numerical code unique to each individual that serves as their World ID. While addressing privacy concerns, the WorldCoin spokesperson said that it does not link the World ID to any personal information and deletes iris images after collecting them unless a user opts in to having theirs stored in encrypted form. While companies and governments can integrate the World ID with their own systems, WorldCoin said it uses a layer of encryption, called “zero-knowledge proof,” that allows third parties to verify a user without revealing any underlying information. Some of its high-profile backers have presented it as an alternative to existing online verification tools like CAPTCHA. The idea of attaching a bespoke cryptocurrency to WorldCoin is essentially to entice users to consent. In many cases, this is consent to something they do not necessarily understand.
Now, here is where the parallels between the framework of India’s data protection law and the WorldCoin strategy become striking. The data law’s “certain legitimate uses” clause states that consent will not be required for ‘legitimate uses’, including: (i) a specified purpose for which data has been provided by an individual voluntarily, (ii) provision of a benefit or service by the government, (iii) a medical emergency, and (iv) employment. In the previous draft (2022), this was called “deemed consent” and included “public interest” as one of the situations in which consent could be deemed (a proviso that has now, thankfully, been omitted). The “benefit or service by the government” clause, too, grants the State unacceptably broad latitude, unless the government clearly defines the benefits and services covered under the ambit of this phrase.
Moreover, this functions on a very generous (and for the government, convenient) assumption – that data principals (us) correctly recognise all the privacy risks entailed in complicated digital applications. Remember, this is a country where MGNREGS workers are still struggling with mobile applications that register things as basic as attendance. How reasonable is it to expect informed consent at this scale in a society that has miles to go in terms of digital literacy? Consent in this case is thus a false choice. The responsibility has to rest on data fiduciaries regardless of the level of consent, rather than burdening individuals with the responsibility of learning, before consenting, how their data is likely to be used afterwards – especially when the government writes for itself unspecified purposes and blanket exemptions.
Last, but not least, consent is a distraction. In democracies today, the focus must be on data collection just as much as on data processing. The State will always want us to focus on the latter, and pretend that the former flows from it. The data protection law, as it stands today, similarly hopes that this aspect is glossed over by the focus on consent. The phrase “as may be prescribed” appears an astonishing 28 times in the law. Through this ambiguity, the government creates ample space for delegated legislation. Clauses like “certain legitimate uses” can then be misused by the instrumentality of the State. Notably, the law contains no safeguard against this weaponisation of consent – no mandate to inform the data principal about the third parties with whom data could be shared, or the duration of storage.
Therefore, before getting carried away and thinking that India’s first data protection law does to data protection what John Locke did to liberalism, let us hold our horses. The law needs safeguards, and we must make sure that these safeguards are clamoured for.
(The writer is a student of political science at Kirori Mal College, Delhi University)