First, prisoners; then, all of us

The UP government has partnered with a private surveillance start-up, Staqu, to analyse CCTV footage from 70 prisons across UP
Aditya Ranjan
Jai Vipra

In August, the Union government withdrew the Personal Data Protection Bill from Parliament. The government said it intends to bring a new draft legislation in its place for comprehensive regulation of the digital economy. There has already been an inordinate delay in establishing a legal framework for the protection of personal and non-personal data. In the absence of such a legal framework, the unregulated collection and use of personal data by government and private parties threatens the rights of all, particularly the marginalised and vulnerable sections of the country.

Prison inmates are one such vulnerable community. In a shocking development, the UP government has partnered with a private surveillance start-up, Staqu, to analyse CCTV footage from 70 prisons across UP with the help of an artificial intelligence (AI) tool. The partnership began by analysing live footage from the prisons to detect instances of unauthorised access, frisking, etc. However, Staqu is now leveraging this access to prisons to train its AI to detect human interactions, such as fights. After this testing in prisons, the start-up, in partnership with the UP government, intends to implement the technology in the wider state policing system. Staqu is also considering the possibility of developing profiles of prison inmates, along with their voice-prints, for continuous surveillance of ex-offenders even after they leave prison. Such use of technology raises several alarms regarding prisoners’ rights and public surveillance.



While the judiciary has in the past supported the installation of cameras in prisons to prevent human rights abuses, the introduction of invasive surveillance through audio and video recording, and the harvesting of this data to train AI, fails the principles of proportionality and necessity laid down in the Justice Puttaswamy judgement.

Prison inmates are often unaware of the deployment of these technologies. They, by default, cannot consent to such measures, given that the surveillance is collective and they cannot leave the area under surveillance. Testing invasive surveillance technology on them disregards their autonomy and right to privacy. Further, studies have shown that the deployment of surveillance technology such as cameras and microphones adversely affects the lives and social interactions of prison inmates, who often fear that their actions, such as laughing or making friends, will be linked to gang behaviour.

While the state police laws empower police officers to monitor those who are “determined to lead a criminal life”, such monitoring cannot legally be extended to all ex-offenders. As held by the Supreme Court in Govind vs State of Madhya Pradesh, surveillance of all ex-offenders, unless there is a grave threat to society, is unconstitutional and void. In this light, a step to continuously monitor ex-prison inmates through facial recognition technology and voice-prints is a violation of their right to privacy.

Importantly, Staqu is a private entity operating in a legal vacuum with respect to data protection. There is no guarantee that the data and software developed using this data will not be transferred to third parties, including private security providers who can restrict ex-prisoners’ access to buildings, credit rating agencies that can restrict their access to finance, employers who can block access to jobs, and so on.

The use of AI technology for surveillance, particularly facial recognition technology, has multiple intersecting problems. It is often wildly inaccurate, which means that innocent people can be caught in the crosshairs of the criminal justice system due to a technical error. Moreover, surveillance technology is applied unevenly according to existing State biases, including through the uneven placement of CCTV cameras across a city. It renders day-to-day activities, such as loitering, highly visible to police authorities, increasing the number of atrocities committed against the poor. It has already been used to profile peaceful and lawful protesters in order to harass them. All these factors together mean that AI-based surveillance is a counterproductive measure for maintaining law and peace.

The entire proposed system in question is one where poor and marginalised prisoners’ data is forcibly used to create technology to spy on the entire population. India’s prison population is disproportionately composed of the poor, Scheduled Castes, Scheduled Tribes, Muslims and Sikhs. Further, a majority of prisoners are undertrials. When such a surveillance architecture is built on data illegally harvested from a captive population like prisoners, it violates several constitutional and human rights. These developments in prisons must be resisted by all people, and the new data protection legislation must account for the protection of the rights of prisoners.

(Aditya Ranjan works at the JALDI initiative, Vidhi Centre for Legal Policy. Jai Vipra works at the Centre for Applied Law and Technology Research, Vidhi Centre for Legal Policy)

(Published 13 September 2022, 22:56 IST)