<p>The recent revelations about NSO Group’s Pegasus spyware being used to target an estimated two dozen Indian lawyers and activists through vulnerabilities in WhatsApp have once again brought the issue of targeted surveillance of citizens into focus. As the saying goes, no good crisis should go to waste. This is an opportunity to raise public awareness about trends in mass surveillance involving facial recognition systems and CCTV cameras, which affect every citizen, irrespective of whether or not they have a digital presence today.</p>
<p>The Panopticon, conceptualised by philosopher Jeremy Bentham, was a prison designed so that prisoners could be observed from a central tower without knowing when they were being watched, forcing them to self-regulate their behaviour. Michel Foucault later extended this idea, arguing that modern states could no longer resort to violent and public forms of discipline and needed a more sophisticated form of control that used observation and surveillance as a deterrent.</p>
<p>Live facial recognition combined with an ever-expanding constellation of CCTV cameras has the potential to make this form of control even more powerful. It therefore suits governments around the world, irrespective of ideology, to expand their mass surveillance programs with stated objectives like national security and the identification of missing persons and, in the worst cases, to keep maximising these capabilities to enable the establishment of an Orwellian state.</p>
<p><strong>Global trends</strong></p>
<p>China’s use of such systems is well documented. As per a study in the Journal of Democracy, almost 626 million CCTV cameras will be deployed around the country by the end of 2020. It was widely reported in May that its facial recognition database covers nearly all citizens. Facial recognition systems are used in public spaces for purposes ranging from access to services (hotels, flights, public transport) to the public shaming of individuals for transgressions such as jaywalking, with offenders’ faces and identification details displayed on large screens installed at traffic intersections. The technology is even used to monitor whether students are paying attention in class. Such usage was highlighted by an almost comedic case in September, when a young woman found that her access to payment gateways and her ability to check in to hotels and trains were affected after she underwent plastic surgery. In addition, there is a fear that facial recognition technology is being used to surveil and target minorities in the Xinjiang region.</p>
<p>In Russia, Moscow Mayor Sergei Sobyanin has claimed that the city has nearly 2,00,000 surveillance cameras. There have also been reports that the city plans to build AI-based facial recognition into this large network, with an eye on the growing number of demonstrations against the Putin government.</p>
<p>Even more concerning is the shift by countries with a ‘democratic ethos’ towards deploying and expanding their use of such systems. Australia was recently in the news for advocating face scans as a condition for accessing adult content online. Some schools in the country are also trialling the technology to track attendance. France is testing a facial recognition-based national ID system. In the UK, the High Court dismissed an application for judicial review of automated facial recognition.
The challenge was a response to pilot programs run by the police and to the installation of such systems by various councils which, the petitioners argued, had been carried out without the consent of citizens and without a legal basis. The use of facial recognition at football games and music concerts has also drawn heavy criticism. Its use in personal spaces, too, continues to expand as companies explore potential uses such as measuring employee productivity or candidate suitability by analysing facial expressions.</p>
<p>There are opposing currents as well – multiple cities in the US have banned, or are contemplating banning, the deployment of the technology by law enforcement and government agencies. Sweden’s Data Protection Authority fined a municipality after a school conducted a pilot to track attendance, on the grounds that it violated the EU’s General Data Protection Regulation (GDPR).</p>
<p>Advocacy groups like the Ada Lovelace Institute have called for a moratorium on all use of the technology until society can come to terms with its potential impact. Concerns have been raised on the grounds that the accuracy of such systems is currently low, severely increasing the risk of misidentification when they are used by law enforcement agencies. Moreover, since the technology learns from existing databases (e.g. a criminal database), any bias reflected in such a database, such as the disproportionate representation of minorities, will creep into the system.</p>
<p>In many cases, there is also limited information on where and how such systems are being used. Protestors in Hong Kong and, more recently, Chile have shown an awareness of law enforcement’s use of facial recognition and have countered it by targeting cameras. The means have varied from face masks and clothing imprinted with multiple faces to pointing numerous lasers at the cameras, and even physically removing visible cameras.</p>
<p><strong>India’s direction</strong></p>
<p>In mid-2019, the National Crime Records Bureau of India put out a tender inviting bids for an Automated Facial Recognition System (AFRS) without any prior public consultation. Minutes of a pre-bid seminar accessed by the Internet Freedom Foundation indicated that 80 vendor representatives were present.</p>
<p>Convenience is touted as the main benefit of various pilot programs to use ‘faces’ as boarding cards at airports in New Delhi, Bengaluru and Hyderabad as part of the Civil Aviation Ministry’s Digi Yatra program. Officials have sought to allay privacy concerns by stating that no information is stored. City police in New Delhi and Chennai have run trials in the past. The Hyderabad police had, until recently, routinely updated their Twitter accounts with photos of officers scanning people’s faces with cameras. Many of these posts were deleted after independent researcher Srinivas Kodali repeatedly questioned the legality of such actions.</p>
<p>Many of the aforementioned trials reported accuracy rates in the low single digits for facial recognition. The State of Policing in India (2019) report by Lokniti and Common Cause indicated that roughly 50 per cent of police personnel believe that minorities and migrants are ‘very likely’ or ‘somewhat’ naturally prone to committing crimes. These aspects are concerning when considering both the capability and capacity of the technology and its potential for misuse. False positives resulting from a low accuracy rate, combined with potentially biased law enforcement and a lack of transparency, could make it a tool for the harassment of citizens. Even a hypothetical system with 99 per cent accuracy would, when scanning a million faces, wrongly flag around 10,000 people.</p>
<p>Schools, too, have attempted to use facial recognition to track attendance.
Gated communities and offices already deploy large numbers of CCTV cameras, and a transition to live facial recognition is an obvious next step. However, given that trust in tech companies is at a low, and given the existence of facial recognition training datasets such as MegaFace (a large dataset used to train facial recognition algorithms on images uploaded to the Internet, some as far back as the mid-2000s, without consent), privacy advocates are concerned.</p>
<p><strong>Opposition and future considerations for society</strong></p>
<p>Necessary and Proportionate, a coalition of civil society organisations and privacy advocates around the world, proposes 13 principles on the application of human rights to communications surveillance, many of which are applicable here as well. To state some of them – legality, necessity and legitimate aim, proportionality, due process along with judicial and public oversight, prevention of misuse, and a right to appeal. Indeed, most opposition from civil society groups and activists to government use of mass surveillance is based on these principles. When looked at through the lenses of intent (stated or otherwise), capacity and potential for misuse, these are valid grounds on which to question mass surveillance by governments.</p>
<p>It is also important for society to ask, and seek to answer, some of the following questions: Is the state the only entity that can misuse this technology? What kind of norms should society work towards when it comes to private surveillance? Is it likely that the state will act to limit its own power, especially when there is a propensity to both accept and conduct indiscriminate surveillance of private spaces, as is the case today? What will be the unseen effects of normalising mass public and private surveillance on future generations, and how can they be empowered to make a choice?</p>
<p><em>(Prateek Waghre is a researcher at the Takshashila Institution)</em></p>
<p><em>(Disclaimer: The views expressed above are the author’s own. They do not necessarily reflect the views of DH.)</em></p>