The Justified Notoriety of Facial Recognition Systems

The use of facial recognition technology may be regarded as the next logical step for systems where it promises efficiency. Imagine seamless airport check-ins, a reliable voter verification process1, easier identification of missing children2 and the benefits of government schemes reaching those in actual need3. More recently, facial recognition technologies (“FRT”) have been used to enforce lockdown restrictions4 and to provide contactless verification for administering vaccines.5 FRTs also enjoy greater appeal than other biometric methods such as fingerprint or iris scans, which carry a much higher risk of spreading the virus.6

However, these benefits must not obscure the dangers of the technology, which have led groups of organisations to call for its ban in several countries.7 In India, most FRTs are being deployed by law enforcement agencies.8 Further, the National Crime Records Bureau (“NCRB”) is in the process of creating a National Automated Facial Recognition System (“AFRS”) to build a large database of photographs for easier identification of criminals. In the absence of proper safeguards, the rampant use of FRTs raises serious concerns.

  1. Questionable accuracy
    The most common FRT models are linked to a database against which the photograph(s) captured by the system are matched.9 For the technology to function accurately, this data must be comprehensive and representative, failing which exclusions and mismatches follow. In many situations, FRTs have performed better for white males than for black females.10 Similarly, authentication failures have led to the denial of vaccines to many.11 Over-reliance on FRTs may therefore adversely affect some groups more than others. (A simplified, hypothetical sketch of this matching step is set out after this list.)
  2. Compromising Privacy
    The Puttaswamy judgment recognises a ‘reasonable expectation of privacy’, which is contextual in nature.12 On the ‘mosaic theory of privacy’, the data aggregation that FRTs perform tends to harm an individual’s privacy.13 The very functioning of FRTs rests on collecting vast amounts of data, thereby impacting privacy. Additionally, while the judgment does not provide for horizontal application of this right, it creates a positive obligation on the state to ensure that individuals are able to exercise their right to privacy in a meaningful manner.14 Justice Kaul in that case emphasised the need to regulate how information is collected by non-state actors.15 The judgment is therefore significant in nudging the regulatory framework in a direction that allows challenges against non-state actors where FRTs are deployed without any check.
  3. Other rights at stake
    FRTs that disproportionately target members of a particular section of society inhibit their exercise of the freedoms of speech and movement. The chilling effect of being watched by a system whose functioning one does not fully comprehend encourages individuals to adopt self-censorship.16 This is exactly what FRTs result in.
  4. Catalysing datafication17
    Datafication is the process whereby different aspects of human life are converted into digital data that can then be used to make better decisions.18 Sophisticated FRTs are able to convert human expressions, emotional states, violent tendencies, intelligence, etc. into data points.19 FRTs make data the reference point for understanding and navigating social behaviour and consequently impact human actions.20 The asymmetry in the access to and generation of such data raises issues of power imbalance; FRTs have, for instance, been used at political rallies to mark ‘habitual protestors’.21 (A second hypothetical sketch after this list illustrates such aggregation.)
  5. Tool for mass surveillance
    FRTs in an unregulated environment are a recipe for state surveillance. FRTs have been shown to expose the political orientation of individuals.22 This could allow a government to separate dissenters from supporters in a riot and then crack down selectively. Such profiling has been shown to have a disproportionate impact on members of vulnerable communities. Additionally, efficacious surveillance is covert, which is exactly why FRTs are gaining prominence: they require no active involvement from the targets. Further, as a technical matter, a successful FRT necessarily requires the collection of data from a large number of individuals.23 By its very nature, therefore, the technology does not adhere to the principles of proportionality or consent.
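
    To make the accuracy concern above concrete, the following is a minimal, hypothetical sketch of how a one-to-many facial recognition search typically operates: a captured photograph is reduced to a numeric ‘embedding’ and compared against every record in a gallery database, and a match is declared only above a chosen similarity threshold. It does not represent the AFRS or any vendor’s actual system; every name and value in it (GALLERY, MATCH_THRESHOLD, identify) is an illustrative assumption.

        # Hypothetical illustration only: a simplified one-to-many face search.
        # Real systems use learned neural-network embeddings; here embeddings are
        # plain lists of numbers, and all identifiers are made up for this sketch.
        import math

        # Gallery database: record ID -> pre-computed face "embedding" (data points).
        GALLERY = {
            "record_001": [0.12, 0.85, 0.33, 0.47],
            "record_002": [0.90, 0.10, 0.55, 0.21],
        }

        MATCH_THRESHOLD = 0.80  # similarity above which the system declares a "match"

        def cosine_similarity(a, b):
            """Cosine similarity between two embeddings (1.0 = same direction)."""
            dot = sum(x * y for x, y in zip(a, b))
            norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
            return dot / norm

        def identify(probe_embedding):
            """Return the best-matching gallery ID, or None if nothing clears the threshold."""
            best_id, best_score = None, -1.0
            for record_id, gallery_embedding in GALLERY.items():
                score = cosine_similarity(probe_embedding, gallery_embedding)
                if score > best_score:
                    best_id, best_score = record_id, score
            # If the gallery or training data under-represents a group, scores for
            # that group drift, producing misses (exclusion from a benefit) or
            # false matches (wrongful identification).
            return best_id if best_score >= MATCH_THRESHOLD else None

    The threshold captures the trade-off discussed above: set it too low and innocent people are falsely matched; set it too high and genuine matches (for instance, for vaccine authentication) are missed, and both kinds of error fall unevenly when the underlying data is unrepresentative.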
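
    The datafication concern above can be illustrated in the same hypothetical terms: once individual sightings are reduced to data points, they can be aggregated across time and places into behavioural labels such as ‘habitual protestor’. The second sketch below assumes made-up structures (Sighting, flag_habitual_attendees) and references no real deployment or data.

        # Hypothetical illustration of aggregation ("mosaic") risks, not a real system.
        from dataclasses import dataclass

        @dataclass
        class Sighting:
            person_id: str   # identity returned by a matching step such as identify() above
            event: str       # e.g. a rally or public gathering covered by a camera

        def flag_habitual_attendees(sightings, min_events=3):
            """Label anyone seen at `min_events` or more distinct events."""
            events_per_person = {}
            for s in sightings:
                events_per_person.setdefault(s.person_id, set()).add(s.event)
            return {pid for pid, events in events_per_person.items() if len(events) >= min_events}

    Each individual sighting may appear innocuous, but the aggregated profile reveals patterns of association and movement, which is precisely the mosaic-theory harm and the power asymmetry described above.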

    The Personal Data Protection Bill, 2019 classifies “biometric data”, including facial images, as “sensitive personal data”, which attracts additional safeguards for its processing. However, until the law is brought into force, there are no fetters on the use or misuse of FRTs.

    It is easy to fall into the pitfalls of technological determinism when a technology offers cost-effective solutions and its consequences are subtle and not well known. This is all the more true of FRTs. There are calls for a complete ban on the technology, on the argument that it cannot be regulated in a manner that accommodates sufficient safeguards. To start with, however, it is pragmatic to require a comprehensive consultative process before FRTs are deployed by government authorities.


  1. Telangana State Election Commission (Jan. 18, 2020), https://tsec.gov.in/pdf/ULBS_MPLTS/circulars/2020/Cir_No_111_TSEC-ULBs_2020_dated_18.01.2020_1401.pdf
  2. App to track, reunite missing children, The Times of India (Jun. 30, 2018, 07:35 IST), https://timesofindia.indiatimes.com/city/delhi/app-to-track-reunite-missing-children/articleshow/64799962.cms
  3. Sharad Vyas, Sena to serve up Shiv bhojan from Republic Day, The Hindu (Jan. 19, 2020, 01:30 IST), https://www.thehindu.com/news/cities/mumbai/sena-to-serve-up-shiv-bhojan-from-republic-day/article30597193.ece
  4. Priya Dialani, Covid-19 Pandemic is Encouraging Facial Recognition Technology, Analytics Insight (Jan. 6, 2021), https://www.analyticsinsight.net/covid-19-pandemic-is-encouraging-facial-recognition-technology/
  5. India: Vaccine exclusion fears over digital ID, face recognition, Al Jazeera (Apr. 16, 2021), https://www.aljazeera.com/news/2021/4/16/india-vaccine-exclusion-fears-over-digital-id-face-recognition
  6. Smriti Parsheera, Adoption and regulation of facial recognition technologies in India: Why and Why not?, Data Governance Network (Nov. 2019), https://datagovernance.org/files/research/NIPFP_Smriti_FRT_-_Paper_5.pdf
  7. Daphne Leprince-Ringuet, Facial recognition tech is supporting mass surveillance. It’s time for a ban, say privacy campaigners, ZDNet (Apr. 6, 2021, 13:59 IST), https://www.zdnet.com/article/facial-recognition-tech-is-supporting-mass-surveillance-its-time-for-a-ban-say-privacy-campaigners/; Virginia’s bill to ban facial recognition technology, Security (Apr. 2, 2021), https://www.securitymagazine.com/articles/94946-virginias-bill-to-ban-facial-recognition-technology
  8. Arindrajit Basu & Siddhart Sonkar, Decrypting Automated Facial Recognition Systems (AFRS) and Delineating Related Privacy Concerns, AI Policy Exchange (Dec. 26, 2019), https://aipolicyexchange.org/2019/12/26/decrypting-automated-facial-recognition-systems-afrs-and-delineating-related-privacy-concerns/
  9. Parsheera, supra n. 6.
  10. Sarah Myers West et al., Discriminating Systems: Gender, Race, and Power in AI, AI Now Institute (Apr. 2019), https://ainowinstitute.org/discriminatingsystems.pdf
  11. Supra n. 5.
  12. Basu & Sonkar, supra n. 8.
  13. Arindrajit Basu & Siddhart Sonkar, Automated Facial Recognition Systems and the Mosaic Theory of Privacy: The Way Forward, AI Policy Exchange (Dec. 30, 2019), https://aipolicyexchange.org/2019/12/30/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward/
  14. Sreekar Aechuri, Horizontal Applicability of the Right to Privacy in India, South Asia Journal (Mar. 19, 2019), https://southasiajournal.net/horizontal-applicability-of-the-right-to-privacy-in-india/#_edn13; The Horizontal Application of the Fundamental Right to Privacy, Tech, Law and Policy Blog (Sep. 5, 2017), https://techlawpolicy.wordpress.com/2017/09/05/the-horizontal-application-of-the-fundamental-right-to-privacy/
  15. Jayna Kothari, The Indian Supreme Court Declares the Constitutional Right to Privacy, Oxford Human Rights Hub (Oct. 4, 2017), https://ohrh.law.ox.ac.uk/the-indian-supreme-court-declares-the-constitutional-right-to-privacy/
