Abstracts
Résumé
Que ce soit par des organismes privés ou publics, l’adoption des technologies de reconnaissance faciale est sujette à controverse, en particulier en raison du manque de lois spécifiques réglementant leur usage. Malgré les questionnements et débats qu’engendrent l’utilisation de l’intelligence artificielle (IA) et la prise de décision automatique dans des pays comme le Canada et les États-Unis, des services de police, entre autres, reconnaissent avoir eu recours à de telles technologies dans divers contextes. La littérature existante montre que les défis qu’apporte l’adoption des technologies ayant recours aux algorithmes de reconnaissance faciale vont des biais liés, notamment, aux erreurs d’identification et d’appariement jusqu’aux questions d’éthique et de valeurs, en passant par des enjeux environnementaux. Alors que la prolifération de ces systèmes de haute technologie permettant l’identification biométrique automatique est de plus en plus marquée, les outils juridiques et politiques en la matière tardent à s’adapter à un domaine en constante évolution. Cette note fait un tour d’horizon des politiques, des instruments juridiques et des cadres légaux en place au Canada et aux États-Unis vis-à-vis de l’adoption et de l’utilisation des technologies de reconnaissance faciale. En particulier, nous montrons que, bien que cette adoption soit chose courante à travers différentes entités, les contextes politiques et les cadres légaux et réglementaires ne permettent pas de faire face à un système d’identification biométrique par l’IA. Les préoccupations en matière de droits de la personne, de vie privée et d’équité invitent à un examen plus approfondi de l’état des instruments en place.
Mots-clés :
- reconnaissance faciale,
- algorithmes,
- politiques,
- réglementation,
- Canada,
- États-Unis
Abstract
Whether they are used by private or public entities, facial recognition technologies are subject to debate and controversy, notably because of the lack of specific legislation on their use. Although artificial intelligence (AI) and automated decision-making systems have raised questions and sparked debate in Canada and the U.S., police departments, among other bodies, have used such technologies in various contexts. The literature shows that the challenges raised by facial recognition algorithms range from biases related to errors of identification and matching to questions of values and ethics, as well as environmental issues. Despite the growing use of leading-edge automated biometric identification technologies, existing legal and policy instruments have not kept pace with this rapidly evolving field. This note provides an overview of the policies and legal frameworks in place in Canada and the U.S. with respect to the acquisition, use and integration of facial recognition technologies. In particular, we show that despite the increasing integration of such technologies across various entities, policy, legal and regulatory frameworks have yet to adapt to the new reality of AI-based biometric identification. Issues related to human rights, privacy and equity call for further examination of the instruments in place.
Keywords:
- facial recognition,
- algorithms,
- policies,
- regulation,
- Canada,
- United States
Appendices
Bibliographie
- Booker, C. A. (2019, 23 octobre). S.2689 – No Biometric Barriers to Housing Act of 2019. https://www.congress.gov/bill/116th-congress/senate-bill/2689/text.
- Brayne, S. (2020). Predict and surveil : Data, discretion, and the future of policing. Oxford University Press.
- Buckley, B. et Hunter, M. (2011). Say cheese ! Privacy and facial recognition. Computer Law & Security Review, 27(6), 637-640. https://doi.org/10.1016/j.clsr.2011.09.011.
- Buolamwini, J. et Gebru, T. (2018). Gender shades : Intersectional accuracy disparities in commercial gender classification. Proceedings of the 1st Conference on fairness, accountability and transparency. PMLR, 81, 77-91. http://proceedings.mlr.press/v81/buolamwini18a.html.
- Commissariat à la protection de la vie privée du Canada. (2013). Reconnaissance faciale automatisée dans les secteurs public et privé. https://www.priv.gc.ca/media/1766/fr_201303_f.pdf.
- Commissariat à la protection de la vie privée du Canada. (2020, 6 juillet). Clearview AI cesse d’offrir sa technologie de reconnaissance faciale au Canada. https://www.priv.gc.ca/fr/nouvelles-du-commissariat/nouvelles-et-annonces/2020/nr-c_200706/.
- Commissariat à la protection de la vie privée du Canada. (2020, 8 octobre). Rapport annuel au Parlement 2019-2020 concernant la Loi sur la protection des renseignements personnels et la Loi sur la protection des renseignements personnels et les documents électroniques. https://www.priv.gc.ca/fr/mesures-et-decisions-prises-par-le-commissariat/ar_index/201920/ar_201920/.
- Commissariat à la protection de la vie privée du Canada. (2020, 12 novembre). Un cadre réglementaire pour l’IA : recommandations pour la réforme de la LPRPDE. https://www.priv.gc.ca/fr/a-propos-du-commissariat/ce-que-nous-faisons/consultations/consultations-terminees/consultation-ai/reg-fw_202011/.
- Commission de l’éthique en science et en technologie. (2020). Les enjeux éthiques soulevés par la reconnaissance faciale (8e Commission jeunesse). https://www.ethique.gouv.qc.ca/media/2wqngchp/cest-j_2020_reconnaissance_faciale_acc_web.pdf.
- Coons, C. A. et Lee, M. (2019, 14 novembre). S.2878 – Facial Recognition Technology Warrant Act of 2019. https://www.congress.gov/bill/116th-congress/senate-bill/2878/text.
- Curtis, J. R. (2021, 19 mars). H.R.2075 – Foreign Advanced Technology Surveillance Accountability Act. https://www.congress.gov/bill/117th-congress/house-bill/2075/text?r=8&s=1.
- duPont, S. (2021, 14 avril). Facial recognition is here but we have no laws. Nextgov.com. https://www.nextgov.com/ideas/2020/07/facial-recognition-here-we-have-no-laws/166711/.
- Eubanks, V. (2018). Automating inequality : How high-tech tools profile, police, and punish the poor. St. Martin’s Press.
- Garvie, C., Bedoya, A. M. et Frankle, J. (2016, 18 octobre). The perpetual line-up : Unregulated police face recognition in America. Georgetown Law Center on Privacy & Technology. https://www.perpetuallineup.org/.
- Garvie, C. (2019, 16 mai). Garbage in, garbage out : Face recognition on flawed data. Georgetown Law Center on Privacy & Technology. https://www.flawedfacedata.com/.
- Gilligan, M. (2020, 29 octobre). Cadillac Fairview covertly collected images of 5M shoppers across Canada : Privacy commissioners. Global News. https://globalnews.ca/news/7429905/cadillac-fairview-facial-recognition-investgation-findings/.
- Grother, P., Ngan, M. et Hanaoka, K. (2019a). Face recognition vendor test (FRVT). Part 3 : Demographic effects. National Institute of Standards and Technology, U.S. Department of Commerce. https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf.
- Grother, P., Ngan, M. et Hanaoka, K. (2019b). Face recognition vendor test (FRVT). Part 2 : Identification. National Institute of Standards and Technology, U.S. Department of Commerce. https://doi.org/10.6028/NIST.IR.8271.
- Hill, K. (2020, 24 juin). Wrongfully accused by an algorithm. The New York Times. https://www.nytimes.com/2020/06/24/technology/facial-recognition-arrest.html.
- Huizenga, B. (2022). H.R.6609 – Facial Authorization Cannot be Enforced Act or the FACE Act. https://www.congress.gov/bill/117th-congress/house-bill/6609/actions?r=5&s=1.
- Information Commissioner’s Office. (2017). In the picture : A data protection code of practice for surveillance cameras and personal information. https://www.enlutc.co.uk/wp-content/uploads/2019/08/cctv-code-of-practice.pdf.
- Kantayya, S. (2020). Coded bias. An exploration of the fallout of MIT Media Lab researcher Joy Buolamwini’s startling discovery of racial bias in facial recognition algorithms. 7th Empire Media. https://www.grandintheatre.com/shows/coded-bias.
- LaFrance, A. (2017, 24 mars). Databases of facial images proliferate. The Atlantic. https://www.theatlantic.com/technology/archive/2017/03/who-owns-your-face/520731/.
- Le Page, M. (2018, 10 octobre). AI’s dirty secret : Energy-guzzling machines may fuel global warming. New Scientist. https://www.newscientist.com/article/mg24031992-100-ais-dirty-secret-energy-guzzling-machines-may-fuel-global-warming/.
- Leroux, R. (2020, 21 janvier). Vos données personnelles, une cause perdue ? Protégez-vous. https://www.protegez-vous.ca/technologie/donnees-personnelles#submenu-item-436656.
- Li, D., Chen, X., Becchi, M. et Zong, Z. (2016). Evaluating the energy efficiency of deep convolutional neural networks on CPUs and GPUs. Institute of Electrical and Electronics Engineers. https://doi.org/10.1109/BDCloud-SocialCom-SustainCom.2016.76.
- Lu, D. (2019, 6 juin). Creating an AI can be five times worse for the planet than a car. New Scientist. https://www.newscientist.com/article/2205779-creating-an-ai-can-be-five-times-worse-for-the-planet-than-a-car/.
- Markey, E. J. (2021, 15 juin). S.2052 – Facial Recognition and Biometric Technology Moratorium Act of 2021. https://www.congress.gov/bill/117th-congress/senate-bill/2052/text.
- Martin, N. (2019, 25 septembre). The major concerns around facial recognition technology. Forbes. https://www.forbes.com/sites/nicolemartin1/2019/09/25/the-major-concerns-around-facial-recognition-technology/?sh=129082354fe3.
- Merkley, J. et Booker, C. A. (2020, 12 février). S.3284 – Ethical Use of Facial Recognition Act. https://www.congress.gov/bill/116th-congress/senate-bill/3284/text.
- Ministère de la Justice du Canada. (2019, 21 juin). Loi sur la protection des renseignements personnels et les documents électroniques (dernière mise à jour). https://laws-lois.justice.gc.ca/PDF/P-8.6.pdf.
- Molnar, P. et Gill, L. (2018). Bots at the gate : A human rights analysis of automated decision-making in Canada’s immigration and refugee system. The Citizen Lab, University of Toronto. https://citizenlab.ca/wp-content/uploads/2018/09/IHRP-Automated-Systems-Report-Web-V2.pdf.
- Nzobonimpa, S. (2022). L’utilisation des technologies d’apprentissage automatique par la police préoccupe-t-elle les intervenants québécois ? Analyse d’une récente consultation publique. Criminologie, 55(1), 271-310. https://doi.org/10.7202/1089737ar.
- O’Neil, C. (2016). Weapons of math destruction : How big data increases inequality and threatens democracy. Broadway Books.
- Parizeau, M.-H. (2010). Identité, empreinte génétique et citoyenneté : réflexions philosophiques. Sociologie et sociétés, 42(2), 207-229. https://doi.org/10.7202/045362ar.
- Raji, I. et Buolamwini, J. (2019). Actionable auditing : Investigating the impact of publicly naming biased performance results of commercial AI products. Conference on Artificial Intelligence, Ethics, and Society. https://dam-prod.media.mit.edu/x/2019/01/24/AIES-19_paper_223.pdf.
- Secrétariat du Conseil du Trésor du Canada. (2019, 5 février). Directive sur la prise de décisions automatisée. Gouvernement du Canada. https://www.tbs-sct.gc.ca/pol/doc-fra.aspx?id=32592.
- Secrétariat du Conseil du Trésor du Canada. (2020, 18 juin). Politique sur la protection de la vie privée. Gouvernement du Canada. https://www.tbs-sct.gc.ca/pol/doc-fra.aspx?id=12510.
- Senior, A. W. et Pankanti, S. (2011). Privacy protection and face recognition. Handbook of Face Recognition, 671-691. https://doi.org/10.1007/978-0-85729-932-1_27.
- Smith, A. (2019, 5 septembre). More than half of U.S. adults trust law enforcement to use facial recognition responsibly. Pew Research Center. https://www.pewresearch.org/internet/2019/09/05/more-than-half-of-u-s-adults-trust-law-enforcement-to-use-facial-recognition-responsibly/.
- Statista Research Department. (2022, 17 mars). Opinions about top countries in race for AI among business leaders in the U.S. 2019. https://www.statista.com/statistics/1030087/united-states-opinion-leading-countries-in-artificial-intelligence/.
- Strubell, E. G., Ganesh, A. et McCallum, A. (2019). Energy and policy considerations for deep learning in NLP. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics. https://doi.org/10.18653/v1/p19-1355.
- Tunney, C. (2020, 5 mars). RCMP denied using facial recognition technology - then said it had been using it for months. CBC News. https://www.cbc.ca/news/politics/clearview-ai-rcmp-facial-recognition-1.5482266.