Letter to Buenos Aires Mayor Horacio Rodríguez Larreta re: facial recognition system and children’s rights

Washington, DC, October 9, 2020

Mayor Horacio Rodríguez Larreta
Buenos Aires — ARGENTINA

Re: Facial recognition system and children’s rights

Dear Mayor Larreta,

I am writing on behalf of Human Rights Watch to share our concern regarding the loading of images and personal information of children into a facial recognition system deployed in some subway stations of the city of Buenos Aires. Our research shows that the source database is used in ways that violate children’s right to privacy in criminal proceedings, that it contains significant errors, and that the facial recognition technology carries a higher risk of false matches for children.

As you know, Argentina’s Justice and Human Rights Ministry maintains a national database of individuals with outstanding arrest warrants for serious crimes, known as the Consulta Nacional de Rebeldías y Capturas (CONARC; National Reference of Fugitives and Arrests).[1] This database, which the ministry makes publicly available online, also includes the private personal information of children.

Analysis by Human Rights Watch reveals that at least 166 children were listed in CONARC between May 2017 and May 2020.[2] Even children suspected of minor crimes are included. The most common crime these children are accused of is theft (63 children, or 37.5 percent). Persistent errors and obvious discrepancies in CONARC, which is updated every morning at 7 a.m., indicate that the system lacks basic safeguards to minimize data entry errors, which can have serious consequences for a child’s reputation and safety.

Under international human rights law and standards, states should guarantee that the privacy of any child alleged to have committed a crime is fully respected at all stages of the proceedings.[3] No information should be published that may lead to the identification of the child.[4] On May 17, 2019, the United Nations Special Rapporteur on the right to privacy warned Argentina that its use of CONARC was violating children’s rights.[5] By publicizing the arrest records of children, the government is putting these children at risk of harm.

These harms have been amplified in the City of Buenos Aires. Since April 24, 2019, the city government of Buenos Aires has used the CONARC data—including the data on children—in its facial recognition system, the Sistema de Reconocimiento Facial de Prófugos (SRFP; Facial Recognition System for Fugitives).[6] The technology scrutinizes live video feeds of people boarding subway trains or walking through or near subway stations, and identifies possible matches with the identities in CONARC. Because CONARC does not include photos of those charged with a crime, the SRFP pulls reference photos from the country’s population registry.[7]

In an official response to an information request by the Observatorio de Derecho Informático Argentino (Argentine Computer Law Observatory), a civil rights organization, the city of Buenos Aires’s Ministry of Justice and Security denied that the facial recognition system identifies children, as “CONARC does not contain the data of minors.”[8] As we established above, this is incorrect.

We are further concerned by the absence of public or legislative debate around the necessity and proportionality of the facial recognition system, especially considering the technology’s adverse impacts on children.[9] Facial recognition has considerably higher error rates for children, in part because most algorithms have been trained, tested, and tuned only on adult faces.[10]

This is the case with the facial recognition technology used by the city of Buenos Aires, which was contracted to Buenos Aires-based company Danaide S.A. and developed by Russian company NtechLab.[11]

In tests conducted in a controlled lab setting, using photos posed under ideal lighting conditions, the US Department of Commerce’s National Institute of Standards and Technology (NIST), which evaluates the accuracy of facial recognition algorithms worldwide, found that the three algorithms that NtechLab submitted for testing produced a higher rate of mistaken matches among children than adults.[12]

Based on NIST’s evaluation results, Human Rights Watch calculates that, in a controlled setting, these algorithms falsely identify a child aged 10 to 16 six times more often than an adult aged 24 to 40. The younger the child, the more pronounced the errors.

Since children experience rapid and drastic changes in their facial features as they age, facial recognition algorithms also often fail to identify a child who is a year or two older than in a reference photo. Because the facial recognition system matches live video with identity card photos collected by the country’s population registry, which are not guaranteed to be recent, it may be making comparisons with outdated images of children, further increasing the error rate.[13]

According to an official response to an information request by the Asociación por los Derechos Civiles (Civil Rights Association), another Argentine organization, the SRFP was tested prior to its deployment only on the adult faces of employees from the city’s police department and the Justice and Security Ministry.[14] The city government did not request that Danaide S.A., its vendor, perform tests or checks to minimize bias against children.[15] Danaide S.A. and NtechLab did not respond to multiple requests for comment.

Moreover, error rates substantially increase when facial recognition is deployed in public spaces where the images captured on video surveillance cameras are natural, blurred, and unposed. Deploying this technology in the city of Buenos Aires’s subway system, with a daily ridership of over 1.4 million people and countless more passing through or near its stations, will result in people being wrongly flagged as suspects for arrest.[16] Buenos Aires is currently deploying the system on a small scale, for budgetary reasons. The European Union has estimated that when using live facial recognition in places visited by millions of people, like subway systems, even a relatively small error rate like 0.01 percent may result in hundreds of people being incorrectly identified once the system is scaled up to its full capacity. In public statements, the Buenos Aires government has spoken of error rates of 3 percent or higher.[17]
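The scale implied by those figures can be checked with back-of-the-envelope arithmetic. The sketch below uses only the ridership and error-rate figures cited above; the assumption that each daily rider is scanned once is ours, for illustration only.

```python
# Illustrative arithmetic only, using the figures cited above.
# Assumption (ours): each of the ~1.4 million daily riders is scanned once.
daily_riders = 1_400_000  # daily subway ridership cited above

# The EU's illustrative error rate of 0.01 percent is 1 in 10,000.
wrongly_flagged_per_day = daily_riders // 10_000
print(wrongly_flagged_per_day)  # 140 people per day
```

Even under this conservative assumption, wrong identifications accumulate into the hundreds within days; the 3 percent figure cited in public statements would multiply that several hundredfold.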

When using a new technology such as live facial recognition, it is necessary to assess the risks of wrong identification and the fundamental rights affected. For example, the Berlin police carried out a large facial recognition trial at a train station in 2017 and 2018, which reviewed live video from CCTV cameras and identified people against a watchlist. On average, the tests had a false positive identification rate of 0.34 percent; this meant that, for every 1,000 people passing the surveillance cameras, between three and four would be wrongly identified as matches. The German authorities deemed this unacceptable; given “the number of people crossing train stations every day, this would lead to a large number of people incorrectly stopped (or at least flagged to the police).”[18]
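The Berlin figure follows from the same simple arithmetic; this minimal sketch applies the trial’s reported rate to the 1,000-person unit used in the report.

```python
# Applying the Berlin trial's reported false positive rate of 0.34 percent
# to a group of 1,000 people passing the cameras.
rate_percent = 0.34
crowd = 1_000

expected_false_matches = round(rate_percent / 100 * crowd, 1)
print(expected_false_matches)  # 3.4, i.e. between three and four people
```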

Official documents obtained by the Observatorio de Derecho Informático Argentino (Argentine Computer Law Observatory) reveal that the city of Buenos Aires police are stopping and detaining people solely on the basis of the automated alerts generated by the SRFP.[19] Adults have been mistakenly detained and arrested.[20]

Human Rights Watch has written a letter to President Alberto Fernández to request that the Ministry of Justice and Human Rights immediately remove all children (all individuals under 18) from the publicly available version of CONARC.

In turn, we respectfully urge the City Government of Buenos Aires to immediately suspend the SRFP, conduct a privacy and human rights impact assessment, and publish verifiable statistics on the system’s performance to date. The City Government should also invite meaningful engagement with civil society—in particular those with expertise on the right to privacy, the societal and ethical implications of this technology, and children’s rights—in assessing the necessity, proportionality, and legality of the use of live facial recognition surveillance, with special consideration given to the implications for the rights of children.

I take this opportunity to express to you my highest consideration and esteem.

José Miguel Vivanco
Human Rights Watch

[1] Resolución 1068 - E/2016, approved November 10, 2016, http://www.vocesporlajusticia.gob.ar/wp-content/uploads/2016/11/res10682016mj.pdf (accessed May 22, 2020).

[2] Human Rights Watch reviewed 28 versions of CONARC, archived by the Internet Archive’s Wayback Machine and spanning a three-year period between May 2017 and May 2020.

[3] Convention on the Rights of the Child, September 2, 1990, ratified by Argentina December 4, 1990, art. 16 & 40(2)(vii); Inter-American Court of Human Rights, Juvenile Re‐education Institute Case, Preliminary Objections, Merits, Reparations, Costs, Judgment of September 2, 2004, (Ser. C) No. 112 (2004), para. 211; Inter-American Court of Human Rights, Juridical Condition and Human Rights of the Child. Advisory Opinion OC‐17/02 of August 28, 2002, (Ser. A) No. 17 (2002), para. 134; Inter-American Court of Human Rights, Written and oral interventions related to Advisory Opinion OC‐17/02. In I/A Court H.R., Juridical Condition and Human Rights of the Child, Advisory Opinion OC‐17/02 of August 28, 2002, (Ser. A) No. 17 (2002), p. 25; Secretaría de Jurisprudencia de la Corte Suprema de Justicia de la Nación: Interés superior del niño, 2013, https://sj.csjn.gov.ar/sj/suplementos.do?method=ver&data=intsupn (accessed May 22, 2020).

[4] United Nations Standard Minimum Rules for the Administration of Juvenile Justice (“The Beijing Rules”), adopted by General Assembly resolution 40/33, November 29, 1985, https://www.ohchr.org/Documents/ProfessionalInterest/beijingrules.pdf (accessed May 22, 2020), art. 8.

[5] Statement to the media by the United Nations Special Rapporteur on the right to privacy, on the conclusion of his official visit to Argentina, May 6-17 2019, https://www.ohchr.org/en/NewsEvents/Pages/DisplayNews.aspx?NewsID=24639&LangID=E (accessed May 22, 2020).

[6] Resolución 398/MJYSGC/19, approved April 24, 2019, https://documentosboletinoficial.buenosaires.gob.ar/publico/ck_PE-RES-MJYSGC-MJYSGC-398-19-5604.pdf (accessed May 27, 2020).

[7] Separata del Boletín Oficial de la Ciudad de Buenos Aires, ANEXO - RESOLUCIÓN N° 398/MJYSGC/19, April 25, 2019, https://documentosboletinoficial.buenosaires.gob.ar/publico/PE-RES-MJYSGC-MJYSGC-398-19-ANX.pdf (accessed June 2, 2020).

[8] Director General Fornos Carlos Tristan, Ministry of Justice and Security, Government of Buenos Aires, response NO-2019-33745359-GCABA-DGEYTI to the request for information made by Observatorio de Derecho Informático Argentino, ODIA, October 30, 2019, https://drive.google.com/file/d/1i4zBxv4ahP_xeRKwUQI4VOx7wpFQOVVc/view (accessed May 27, 2020).

[9] The municipal government accelerated passage of the legislation that enabled the adoption of the SRFP, passing it as a resolution rather than a law and bypassing public debate. See: Resolución 398/MJYSGC/19, approved April 24, 2019, https://documentosboletinoficial.buenosaires.gob.ar/publico/ck_PE-RES-MJYSGC-MJYSGC-398-19-5604.pdf (accessed June 12, 2020) and Dave Gershgorn, “The U.S. Fears Live Facial Recognition. In Buenos Aires, It’s a Fact of Life,” OneZero, March 4, 2020, https://onezero.medium.com/the-u-s-fears-live-facial-recognition-in-buenos-aires-its-a-fact-of-life-52019eff454d (accessed June 2, 2020).

[10] Patrick Grother, Mei Ngan and Kayee Hanoaka, “National Institute of Standards and Technology NISTIR 8280: Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects,” December 2019, https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf (accessed May 22, 2020).

[11] The Government of Buenos Aires awarded a direct contract to Buenos Aires-based company Danaide S.A., which developed Buenos Aires’ facial recognition system based on its software product, Ultra IP. The facial recognition component of this product is reported as having been developed by Russian company NtechLab. See: the direct procurement document by the City of Buenos Aires, Purchase No. 2900-0472-CDI19: https://www.buenosairescompras.gob.ar/PLIEGO/VistaPreviaPliegoCiudadano.aspx?qs=BQoBkoMoEhyvzUss83|5qmQHYdlWCoEzPIKU0JAvRZ7kltC74K|7Tw11ctBR9dfFZZZemaLoi969Lwy2BFPNowVGFQ7XOHCTEKW51rAObRlXsdfYAs0SFw== (accessed June 2, 2020); Director General Fornos Carlos Tristan, Ministry of Justice and Security, Government of Buenos Aires, response NO-2019-21065074-GCABA-DGAYCSE to the request for information from Asociación por los Derechos Civiles, July 2, 2019, https://adc.org.ar/wp-content/uploads/2019/07/Respuesta-PAIP-reconocimiento-facial-GCBA-V2.pdf (accessed May 22, 2020); NtechLab has publicly acknowledged its partnership with Danaide S.A. by listing Ultra IP on its partners list published on NTechLab’s website; see: https://web.archive.org/web/20200511205745/https:/findface.pro/partners/ (accessed June 2, 2020); and Dave Gershgorn, “The U.S. Fears Live Facial Recognition. In Buenos Aires, It’s a Fact of Life,” OneZero, March 4, 2020, https://onezero.medium.com/the-u-s-fears-live-facial-recognition-in-buenos-aires-its-a-fact-of-life-52019eff454d (accessed June 2, 2020).

[12] National Institute of Standards and Technology, “NISTIR 8280: Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects, Annex 10: Cross age false match rates with visa photos,” December 12, 2019, https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf (accessed May 22, 2020).

[13] The Documento Nacional de Identidad (DNI, or the national identity document) is created at birth. It is legally mandatory to be updated twice: between the ages of 5 and 8 years old, and again when a person is 14 years old. See: Government of Argentina, Ministry of Interior, “DNI for Argentine Residents in the Country,” https://www.argentina.gob.ar/interior/dni/argentinos-residentes-en-el-pais (accessed May 22, 2020).

[14] Director General Fornos Carlos Tristan, Ministry of Justice and Security, Government of Buenos Aires, response NO-2019-21065074-GCABA-DGAYCSE to the request for information from Asociación por los Derechos Civiles, July 2, 2019, https://adc.org.ar/wp-content/uploads/2019/07/Respuesta-PAIP-reconocimiento-facial-GCBA-V2.pdf (accessed May 22, 2020).

[15] Government of Buenos Aires, “Pliego de Bases y Condiciones Particulares, Contratación Directa de Un Servicio de Análisis Integral de Video,” March 4, 2019, accessed through the direct procurement document page, under Cláusulas particulares, No. PLIEG-2019-10400885- -SSGA, available at: https://www.buenosairescompras.gob.ar/PLIEGO/VistaPreviaPliegoCiudadano.aspx?qs=BQoBkoMoEhyvzUss83|5qmQHYdlWCoEzPIKU0JAvRZ7kltC74K|7Tw11ctBR9dfFZZZemaLoi969Lwy2BFPNowVGFQ7XOHCTEKW51rAObRlXsdfYAs0SFw== (accessed May 22, 2020).

[16] Government of Buenos Aires, “La Red de Expresos Regionales conectará los ferrocarriles urbanos debajo del Obelisco,” July 6, 2015, https://www.buenosaires.gob.ar/noticias/red-de-expresos-regionales-en-detalle (accessed May 8, 2020).

[17] The SRFP is currently deployed for simultaneous use on 300 CCTV cameras at a time, though the total number of cameras in the wider CCTV surveillance network, the Red Integral de Monitoreo, has been reported to be almost 13,000. The European Union has calculated estimates for uses of this technology in similar contexts; “when applying the technology in places visited by millions of people–such as train stations or airports–a relatively small proportion of errors (e.g. 0.01 percent) still means that hundreds of people are wrongly flagged.” See “Facial Recognition Technology: Fundamental Rights Implications In the Context of Law Enforcement,” European Union Agency for Fundamental Rights, November 27, 2019, https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper.pdf (accessed May 8, 2020), pp. 12, 22. See also “Cámaras de reconocimiento facial: Larreta prometió 10.000 más,” Página 12, October 4, 2019, https://www.pagina12.com.ar/223372-camaras-de-reconocimiento-facial-larreta-prometio-10-000-mas (accessed May 8, 2020); Alejandra Hayon, “Seis días arrestado por un error del sistema de reconocimiento facial,” Página 12, August 3, 2019, https://www.pagina12.com.ar/209910-seis-dias-arrestado-por-un-error-del-sistema-de-reconocimien (accessed May 8, 2020); Director General Fornos Carlos Tristan, Ministry of Justice and Security, Government of Buenos Aires, response NO-2019-21065074-GCABA-DGAYCSE to the request for information from Asociación por los Derechos Civiles, July 2, 2019, https://adc.org.ar/wp-content/uploads/2019/07/Respuesta-PAIP-reconocimiento-facial-GCBA-V2.pdf (accessed May 22, 2020).

[18] “Facial Recognition Technology: Fundamental Rights Implications In the Context of Law Enforcement,” European Union Agency for Fundamental Rights, November 27, 2019, https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper.pdf (accessed May 8, 2020), pp. 12, 22.

[19] Director General Fornos Carlos Tristan, Ministry of Justice and Security, Government of Buenos Aires, response NO-2019-33745359-GCABA-DGEYTI to the request for information made by the Observatorio de Derecho Informático Argentino (ODIA), October 30, 2019, https://drive.google.com/file/d/1i4zBxv4ahP_xeRKwUQI4VOx7wpFQOVVc/view (accessed May 27, 2020).

[20] See: Alejandra Hayon, “Seis días arrestado por un error del sistema de reconocimiento facial,” Página 12, August 3, 2019, https://www.pagina12.com.ar/209910-seis-dias-arrestado-por-un-error-del-sistema-de-reconocimien (accessed May 8, 2020); “Cámaras de reconocimiento facial: Larreta prometió 10.000 más,” Página 12, October 4, 2019, https://www.pagina12.com.ar/223372-camaras-de-reconocimiento-facial-larreta-prometio-10-000-mas (accessed May 8, 2020); “De un DNI mal cargado a una cara parecida: las víctimas del sistema de reconocimiento facial en Buenos Aires,” TN, July 31, 2019, https://tn.com.ar/policiales/de-un-dni-mal-cargado-una-cara-parecida-las-victimas-del-sistema-de-reconocimiento-facial-en-buenos_980528 (accessed May 8, 2020).
