Associated Incidents

It was late afternoon. After leaving work, administrative assistant Davi was waiting for the last ride of the day with a friend at a bus stop on Avenida Paralela, in Salvador, when he was approached by a military police patrol. “I handed over my ID, and he took out his cell phone and kept comparing, looking at my face and looking at the ID,” he recalls. The officers then informed him that, as he passed through the Lapa subway station, facial recognition cameras had found similarities between his face and that of a person wanted by the courts whose data is registered in the database of the Secretariat of Public Security. Once detected by the system, Davi was monitored across 15 stations, from Lapa to Mussurunga, until the police approached him. Davi was innocent – but not to the cameras that followed him along the 22-kilometer route. He was released after being identified by the officers, who found that he was not the person the cameras and artificial intelligence had pointed to. But he entered the statistics as yet another young black man misidentified by the facial recognition technologies adopted by the police. The hit rate is low: at the Micareta de Feira de Santana in 2019, for example, only 3.6% of the 903 alerts generated resulted in [arrests](http://www.ssp.ba.gov.br/2019/04/5613/Facial-Recognition-results-in-33-people-prisons.html). Despite this, the government of Bahia continues to treat facial recognition [as a showcase of its public security policies](https://www.bnews.com.br/noticias/politica/315549,rui-costa-anuncia-investimento-de-r-900-million-in-technology-on-public-security.html). In two and a half years, 215 wanted individuals were captured using the technology.
In July of this year, Governor Rui Costa, of the PT, decided to [expand the system](http://www.bahia.ba.gov.br/2021/07/area-de-imprensa/rui-assina-ordem-de-servico-para-amplir-recognition-facial-e-de-plates-na-bahia/) and entered into a R$ 665 million partnership with the conglomerate Oi and with Avantia, a company specializing in security technologies. Thus, in addition to Salvador, 77 other cities in the state [will gain 4,095 connected cameras](http://www.ssp.ba.gov.br/2021/07/10138/Governador-autoriza-expansao-de-tecnologia-a-mais-77-cidades-baianas.html). “Before, identification was done by the police, visually. Now, the system itself identifies criminals, suspects, weapons and license plates,” declared the governor, not hiding his enthusiasm for the technology. What's more, the Bahian government also wants the private sector to join the surveillance and facial recognition system. Bank branches, shopping malls and condominiums, for example, could connect their cameras and hand over to the authorities the movements and faces of everyone who passes by. “[With this,] we can multiply the eyes of public security in Bahia,” [celebrated Costa](https://bahia.ba/bahia/rui-quer-iniciativa-privada-acessando-sistema-de-vigilancia-to-fight-crime-in-bahia/). The capital, Salvador, was chosen to host the pilot project of the new system, supplied by Spain's Iecisa in partnership with Huawei for R$ 18 million. In Brazil, Huawei was also responsible for supplying the technology used in Rio de Janeiro. The service contract that set out the details of the system purchased by the Bahian government specifies that it must recognize people even if they are wearing glasses or have grown a beard – even if earlier records show them without facial hair or accessories. It must also group photos of the same face “for later consultation.” The images must be filed with date, time and location, and must be searchable by name, by date or even from an image of the face itself.
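The contract's requirements – grouping photos of the same face and filing images by date, time and location, searchable by name or date – amount to a queryable index of sightings. Below is a minimal sketch of that behavior in Python, assuming identities have already been resolved to names; the class and method names are hypothetical, not the actual system's:

```python
from collections import defaultdict
from datetime import datetime, date

class FaceLog:
    """Toy index of camera sightings: each entry is filed with date,
    time and location, and can be searched by name or by date."""

    def __init__(self):
        # Grouping "photos of the same face" reduces, here, to keying
        # all sightings under one resolved identity.
        self._by_name = defaultdict(list)

    def record_sighting(self, name: str, when: datetime, location: str) -> None:
        self._by_name[name].append((when, location))

    def search_by_name(self, name: str) -> list:
        """Everywhere a given person has been seen, in time order."""
        return sorted(self._by_name.get(name, []))

    def search_by_date(self, day: date) -> list:
        """Everyone seen on a given day, in time order."""
        hits = [(name, when, loc)
                for name, sightings in self._by_name.items()
                for when, loc in sightings
                if when.date() == day]
        return sorted(hits, key=lambda h: h[1])
```

Searching “from an image of the face itself” would add a face-similarity lookup in front of this index; the storage and retrieval pattern would stay the same.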
Thus, in a few clicks – at least in theory – Bahia's security forces would be able to know not only who passed through a location, but also where that person had been before. The system uses artificial intelligence to compare the faces captured by the cameras with the images in the wanted-persons database of the Public Security Secretariat, the SSP, which is fed by the Intelligence Superintendence. “If the person has any restriction, such as an arrest warrant or a missing-person report, the tool sends an alert to the Integrated Telecommunications Center, which dispatches the police team closest to the location to carry out an approach,” the SSP explained in response to a request made via the Access to Information Law, the LAI. But in a flattering report by Fantástico, Bahian authorities claimed to go beyond official databases. “I just arrived in Bahia – am I already identified by your system?” reporter Murilo Salviano asked Colonel Marcos Oliveira. “Absolutely,” he assured him. In the demonstration, Salviano's images are compared on the monitor with photos of him posted on social networks. To the Intercept, the colonel confirmed that the police use “public images from social networks” to investigate a crime. The Security Secretariat stated that the report only mentions the “possibility of using photos extracted from social media, such as Facebook, which are openly accessible.” Through its press office, it assured that the resource is “so far” only used to locate missing persons. When the algorithm identifies a 90% similarity between the detected face and a suspect in the database, an alert is issued and there is a “human analysis,” says the SSP. If the analysts confirm the similarity, a patrol close to the location is instructed to carry out the approach. On the streets, police officers also use an application called the Mobility System in Police Operations to collect information on citizens.
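The workflow the SSP describes – compare each detected face against the wanted database, raise an alert at 90% similarity, then hand off to a human analyst – can be sketched as follows. This is an illustrative reconstruction, not the SSP's actual code: the face embeddings, the cosine-similarity metric and the record structure are all assumptions.

```python
from dataclasses import dataclass
from math import sqrt

# Per the SSP's description, 90% similarity triggers an alert.
SIMILARITY_THRESHOLD = 0.90

@dataclass
class WantedRecord:
    name: str
    embedding: list  # hypothetical face embedding from the wanted database

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def check_face(detected, database):
    """Return records whose similarity crosses the threshold.
    Each hit is only an *alert*: a human analyst at the Integrated
    Telecommunications Center must confirm it before a patrol is sent."""
    alerts = []
    for record in database:
        score = cosine_similarity(detected, record.embedding)
        if score >= SIMILARITY_THRESHOLD:
            alerts.append((record.name, round(score, 3)))
    return alerts
```

The crucial design point, and the one Davi's case exposes, is that everything after the threshold depends on the human review actually rejecting false matches rather than rubber-stamping them.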
The MOP, as it is known, is installed directly on the agents' cell phones and allows them to consult data on drivers, vehicles, criminal records and police reports. It can even be used on private cell phones. The MOP has access to vehicle data, fingerprints and criminal records, as well as demographic, gender and location data – including those of children and adolescents, as long as the purpose is “public safety,” according to its privacy policy. Elected on a platform strongly tied to fighting violence, Costa made technological surveillance devices one of his campaign promises. The facial recognition cameras, which began to be installed in December 2018, were touted as efficient crime-fighting tools. The arrest of a [fugitive](https://noticias.uol.com.br/cotidiano/ultimas-noticias/2019/03/05/cameras-de-reconhecimento-facial-acham-criminoso-no-carnaval-de-salvador.htm) during that year's Carnival helped sway public opinion in favor of the investment. The Bahian government's new spending on technology for public security exceeds R$ 900 million, advertised as the “biggest investment in public security in Bahia's history.” But despite the government's showy announcements, state spending on public security has been falling year after year. Bahia is one of the Brazilian states with the lowest spending per inhabitant in this sector: R$ 289, [less than half](https://forumseguranca.org.br/wp-content/uploads/2021/07/anuario-2021-complete-v6-bx.pdf) that of, for example, Mato Grosso, Minas Gerais or Tocantins. In addition, data on violence in the state call the government's bet into question. In 2019 and 2020, Bahia was the federative unit that registered [the highest number of violent deaths](https://g1.globo.com/ba/bahia/noticia/2021/02/12/monitor-da-violencia-bahia-registers-highest-number-of-violent-deaths-for-the-second-consecutive-year.ghtml) in the country.
In the first half of 2021, [this index grew](https://g1.globo.com/ba/bahia/noticia/2021/08/20/monitor-da-violencia-assassinatos-aumentam-71percent-no-primeiro-semester-of-2021-na-ba.ghtml) another 7.1%. Since the exact locations of the facial recognition cameras are not disclosed, it is not possible to cross-reference the data to assess their impact on violence rates. But data from regions where ordinary cameras have been installed may offer some clues. In the Lobato region, with 10 cameras, crimes against property increased by 144% between 2012 – the year the monitoring system was launched in the capital – and 2019. In the Cajazeiras region, with 22 cameras, the increase was 71.3%, and in the Sussuarana region, with nine cameras, it was 50.8%. A reduction did occur in Barra, an upscale neighborhood with 34 cameras, where the rate dropped 84% over the period. But not in the Liberdade and Cidade Nova region: despite the 34 cameras installed there, the index rose 12.7% in the period analyzed. The Secretariat of Public Security did not comment on the numbers. For the government, the use of facial recognition technologies is “healthy for fighting crime” and “an aggregating instrument in the crime prevention mechanism when used in conjunction with efficient policing processes and practices.”

### When the robot makes a mistake

In a country whose [criminal-justice selectivity notoriously targets black and poor people](https://apublica.org/2019/05/negros-sao-mais-condenados-por-trafico-e-com-menos-drogas-em-sao-paulo/) and in a state where [97% of the victims of police violence are black](https://atarde.uol.com.br/bahia/noticias/2149421-bahia-97-of-people-killed-by-the-police-are-black-points-report), it is not difficult to imagine who the main targets of police operations driven by this technology-fueled suspect hunt will be.
A survey by the Network of Security Observatories in five states has already shown that [90.5% of those arrested with the help of facial recognition in Brazil were black](https://theintercept.com/2019/11/21/presos-monitoramento-facial-brazil-blacks/). Algorithms reproduce society's racist biases, and their large-scale use is especially worrisome in Bahia – the state with [the highest percentage of black people in Brazil](https://g1.globo.com/ba/bahia/noticia/2019/05/22/one-in-5-people-in-bahia-declares-themselves-black-points-ibge.ghtml). A report produced in 2019 by the Bahia Public Defender's Office showed that 98.8% of those arrested in flagrante in Salvador are black. If penal selectivity already exists in the analog world, the use of technology can make the scenario worse. Besides amplifying the racial bias already present in the security forces, these systems are also prone to errors – as in Davi's case. Several [studies](https://ieeexplore.ieee.org/document/6327355/citations?tabFilter=papers#citations) have shown that [black and Asian people](https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf) are the most frequently misidentified by facial recognition systems – especially black women. Joy Buolamwini, a computer scientist and researcher at the Massachusetts Institute of Technology who is black, identified the difficulty such systems have in recognizing black faces when she stood in front of a camera and her face was not immediately detected by the artificial intelligence. It was only detected when she held a white mask in front of her face. These mistakes are already unacceptable for commercial purposes – but, used for public safety, they have devastating potential.
“The black population already suffers daily from the stereotype of the criminal, from microaggressions involving excessive surveillance in commercial establishments, whose intent is easily denied, to cases of undue and unjust arrests,” wrote Rosane Leal da Silva and Fernanda dos Santos Rodrigues da Silva, law researchers at the Federal University of Santa Maria, in an academic article published in 2019. “With a technology in which the algorithm itself will fulfill this role of mistakenly indicating black people as potential suspects of a crime, they will once again be ‘subject to the automation of constraints and violence, such as improper police approaches and the untrue attribution of criminal records.’” According to Luciano Oliveira, who holds a PhD in Electrical and Computer Engineering from the University of Coimbra, Portugal, and specializes in computer vision, the phase of comparing the images with the database is the biggest challenge for a system of this type. “If you train the algorithm with more white people's faces, it may have more trouble finding black people,” Oliveira explains. In the case of Bahia, since most of the suspect database is made up of black people, the algorithm needs to be trained precisely on this population in order not to make mistakes. “If it isn't trained that way, it won't find any faces, or will find only a few,” says the researcher. And since the majority of Salvador's population is black, the greatest number of errors is likely to fall on this portion of the population. This is where another issue comes in: the suspect databases used in the comparison are themselves problematic and racially biased. “We need to question what guarantees we have that this database of warrants contains correct data,” Pedro Diogo, a lawyer who researches technologies of surveillance and state racial terror at the Federal University of Bahia, told me.
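Oliveira's point about training data can be illustrated numerically. In the toy example below, a group well represented in training gets well-separated face embeddings, while an under-represented group's embeddings collapse into a tight cluster – so, at the same 90% threshold the SSP describes, distinct people in the second group are confused with one another far more often. The vectors are invented purely for illustration:

```python
from math import sqrt

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b)))

def false_match_rate(embeddings, threshold=0.90):
    """Fraction of distinct-person pairs the system would confuse
    with each other at the given similarity threshold."""
    pairs = [(i, j) for i in range(len(embeddings))
             for j in range(i + 1, len(embeddings))]
    confused = sum(1 for i, j in pairs
                   if cosine(embeddings[i], embeddings[j]) >= threshold)
    return confused / len(pairs)

# Hypothetical embeddings: a well-represented group maps to
# well-separated vectors...
well_separated = [[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0]]
# ...while an under-represented group collapses into a tight cluster,
# because the model never learned features that tell its members apart.
collapsed = [[1.0, 0.1], [1.0, 0.15], [0.95, 0.1]]
```

Here `false_match_rate(well_separated)` is 0.0 while `false_match_rate(collapsed)` is 1.0: every pair of distinct people in the collapsed group would fire a false alert at the 90% threshold.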
According to him, a large number of arrest warrants remain open despite no longer being valid – errors in names, for example, are common. So there is a real chance that someone will be identified because of a mistake in the Public Security Secretariat's own database. “One of the biggest risks of installing these systems is the way they make it possible to combine problems specific to these technologies, such as racial bias, with the traditional problems of the Brazilian penal system,” says Diogo. According to him, erroneous identifications by the system, such as the one that happened to Davi, are not even counted. If the military police identify the system's error, the operation is terminated on the spot, without any record of the occurrence. “Police institutions in this country were organized from the beginning to persecute black people, both enslaved and freed, in the service of expropriating labor, capital, land and production for the accumulation of property in a eugenicist project,” says researcher Tarcizio Silva, who studies so-called algorithmic racism and works to promote digital security and defend against algorithmic harm at the Mozilla Foundation. Because the algorithms are programmed within a racist structure, the tendency is for them to reproduce, unchecked, the bias of treating every black person as a potential criminal. “The police forces that use this system carry out massacres, murders and disappearances, either officially or through militias and death squads. And now the facial recognition system arrives to be installed and to expand the state's capacity to inflict terror on the black population of this country,” says Diogo. Although he was released shortly after the police approach, the scars remain in Davi's life. “I was really scared, because I've been taking the subway for a long time and I always take the same route. The cameras had always filmed me, and only on that day did they confuse me,” he told me.
The police stop is always lurking – now powered by cameras and police intelligence.