When technological advances criminalize by race: In Brazil, 90% of people arrested based on facial recognition cameras are black


Note from BW of Brazil: Let’s face it, we all love technological advances and are in one way or another addicted to them. The evidence is all around us. Look how much time people spend on this or that social media platform. Look at all of the addictive trappings of the apps we use on our cell phones, or the hours on end that our children spend virtually killing people on video game screens. On New Year’s Day, as I took in yet another serving of New Year’s Eve’s bagre (catfish) and bacalhau (codfish) soufflé, I looked at the six teenagers in the den and saw all of them gazing at their cell phones, enjoying themselves in their own separate worlds, even as they sat less than an arm’s length from each other, all of them cousins. Technology has definitely conquered us, and most of us don’t even know it.

I have to admit, I frequently think of totally “unplugging” in the Matrix sense of the word and deleting all of my social networks. I don’t spend nearly as much time on them as other people I know. I often wonder, how do people do it? I’ve been excluded from some social networking groups because of my lack of activity, but I simply don’t have the time or the interest to always participate in this madness. There always seems to be a new social network that is better than the previous one, and with each of them you must create profiles and passwords and volunteer a certain amount of your information.

The obvious spying and sharing of information are two more reasons why I sometimes consider unplugging. Three examples will suffice. Several months ago, one of my e-mail accounts that shall remain nameless kept insisting that I volunteer my phone number for “security” purposes. My view was that the people behind this electronic mail service already have enough of my information as it is, so why do they need my phone number too? But then one day, using a public computer, this e-mail company wouldn’t allow me to log in without proving that I was the owner of my own e-mail account (I know, I know, I am not the actual owner of my e-mail, but that’s yet another issue).

Notice the creepiness of them even knowing that I wasn’t on my regular home computer. That particular day, I didn’t have the urgency of needing to log in to that particular e-mail account, so I didn’t give up my phone number. But several months later, I really needed access to this e-mail account, once again on a public computer, so I ended up entering my phone number so that they could send me a security code via phone text message.

In another example, a social network requested my phone number, again, which I refused to volunteer, but said network ended up purchasing another network that I use and eventually, they showed me my own phone number onscreen when I hadn’t even shared it. In the third example, last week, I was checking prices of a cheap new computer through Google’s search engine. Why was it that, about an hour later, in an unrelated social network, suddenly some of the very computers that popped up in my Google search were now prominently featured in the ads in the social network? Privacy? Gone.

But every new gadget and app that “they” introduce will have some feature that we find necessary, and, without even reading the terms and conditions of using this app or device, we sign away another facet of our freedom (which we in fact don’t really have anyway) and/or privacy. I hope you can see where I’m going with this. In a city like São Paulo, police don’t even need to pursue you to deliver any of the absurdly pricey traffic violations they hand out. Why? Because on major roads and highways, there are security cams everywhere. So, when you’re driving a few kilometers over the speed limit, you can feel assured that a cop isn’t gonna pull you over and issue that ticket, so, you can breathe easy, right? Well, maybe for a week or two until that ticket with a photo of your license plate demonstrating your violation arrives in the mail.

Don’t believe you live on a prison planet yet? Well, just wait a little while longer. It’ll hit you sooner or later. Which leads me to today’s piece on facial recognition. As if we don’t have enough to deal with for the “sin” of being black. Now there’s this.


When technological advances criminalize by race: reinforcing racist logic, facial recognition used as a weapon of public “security” threatens black Brazilians

In Brazil, 37 cities in 16 states already use facial recognition in some way

March of 2019. Carnaval in Salvador, Bahia. While enjoying the festivities of King Momo at the Barra-Ondina circuit, Marcos Vinicius de Jesus Neri, 19, was arrested by the Bahia Military Police. The arrest, celebrated by the state government, was the first in Brazil made via facial recognition technology.

Marcos Vinicius de Jesus Neri, 19, dressed as a woman during Carnaval, was arrested after facial recognition

Two months later, at an event in China where the results of the facial recognition policy implemented by his administration were presented, the governor of Bahia, Rui Costa (PT), said the experience had been a success, that he was “very happy with the initial result” and that the objective was “to advance and provide more security to Bahians”.

In addition to Rui Costa’s well-known vision of public safety – his commemoration of the Cabula bloodbath – a critical look is needed at the effectiveness of technologies such as facial recognition to ensure security for the population.

An important voice in this regard is that of Pablo Nunes, research coordinator of the Network of Security Observatories. Nunes presents a series of observations that call the use of this type of technology into question: “The body part used in biometrics, whether fingerprint or face, is never fully analyzed. This means that some points on the face or finger are chosen and, based on the distances between these points, the probability that the fingerprint or face belongs to the person registered in the database is calculated. In the case of the human face, the possibilities of differences or changes in these distances are much greater than with a fingerprint, since a person ages, may be yawning, blinking”, he highlights.
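Nunes’ description of point-distance biometrics can be sketched in a few lines of code. The snippet below is a deliberately toy illustration, not any real biometric system: a face is reduced to a handful of invented landmark coordinates, matching compares the distances between those points, and a small change such as an open mouth already shifts the score, which is exactly the fragility Nunes points to.

```python
import math

# Toy illustration of point-distance biometrics; real systems are far
# more sophisticated. The landmark coordinates below are invented values.
def pairwise_distances(points):
    """Distances between every pair of (x, y) landmark points."""
    return [
        math.dist(points[i], points[j])
        for i in range(len(points))
        for j in range(i + 1, len(points))
    ]

def match_score(probe, template):
    """Mean absolute difference between two distance signatures.
    Lower means more similar; a threshold decides 'same person'."""
    d1, d2 = pairwise_distances(probe), pairwise_distances(template)
    return sum(abs(a - b) for a, b in zip(d1, d2)) / len(d1)

enrolled = [(0, 0), (6, 0), (3, 4), (3, 8)]   # eyes, nose, mouth (toy values)
yawning  = [(0, 0), (6, 0), (3, 4), (3, 9)]   # same face, mouth point moved

print(match_score(enrolled, enrolled))  # 0.0: identical signatures
print(match_score(yawning, enrolled))   # nonzero: same person, shifted score
```

Because the acceptance threshold must tolerate this kind of everyday variation, it inevitably admits false matches as well.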

The researcher’s observations can already be confirmed with this year’s numbers. The micareta* of Feira de Santana, Bahia, exposed some of the system’s blind spots. At that festivity, the video surveillance system captured the faces of over 1.3 million people, generating 903 alerts that resulted in 18 warrants and 15 arrests. This means that over 96% of all facial recognition alerts did not result in any security measure.
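The more-than-96% figure follows directly from the numbers cited; a quick check (assuming, as the article’s own percentage implies, that the 18 warrants and 15 arrests do not overlap):

```python
# Figures from the Feira de Santana micareta cited above.
faces_captured = 1_300_000
alerts = 903
warrants = 18
arrests = 15

acted_on = warrants + arrests    # alerts that led to any measure: 33
no_action = alerts - acted_on    # 870 alerts led to nothing
print(f"{no_action / alerts:.1%} of alerts led to no security measure")
# prints "96.3% of alerts led to no security measure"
```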

But beyond seeming like a waste of public resources and a useless allocation of time and personnel, the facial recognition policy represents something far graver: another racist action by the Brazilian state. In this regard, data from the Security Observatory Network draws attention:

Between March and October of this year, 151 people were arrested with the help of facial recognition technology in four states (Bahia, Rio de Janeiro, Santa Catarina and Paraíba). The average age of those arrested was 35.

Where there was information about race and color, or images of the people approached, the racial element becomes obvious. An unprecedented study by the Security Observatory Network – which brings together violence research centers at the University of São Paulo (USP), Cândido Mendes University (RJ), the Federal University of Ceará (UFC), Iniciativa Negra (Bahia) and the Cabinet of Legal Advice to Popular Organizations (Pernambuco) – found that 90% of the 151 people arrested based on facial recognition cameras are black. Bahia comes in first with 51% of the arrests, followed by Rio de Janeiro with 37.1%.

The main motivations for stops and arrests were drug trafficking and theft, at 24% each. Blacks are the majority of those imprisoned due to facial recognition. Facial recognition technology is the darling of the public security field and, therefore, a great threat to black people, especially those who live in the peripheral neighborhoods of Brazilian cities.

The federal government is committed to developing this system and created Ordinance No. 793 in October 2019, which allows the use of money from the National Public Security Fund to improve video surveillance.

“Financial incentive for the actions of the Axis Against Violent Crime, within the scope of the National Policy of Public Security and Social Defense and the Single System of Public Security, with resources from the National Public Security Fund”, says the text signed by Sérgio Moro and published in the Diário Oficial.

The package of measures to combat crime presented by Moro, the Minister of Justice and Public Security, pays special attention to technology, “fostering the implementation of technological solutions for intelligence, attendance and a unified record of occurrences, dispatch centers, georeferencing of vehicles, predictive policing, and body or vehicle cameras,” the ordinance explains.

The former judge intends to extend the retention of prisoners’ genetic profiles to up to 20 years after their sentences have been served. One detail: Moro collects the data even of provisional prisoners who have not yet been tried. The National Council of Justice (CNJ) shows that 41.5% of the more than 800,000 inmates are detained without a conviction.

It’s worth remembering that the majority of Brazil’s prison population is made up of black men and women. Afro-Brazilians account for roughly two-thirds of detainees, 64% of the total.

Data collected by law enforcement agencies are sufficient to raise concern about the use of facial recognition by police forces. Justice in Brazil has a side, as the stories of Preta Ferreira and Bárbara Querino – two black women arrested for crimes they did not commit – show. Both are on probation. For Querino, who was jailed for 1 year and 8 months, the system aims to exterminate blacks.

Bárbara Querino and Preta Ferreira

The numbers from the Network’s study and the popularization of this technology as public policy – 37 cities in 16 states already use this instrument in some way – leave no doubt: facial recognition is an update of the structural racism that guides state action and underpins criminal justice and public security mechanisms in Brazil, reinforcing the idea that blacks are always suspects.

In the early days of this year, Victor Mendes and Leonardo Nascimento were “misidentified” by security cameras – not specifically by facial recognition – in Rio de Janeiro and unjustly arrested. They are examples of the racist bias of denunciations based on security footage.

A human gaze at the video is also behind facial recognition technologies. That is, those who operate the robots and algorithms are human beings, who end up reproducing their prejudices and corroborating the security policy in place.

But the issue is even more complex and international in scope, as facial recognition is just one of many technologies marked by racist criteria.

According to New Scientist, the system used in UK public services, such as passport issuance, worked well at identifying white people but failed with black men and women.

In the United States, a survey by the American Civil Liberties Union found a high rate of error in police facial recognition technology, affecting black people to a greater extent. The controversy and debate surrounding the racist use of the tool have led some cities, such as San Francisco and Somerville, to ban the technology.

Facial recognition becomes a threat

In this regard, it is worth mentioning the Timeline of Algorithmic Racism, the result of the PhD research “Data, Algorithms and Racialization in Digital Platforms” by Tarcizio Silva, of the Universidade Federal do ABC, which demonstrates how the production chains of digital platforms (social media, applications and artificial intelligence) are constituted by racial biases.

Here are some cases that exemplify what Tarcizio qualifies as racial microaggressions online:

  • Google systems that allow companies to display crime-related ads specifically to African Americans;
  • Results on Google Images featuring hyper-sexualized content for searches like “meninas negras” (black girls);
  • Photos of young black people tagged as “gorillas” by Google Photos;
  • Conversational robots of startups that cannot find black women’s faces, and computer vision systems that err on the gender and age of black women;
  • Search mechanisms of image banks that make black families and people invisible;
  • Apps that transform selfies and equate beauty with brancura (whiteness);
  • Computer vision application programming interfaces that confuse black hair with wigs;
  • Natural language processing tools that are biased against black language and themes;
  • Facial analysis of emotions that associates negative categories with black athletes.

In October of this year, one more case gained resonance in Brazil, when searches on Google with the combination of words “mulher negra dando aula” (black woman teaching) led to pornographic content, aggravated by the fact that the same search engine displayed no sexual material for searches for “woman teaching” or “white woman teaching”.

Sil Bahia, who holds a master’s in Culture and Territorialities from the Fluminense Federal University (UFF) and directs Olabi – whose arm PretaLab is an initiative to encourage the leadership of black and indigenous women in the communication and technology sectors – warns of the threats these tools pose to the black population.

A member of the Research Group on Policy and Economics of Information and Communication at UFRJ, Silvana Bahia points out that algorithmic racism reproduces and intensifies the racism present in society. In her words, “algorithmic racism occurs when mathematical or artificial intelligence systems are ruled by skewed/biased information that feeds and governs their functioning. The consequences are many, but perhaps the biggest one is the deepening of inequalities, especially at a time when, as technology advances, more and more of our tastes and policies are machine-mediated.”

A study by PretaLab, an initiative of the organization Olabi in which Silvana Bahia acts as coordinator, demonstrates how algorithmic racism is a global phenomenon that can be verified on different platforms. “Algorithms are a set of instructions that consult databases to perform their function/action. If there is no diversity in the production of new technologies, and technologies are produced by people, the actions of the algorithms will fail to consider many aspects and/or will reinforce others.

Algorithms work with probability, not certainty; the problem is that we are delegating many of our decisions to machines without taking this into account. And besides, we are not encouraged to think about ‘who makes the technologies,’” says Bahia.

It is not a question of demonizing technology, but of drawing attention to the influence of racism in the building of strategies against violence. In the eyes of justice, blacks are seen as villains: guilty most of the time, even when they haven’t committed a crime.

As much as authorities and the President himself deny the existence of racism and try to downplay the genocide of the black population, the fact is that if facial recognition follows the logic of criminalization of Afro-Brazilians adopted by the state, the majority of the Brazilian population will face new challenges to survive.

“We can connect and produce content that strengthens black identity, but at the same time, when we look at the data, we realize that what happens offline is reproduced online. Black women suffer the most from internet exposure, violations of rights and so on. I think that in order to turn the tide we need to encourage people to want to understand these processes better, not from a technical standpoint but mainly in terms of their impacts. It’s important to keep reinforcing that technology is not neutral: it reproduces the behaviors, worldviews and culture of its creators, and we know that the technologies we use are mostly created by white, heterosexual men in the northern hemisphere,” emphasizes Sil Bahia.

The various data and numbers have shown that, whether in Brazil or in other parts of the world, digital platforms, just like traditional media such as radio and television, are not neutral at all and operate under the logic of racism. Without proper debate and regulation to ensure people’s privacy, the use of digital technologies in public security actions can extend state control over individuals and reinforce aggression and violence against historically oppressed racial groups.

With info from Hypeness and Brasil de Fato

* Off-season celebrations similar to Brazilian Carnival.

About Marques Travae
Marques Travae. For more on the creator and editor of BLACK WOMEN OF BRAZIL, see the interview here.
