Inspired by the movie “Coded Bias” (2020)
In the rush-hour smog of a 2019 afternoon, Londoners went about their business, largely unaware that the Metropolitan Police were scanning the crowd for criminals. The police registered a hit, and the hit shared a commonality with most of the others: the suspect was a Black student. He was subjected to fifteen minutes of interrogation and fingerprinting before four plainclothes police officers determined that the boy had been misidentified, but not before he was humiliated in front of his friends and hundreds of passersby. Systemic racism in the justice system is a truth so commonly observed that most have become desensitized to its vileness.
This incident of racial profiling was different from most. The perpetrator of the bias was not human at all but an artificial-intelligence-powered automated facial recognition (AFR) surveillance system. The specific system in use was a closed-circuit television (CCTV) system with a false-positive rate of 98 percent. One anonymous police officer commented, “young black men are more likely to be stopped and searched than young white men, and that’s purely down to human bias.” But why would human bias have any bearing on technology that doesn’t have a mind of its own?
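To see what a 98 percent false-positive rate means in practice, consider a sketch with hypothetical counts (the numbers below are illustrative, not the Metropolitan Police’s actual figures):

```python
# Illustrative only: hypothetical counts showing what a 98% false-positive
# rate among AFR flags means. Real deployment figures differ.
flags = 100          # people the AFR system flagged as matches
true_matches = 2     # flags that turned out to be correct
false_positives = flags - true_matches

false_positive_share = false_positives / flags
print(f"{false_positive_share:.0%} of flags were wrong")  # 98% of flags were wrong
```

In other words, for every genuine match, dozens of innocent people can be stopped.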
Artificial intelligence is a catch-all phrase for a set of algorithms that imitate human information processing. Algorithms are computational procedures that use historical information to predict future outcomes. That historical information is provided to the AI algorithm as datasets, which can be labeled and supplied by data engineers in the practice of supervised learning, or left unlabeled for the algorithm to “discover” patterns on its own through unsupervised learning.
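A minimal sketch of supervised learning makes the point concrete: a model simply looks up which labeled historical example a new case most resembles. The labeled points below are hypothetical, invented for illustration.

```python
# A minimal supervised-learning sketch: a 1-nearest-neighbor classifier.
# The labeled "historical" records are hypothetical, not from a real dataset.

def predict(history, query):
    """Return the label of the historical point closest to the query."""
    features, label = min(
        history,
        key=lambda pair: sum((a - b) ** 2 for a, b in zip(pair[0], query)),
    )
    return label

# (feature vector, label) pairs supplied by a data engineer
history = [((1.0, 1.0), "low risk"), ((8.0, 9.0), "high risk")]

print(predict(history, (2.0, 1.5)))   # low risk
print(predict(history, (7.5, 8.0)))   # high risk
```

The model never reasons about fairness; it only echoes whatever labels the historical data happened to contain.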
In the same way that a child is susceptible to replicating the bad habits of its parents, AI internalizes the discriminatory dispositions of humans and adjusts its predictions to match. A common aphorism in computing is “garbage in, garbage out.” The garbage that goes in is decades of genocide, institutional racism, sexism, ageism, and misinformation. As AI is increasingly deployed across our society, the scope of the garbage propagated will widen.
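“Garbage in, garbage out” can be sketched directly. In this toy example, assuming invented group names and decisions, a model trained on historical records that were already skewed against one group learns to reproduce exactly that skew:

```python
# "Garbage in, garbage out": a toy majority-vote model trained on biased
# historical decisions. Group names and records are hypothetical.
from collections import Counter

# Historical decisions already skewed against group "B"
history = [
    ("A", "approve"), ("A", "approve"), ("A", "deny"),
    ("B", "deny"), ("B", "deny"), ("B", "approve"),
]

def train(records):
    """Learn the most common past decision for each group."""
    by_group = {}
    for group, decision in records:
        by_group.setdefault(group, Counter())[decision] += 1
    return {g: c.most_common(1)[0][0] for g, c in by_group.items()}

model = train(history)
print(model)  # {'A': 'approve', 'B': 'deny'} -- the bias is faithfully reproduced
```

Nothing in the code mentions race or prejudice; the discrimination lives entirely in the data it was fed.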
Labs around the world are developing robots intended to use facial recognition to neutralize active shooters. Whereas the young Londoner flagged by the police suffered the fate of embarrassment, the fates of future innocent BIPOC (Black, Indigenous, and people of color) may be far worse: death. AI is developed to be like humans, who inherently make mistakes, and those mistakes, learned from histories that include police brutality, will be carried out by machines.
There is, however, hope for improvement. Those who develop AI are primarily a homogeneous group of white men who embed their own unconscious biases in code. Organizations like “Black in Engineering, Data for Black Lives, Black Girls Code, Black Boys Code and Black in A.I.” (Berreby, 2020) seek to make the workforce of big technology more reflective of the world’s actual demographics. With more ethical technology, AI can pave a future of equality rather than reinforcing the pervasive hatred of our past and present.
Babuta, A., & Oswald, M. (2019). Data Analytics and Algorithmic Bias in Policing. Royal United Services Institute, 2–19.
Benton, A. (2019, September 6). An AI-Run World Needs to Better Reflect People of Color.
Berreby, D. (2020, December 3). Can We Make Our Robots Less Biased Than We Are? The New York Times. https://www.nytimes.com/2020/11/22/science/artificial-intelligence-
Chertoff, P. (2020, February 7). Facial Recognition Has Its Eye on the U.K. Lawfare.
Kantayya, S. (Director). (2020). Coded Bias [Film]. 7th Empire Media.
Metz, C. (2019, November 12). We Teach A.I. Systems Everything, Including Our Biases. The New York Times. https://www.nytimes.com/2019/11/11/technology/artificial-
What Is Artificial Intelligence? How Does AI Work? (n.d.). Built In. https://builtin.com/artificial-