Dismantling the New Jim Code

CoT Announcements

  1. Next Monthly Tech Talk on TUESDAY 09/05/23: Because the conference fell in the same week, we did not hold a monthly tech talk in August. Our next one will be on Tuesday, September 5th, 2023 from 8-9pm EDT/7-8pm CDT/6-7pm MDT/5-6pm PDT. Learn more on our Events page and register to participate.

  2. Critical Tech Study: If you self-identify as holding critical views toward technology, please consider sharing your story in our study by answering the following questions: To you, what does it mean to take a critical perspective toward technology? How have you come to take on that critical perspective? You can participate via OUR SURVEY, and you are welcome to share it with others. Thank you!

  3. Next Book Club on 09/21/23: We are discussing Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O’Neil on September 21, 2023 @ 8:00 p.m. EDT. Register for this book club event if you’d like to participate.

By Marie K. Heath and Dan Krutka

The underlying belief that “Technologies are not neutral and neither are the societies into which they are introduced” drives our work at the Civics of Technology project. Several technology, education, and society scholars have shaped our understanding of technologies as forces that nudge individual and social behavior, including thinkers we reference regularly, such as media ecologists Marshall McLuhan and Neil Postman, who contended that technologies carry more force than is often recognized. For Dan and Marie, however, our understanding of how the non-neutrality of technologies intersects with societal oppression is most grounded in the more recent scholarship of Black women who interrogate the systems of power and oppression embedded within technology.

In particular, Dr. Ruha Benjamin’s (2019b) work on technologically embedded injustice and discriminatory design has helped us frame our own inquiries into the ways technologies encode and reproduce systemic inequities. Her term the New Jim Code, playing on Michelle Alexander’s the New Jim Crow, offers a clear and helpful shorthand for identifying the harmful biases coded into the algorithms and technologies that influence nearly every part of our mediated lives. Like the Jim Crow laws of the late 19th and early-to-mid-20th-century U.S., the New Jim Code exerts social control, segregating and subjugating Black, Latinx, Indigenous, Queer, and other marginalized groups. In her 2019 book Race After Technology, Benjamin defines the New Jim Code as existing when new technologies are employed “that reflect and reproduce existing inequities but that are promoted and perceived as more objective or progressive than the discriminatory systems of a previous era” (pp. 5-6). The New Jim Code thus extends the New Jim Crow into the digital age, producing algorithms that intensify the policing of Black and Latinx neighborhoods through predictive crime tools like PredPol; reproduce housing and lending inequities (Bartlett et al., 2022); simultaneously objectify Black women’s bodies (Noble, 2018) while failing to recognize their faces (Buolamwini & Gebru, 2018); and ignore the pain of Black women in health care settings (Benjamin, 2019a).

Dr. Benjamin’s work offers ways forward to dismantle the New Jim Code through abolitionist practices. First, she cautions us against looking for a technological fix to a social problem. A social justice bot will not save us, or as she quips, “slay centuries of racial demons” (2019b, p. 7). Rather, the abolition of injustice requires a social commitment to new and more just imaginaries—other futures which prioritize “equity over efficiency, social good over market imperatives” (2019b, p. 183). Coded equity audits provide another way to interrogate algorithms and their impacts. We can ask:

  1. What are the unintended consequences of designing systems at scale on the basis of existing patterns in society?

  2. When and how should AI systems prioritize individuals over society and vice versa? 

  3. When is introducing an AI system the right answer and when is it not? (Benjamin, 2019b, p. 186)

Dr. Benjamin has created space for this work and these questions through the Ida B. Wells Just Data Lab at Princeton, where students “rethink and retool the relationship between stories and statistics, power and technology, data and justice.” Students interrogate data injustices through the lens of institutions and systems including education, the justice system, and health care. You can review the projects and activism of the students here. The Just Data Lab students have also shared their work at our 2022 and 2023 conferences, and Dr. Benjamin delivered a keynote in 2022, asking us to reimagine the default settings of technology and society.

Through our own work at the Civics of Technology project, we offer discriminatory design audits as a way to shift our understanding and use of harmful technologies. These audits encourage us to ask questions about a technology’s biases, default settings, and intersections with social harms already perpetrated on marginalized people. Dr. Benjamin’s TEDx talk explores discriminatory design in physical and virtual spaces. Dan and his students have developed an ed tech audit aimed at uncovering discriminatory design, which educators and students might use as they explore technologies. Finally, we turn to you: in what other ways can educators contribute to the struggle?

References

Bartlett, R., Morse, A., Stanton, R., & Wallace, N. (2022). Consumer-lending discrimination in the FinTech era. Journal of Financial Economics, 143(1), 30-56.

Benjamin, R. (2019a). Assessing risk, automating racism. Science, 366(6464), 421-422.

Benjamin, R. (2019b). Race after technology: Abolitionist tools for the new Jim code. Polity.

Buolamwini, J., & Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of the 1st Conference on Fairness, Accountability and Transparency, Proceedings of Machine Learning Research, 81, 77-91.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.
