Introducing Middle School Students to "The Secret Ghost Workers of 'Artificial Intelligence'"
Upcoming Book Clubs:
We’re reading The Digital Delusion: How Classroom Technology Harms our Kids’ Learning - And How to Help Them Thrive Again by Jared Cooney Horvath. Join us on Wednesday, April 22nd at 7:00 PM ET. You can register here. You can read more about our mixed feelings and the reasons we’re reading this book in this blog post.
We’re also reading Disabling Intelligences: Legacies of Eugenics and How We are Wrong about AI by Rua M. Williams. Join us on Wednesday, May 20th at 7:00 PM ET. You can register here and purchase the book here.
Ghosts in the Machine AI Documentary Screening: Details about our screening of the film are forthcoming. Learn more on the documentary website.
Critical Tech Hall of Fame: We are welcoming submissions for our forthcoming Critical Tech Hall of Fame. If you are interested in writing a nomination, click here to submit it by Sunday, April 30th.
Post by Elizabeth Bacon
Elizabeth teaches middle school computer science at Wildwood School in Los Angeles.
Like many teachers, I've recently been doing a lot of reflection about the world that we're preparing students to engage with. What frameworks will they have to make sense of our shared future, when we can't even seem to agree on our past? How will we teach them not just to find answers, but to ask the right questions? Do we need to prepare them for a predetermined future, or do we get to shape it together? As I build out a new computer science curriculum at a new school, the Computer Science Teachers Association is updating its standards for students, and my school's administration has granted me a gift rare in the teaching world: the time to think more deeply about what it's all for.
I know what I'm "supposed" to be thinking. We're preparing students "for a world powered by computing." But what does that mean? In a literal sense, computers consume power; they don't produce it. And metaphorically, the story is a lot more complicated. Computing has never been a value-neutral source of power to which "the world" has access, and teaching students how to be more effective users or designers of technology doesn't automatically put their hands on the levers of power that computing has historically been used to manipulate.
With its new emphasis on artificial intelligence, the technology industry has doubled down on claims that its products will democratize pretty much everything, from education, to the arts, to scientific research. The implication: it falls to educators to prepare students to use these tools effectively and responsibly. Companies have smoothed the path for teachers willing to implement this vision, offering a slew of professional development opportunities that will bring "AI" into your classroom and habituate your students to technologies currently classified as AI.
These programs tend to focus on the mechanics of data-driven approaches, such as large language models, and how to use them. While social and ethical issues of "responsible use" come up, the underlying assumption is that the most important skills and understandings relate to using the tech well. In the words of American Federation of Teachers (AFT) president Randi Weingarten, when announcing the AFT's partnership with Microsoft, OpenAI, and Anthropic, "educators and school staff will learn about AI—not just how it works, but how to use it wisely, safely and ethically."
Frankly, our students deserve better. They deserve curricula that recognize the full scope of their humanity, outside of their roles as consumers and users of technology, and they deserve an education that allows them to explore the myriad ways their lives and choices are connected to others, across the world and throughout history.
I'm fortunate enough to work at a school that has the freedom to encourage these sorts of educational experiences, not just in the regular classroom, but through workshops that investigate the ways that contemporary issues intersect with group identity and dimensions of self. Each month, students choose among topics as diverse as Bad Bunny, energy drinks, and textile arts. Under the guidance of a teacher, they look at how different identity markers affect the way that individuals experience the world and connect with the topic at hand.
When tasked with designing my first such workshop for the middle school, I saw an opportunity to address sociotechnical myths and the social impacts of technologies. (With their strong sense of fairness, budding awareness that adults have not been giving them the whole truth, and predilection for dystopia, middle school students are not a hard sell on this sort of thing.) Other workshops had addressed environmental impacts, algorithmic bias, and ethical use of AI, so the experience of data workers seemed like a good topic. After some consultation with a sixth grade student, I settled on a title: "The Secret Ghost Workers of 'Artificial Intelligence'".
The workshop was popular enough that we had to turn kids away, and I started to worry that the provocative title had set expectations too high, that students would find labor exploitation and deceptive marketing relatively mundane. On a more personal level, I was concerned that an educational background focused more on algorithmic design than historical context had left me ill-prepared to facilitate the lesson. Imposter syndrome started to creep up on me, a computer science teacher trying to worm her way into the humanities. Of course, as often happens, the students rose to the moment and blew me away with their engagement, empathy, and insight.
The Lesson
The lesson opened with a discussion of the “Mechanical Turk,” which toured Europe in the late 18th and early 19th centuries. An early example of “fauxtomation,” the Mechanical Turk was marketed as an automaton with advanced chess-playing skills. In actuality, a human chess player hidden inside its cabinet controlled the puppet’s movements. Students discussed whether they’d prefer to be the presenter or the person operating the puppet (“The presenter!”), why deception was central to the success of the act (“Robots are cool!”), and why they thought the puppet was designed to resemble someone from outside the area it toured (“It makes it more exotic.”).
As the discussion turned to whether they thought it was a good example of automation, the students were divided. To some, it did represent automation, in that the visible game of chess was being played by the machine. To others, this was not true automation, because it required a human operator.
Automation, of course, never happens automatically. It takes a lot of work to design, implement, and maintain “automated” systems. The students' next task was to investigate the nature of that work and its impact on the workers (whom Mary L. Gray and Siddharth Suri described as "ghost workers" in their book, Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass). Students broke into groups and read through worker descriptions, filling out profiles and matching the hidden labor to a familiar automated technology, then shared what they'd learned with each other.
As they began to make sense of the stories, various themes emerged. Much of the profiled work was done outside of the United States, and students recognized that differences in wages and labor conditions across the world would affect where companies look for workers. Some students also pointed out that the appeal of these services to the consumer was the “coolness” factor of robots and other automated systems, not necessarily the quality of the service itself.
We wrapped up with some storytelling about Ned Ludd and the factory system, after a particularly engaged student made a connection between data work and the child laborers of the Industrial Revolution. The kids were riveted, and the ensuing discussion drew out some nuance around our attitudes toward new technologies and the particular ways that they're used.
Student feedback on the workshop was positive, and I was pleased with how it went, despite feeling like an imposter, trying to teach so far outside my subject area. While I had received some support from a humanities teacher in the high school, as well as from some academics with more experience in this type of critical analysis of technology, I would have been more comfortable with a real humanities teacher in the room. Regardless, I came out of the workshop excited to build on what we'd started.
A few days later, I spoke to a sixth grade humanities teacher about her own workshop, which had focused on the environmental impacts of artificial intelligence. I was interested in what she'd done and was hoping to get some ideas for improvement on my own lesson. She generously shared her resources, but with the caveat that she was not the "expert" I was, and that maybe the next time we could work together on something. I was baffled. She taught ancient civilizations. Wasn't she the expert on technology as a cultural artifact, on its impacts on societal development? I jumped at the chance to collaborate.
So in the end, an unexpected benefit of the experience was that it sparked connections between the computer science department (i.e., me) and humanities teachers in the school. As a result, we’ve been collaborating more closely on supporting students in building a robust understanding of the impacts of technology and the many societal factors that influence its design, development, and use.
I share this experience not as a turnkey solution to the problems in current "AI literacy" curricula, but as a reflection on the possibility of educators creating something that better supports our values and the aspirations that we have for our students. As someone who spent a decade writing computer science curricula at scale, I no longer think that scale is the best way to propagate meaningful educational experiences.
I've also realized that it's worth designing high-quality educational experiences for just my students, and that I don't need to "scale" them for mass use. I'm extremely fortunate that I'm at a school that supports teachers with the time and resources to do the work required to develop such experiences. I don't feel pressure to grab something from a tech company that doesn't know or care about my students, or to automate my job with a piece of software that's incapable of understanding their needs.
Education, at its most meaningful, is a community-based endeavor built on interpersonal relationships, firmly situated in the local context. I don't know whether our school's recent production of Newsies was a factor in how interested the students were in the implications of child labor, or whether the Waymo depot just one block from our campus prompted them to wonder about how exactly those cars work, but I do know that this was an experience designed for these kids (and for me!), at this moment, based on factors too subtle and too numerous for me to enter into a database.
As the only computer science teacher in my school, I can feel isolated, but that isolation has also pushed me to look to other disciplines for collaboration opportunities. Again, this requires systemic support, in terms of cultural norms and sufficient time to take advantage of these opportunities as they arise. It also requires trust in local teacher expertise, and an understanding that solutions don't need to be prepackaged or scaled to be useful.
When I think back to the world we're preparing our students to engage with, I'm not sure that I want it to be a world powered by computing, especially if computing is just a proxy for top-down solutions that don't honor the humanity of those they're imposed upon. I don't have a pat answer for what I want the world to be "powered by", and I think that's probably a good thing. That's something I want to explore as part of a community. It's something I want to support my students in exploring for themselves.