Who Sounds the Alarm?: A Preview of the 4th Annual Civics of Technology Conference

Lots Happening at Civics of Tech!

CONFERENCE!

Our fourth virtual conference schedule is out! Head over to our conference page to preview the schedule and register for the conference.

TECH TALK!

Next Tech Talk on August 5th: Join us for our monthly Tech Talk on Tuesday, August 5th, from 8:00 to 9:00 PM Eastern Time for an informal conversation about events, issues, articles, problems, and whatever else is on your mind. A registration link will be added in next week’s newsletter.

BOOK CLUBS!

July Book Club: We’re reading The AI Con, by Emily Bender and Alex Hanna. Join us for a conversation on Tuesday, July 22 at 8:00 PM Eastern Time, led by Charles Logan. Register on our events page here!

Conference Book Club: We’re also reading Empire of AI, by Karen Hao, and we’ll hold our book club conversation at the end of the first day of our conference: Thursday, July 31 at 3:00 PM Eastern Time. See more on our events page.

by Dan Krutka

As a former social studies teacher, I regularly taught about people whom we tend to look back on heroically: civil rights leaders, freedom fighters, whistleblowers, truth-seeking journalists. I always sought to remind my students that it is easy to speak highly of those who pursued justice in the past, but much, much harder to stand for justice in the present. Even when we know what is right, speaking with moral clarity requires courage.

At our 2025 conference, our two keynote speakers are individuals who have long spoken about emerging (educational) technologies with moral clarity. Both regularly point out patterns of injustice that have grown increasingly brash as technology corporations merge with oligarchic politics to push AI on teachers and students in our schools. Their keynotes offer opportunities to wrestle with the theme of our conference and to chart directions for communal resistance to artificial systems.

Each day of our conference begins with a keynote, followed by a series of sessions that include research presentations, lesson sharing, workshops, and panels. We welcome educators, students, parents, and researchers who want technology that helps cultivate a more humane and just world.

Conference Overview

  • Theme: Communal Resistance to Artificial Systems

  • Dates: July 31st and August 1st, 2025

  • Time: 11 a.m.–3 p.m. Eastern Time both days

  • Place: Online and free to all

Keynote Day 1 – Audrey Watters

In line with our introduction, Audrey Watters is known as “ed-tech’s Cassandra.”  She founded the long-running blog Hack Education and now writes the newsletter Second Breakfast. Her work tracks the myths, marketing, and BS in educational technology. 

Her book Teaching Machines: The History of Personalized Learning uncovers a century of pre-digital machines that failed to fix schools as promised. Before our now-regular book club conversations began, several of our current board members gathered to discuss Teaching Machines. Charles Logan summarized those efforts in this blog post, and those talks produced two free resources. Charles wrote a reading guide with recent interviews and questions for every chapter. Dan Krutka, Marie Heath, and Charles Logan created an Inquiry Design Model (IDM) lesson with the compelling question, “Whose interests do ‘teaching machines’ serve?” The IDM includes 11 sources and compares historical teaching machines to ClassDojo today. You can copy the lesson from our curriculum page and adapt it for your own class.

While Audrey’s work reaches into the past, her writing is intensely focused on what’s happening today and what we can do about it. We look forward to her voice challenging us to organize refusal.

Keynote Day 2 – Dr. Chris Gilliard

Dr. Chris Gilliard studies digital privacy, surveillance, and the intersections of race, class, and technology. He has been profiled in the Washington Post, but you may know him from his social media account Hypervisible (formerly on Twitter, now on Bluesky), where he curates stories and posts that regularly remind us that “Every future imagined by a tech company is worse than the previous iteration,” a quote included in our technology quote activity. As an example of his work, he co-authored a 2021 article in Real Life with David Golumbia in which he coined the term luxury surveillance. They opened the article by explaining:

One of the most troubling features of the digital revolution is that some people pay to subject themselves to surveillance that others are forced to endure and would, if anything, pay to be free of.

Consider a GPS tracker you can wear around one of your arms or legs. Make it sleek and cool — think the Apple Watch or FitBit —  and some will pay hundreds or even thousands of dollars for the privilege of wearing it. Make it bulky and obtrusive, and others, as a condition of release from jail or prison, being on probation, or awaiting an immigration hearing, will be forced to wear one — and forced to pay for it too.

In each case, the device collects intimate and detailed biometric information about its wearer and uploads that data to servers, communities, and repositories. To the providers of the devices, this data and the subsequent processing of it are the main reasons the devices exist. They are means of extraction: That data enables further study, prediction, and control of human beings and populations. While some providers certainly profit from the sale of devices, this secondary market for behavioral control and prediction is where the real money is — the heart of what Shoshana Zuboff rightly calls surveillance capitalism.

I recommend reading the article in full. With such work, Dr. Gilliard redefines the terms of a debate where “smart” technologies are often treated as individual consumer choices for those with means and shows us where communities should be pushing back. 

We hope both keynotes will inspire our presenters and participants to join in the cause.

Sample of Sessions

Each hour offers three live online rooms with 1-3 presenters. Some examples of sessions include:

  • Hands-on resistance workshops

    • “Using Urgentcraft to Stand Against Surveillance Ed Tech,” where Jeffrey Austin will lead participants in using Paul Soulellis’ “urgentcraft” principles (a set of tactics for solidarity and public advocacy in times of crisis) to build community action plans to resist the rise of AI-powered surveillance ed tech that threatens to erode efforts to build and grow police-free schools rooted in trust and safety.

    • “Owning the Means of Publication: The Radical Possibilities of Community Creation,” where Zoe Wake Hyde will lead a hands-on workshop that guides attendees through making their own zine, start to finish, as a means of reflecting on the radical histories and potential futures of publishing in the name of circulating our collective knowledge.

  • Community case studies and non-profit work

    • “Code Green: A youth-participatory educational program enabling youth with data science for social advocacy,” where Bavisha Kalyan and students will detail how the Peoples Public Lab created a curriculum around justice, data science, and mixed-methods research through a summer program in Newark, New Jersey. The presentation will share lessons learned from liberatory and critical theory-based curriculum and program development and insights into youth perspectives and visions for the future of Newark.

    • “When Communities Design the Future,” where Kalkidane Yeshak will lead a session that examines how three American cities developed different approaches to resisting facial recognition technology, revealing that who participates in designing resistance determines whether communities can effectively democratize algorithmic power.

    • “Empowering Communities by Developing Critical AI Literacy,” a panel in which Dr. Pati Ruiz of Digital Promise, Shana V. White of the Kapor Foundation, Ace Parsi of iCivics, and Zachary Cote of Thinking Nation will draw on their work in non-profit spaces to discuss how educators can empower communities through critical AI literacy initiatives.

  • Classroom practice

    • “GenAI Ethics Made Accessible: Facilitating Classroom Discussions,” where four presenters will provide an accessible starting point for educators and students to engage in meaningful discussions about GenAI, using four ethical frameworks: Utilitarianism, Deontology, Rawls’ Veil of Ignorance, and Critical Analysis. The session includes ideas for practical activities that can be adapted for use across different higher education classrooms to help students reflect, question, and imagine more just GenAI & tech futures.

    • “Preparing Teachers for Disobedient Design: Cultivating Tech-Critical Classrooms from the Start,” where Kristin Hemingway invites participants to reimagine teacher preparation as a space for critical tech resistance, using the lens of disobedient design to help early-career educators question and reshape the artificial systems embedded in their classrooms. Rooted in culturally sustaining pedagogy, the workshop offers practical tools and bold imagination to support new teachers in building classrooms where critique, culture, and care lead the way.

  • Youth and caregiver voices

    • “Code Green,” the youth-participatory session with Bavisha Kalyan and students described above under community case studies and non-profit work.

    • “Theorizing Caregiver Resistance to EdTech,” a panel in which Charles Logan, Alexandra Thrall, Faith Boninger, Velislava Hillman, and Andy Liddell will discuss caregivers’ resistance to the imposition of digital platforms in schools. They’ll consider specific acts of everyday resistance and possible theories of change for realizing future mass organizing around issues at the intersection of technology, school, and our children’s rights.

  • Bonus book club on Karen Hao’s Empire of AI at the end of Day 1

All sessions build skills for collective action, not individual fixes. 

Register and support

Registration is still open, and as always, the event is free. If you can, please consider donating to help us pay speaker stipends and keep our site running.

See you online

We hope you can join us on July 31st and August 1st to listen, learn, and plan together. The alarms are ringing. We hope our conference helps us to collectively answer the call. 
