I Hope this Blog Post Finds You
Civics of Technology Announcements
New! Technology Audit Curriculum: This activity provides a structured way to surface the ethical dimensions of technology tools. Drawing on four analytic approaches developed through the Civics of Technology project, educators and students can ask disciplined, critical questions that move beyond whether a tool “works” to whether it aligns with their educational values and responsibilities. An audit supports informed judgment about whether to adopt a technology as designed, modify its settings or uses, or reject it altogether. Importantly, this process also positions teachers and students as civic actors who can advocate for more responsible technology practices within classrooms, schools, districts, and communities.
Next Tech Talk: Please join us for our next Tech Talk where we meet to discuss whatever critical tech issues are on people’s minds. It’s a great way to connect, learn from colleagues, and get energized. Our next Tech Talk will be held on Thursday, March 5th at 12:00 PM Eastern Time. Register here or visit our Events page.
AERA Meetup: Meet up with us at AERA! Friday, April 10, 2026, 4:00-5:30 PM PT at the Yard House in Los Angeles. Please register at our Events page.
Author Bio: Bradley Robinson is a former high-school English teacher and current assistant professor of educational technology and secondary education at Texas State University. His research examines how digital technologies are reshaping learning and literacy.
One of the real gifts of being a teacher is when former students reach out, sometimes years or even decades later. Such a gift arrived in my email inbox a few weeks ago, when I received a message from a student I taught nearly twenty years ago at a high school in North Carolina. He’d been watching Nightcrawler on Netflix and remembered a book I’d lent him in freshman English, although he couldn’t recall the title, hence the email. But that partial memory opened up into what he described in his message as a whole evening of telling his partner about how we traded books and CD mixes. He then told me about his life, including his years in a neo-psychedelic rock-rap band with his brother, who sadly passed away a while back. The email ended with this: “I hope this email finds you.”
That line stopped me. Not because it was unusual or weird, and not only because it came at the end of the message rather than the beginning, but because he meant it: I’ve read emails that launch with “I hope this email finds you well” a thousand times and felt nothing, and here was a version of the phrase that actually meant something. He wasn’t sure the email would reach me. He hoped it would. It did.
I’ve been thinking a lot about what makes an email feel alive like that one did. And, conversely, about what makes so many others feel dead on arrival. In a recent article with Kevin Leander published in Learning, Media and Technology, we examined MagicSchool AI, an educational platform that, among other things, helps teachers write emails to their students’ families. The feature, the Email Family Tool, is designed to ease the burden on overworked educators: type in a few details and, presto change-o, MagicSchool ‘magically’ produces a polished, professional message. The promise is efficiency. The result, we found, is something else. (Oh, and despite what the company’s name suggests, AI isn’t magic at all, even if it often gets marketed that way.)
Kevin and I coined the term synthetic affect to describe what platforms like MagicSchool produce: humanoid feeling generated algorithmically, circulating, in this case, through predetermined scripts of professional language. It looks like warmth. It’s a performance of care. But it’s unable to remember anything about relationships, like CD mixes and book exchanges. Sure, the words are ‘correct.’ They just don’t mean.
And while our study focused on an AI-powered educational platform, synthetic affect is a larger phenomenon. It’s a sort of artifact of how generative AI platforms process, reproduce, and circulate a simulacrum of feeling across lots of different domains. Just last week, for example, I came across a story about an AI controversy in Sweden. A folk-pop song, "Jag vet, du är inte min" (“I know, you’re not mine”), rose to the top of Spotify’s Swedish charts, racking up over five million streams in just the first few weeks of 2026. The minor-key song tells a melancholy story of love lost, sung by a haunting voice that seems to drip with heartache. The odd thing, as journalists soon discovered, was that the singer had no social media presence. They weren’t on tour. They hadn’t done any interviews. That’s because the singer didn’t exist: the whole thing was a digital fabrication, generated by AI. In response, IFPI Sweden, the trade body that administers the country’s official music charts, moved to ban the track from those charts.
The folks who generated the song pushed back strongly. They insisted that the song’s feelings were quite real because the real people involved in the creation process had experienced them. And that makes a certain kind of sense, I think, but it also misses the point. The issue isn’t whether humans typed and clicked their way to a song. It’s whether the sadness in the song was ever felt, whether it emerged through loss and memory, through a relationship that left marks on a person, and whether it could body forth from a human voice in a dark venue and land on an audience. Just as a MagicSchool email performs care, the song performs melancholy without having earned it. In both cases, the affect is produced (machine-generated) rather than expressed. It circulates without feeling.
In the spirit of fairness, it’s worth noting that hollow emails didn’t begin with AI. In 2020, well before ChatGPT and its kindred clankers, the New York Times ran a piece titled “I Hope This Email Finds You Well” that captured, and not without a healthy dose of snark, how exhausted folks had become with the phrase. As the author Tim Herrera put it at the time, the greeting had been “exposed by the pandemic for its stodgy emptiness; a hollow, yet necessary, formality.” So we were already stuffing perfunctory pleasantries into emails. And maybe that in part helps explain why platforms like MagicSchool appeal to so many: if our professional communications were already filled with flat formulas for feeling, why not let a bot help handle them? But there’s a distinction to be made, perhaps, between a habit and an algorithm, between a convention we’ve grown careless with and a platform that produces synthetic feeling for sale and at scale.
What does all that have to do with education? Everything, I think. As anyone who’s ever done it can tell you, teaching is deeply relational work. It’s charged with feeling, powered by it. That relational work isn’t a ‘soft skill’ add-on to the supposedly more important cognitive labor. It is the very medium through which learning happens. Students need to trust their teachers. Teachers’ care needs to extend to students’ families and communities. These are the practices of teaching as much as, perhaps even more than, lesson planning and grading are.
So when platforms like MagicSchool promise to automate the relational, affective work of teaching, they’re operating on something deeper than time savings alone. Of course, the platform claims that the efficiency gains on offer will free up teachers to focus on what really matters to them, like relationships. But the Email Family Tool isn’t just clearing a teacher’s desk of onerous tasks to make room for meaningful connections. It’s automating, at least in part, those connections themselves. And that automation represents a reconfiguration of what it means for teachers, students, and families to be in relation.
And, crucially, that process is happening in a context where teachers are stretched paper-thin. They’re underpaid. They’re undersupported. They’re over-blamed. And, especially in the U.S., they’re asked to shoulder the burdens of a society that is increasingly abandoning its commitment to public education. Enter MagicSchool, which steps into the crisis and offers efficiency. But efficiency toward what? And at what cost to the relationships that make teaching meaningful and that shape the lives of young people?
I wish I had a tidy answer for you here. The truth is that, even among colleagues who study educational technology critically, there are ongoing debates about what all this means for teachers working in today’s and tomorrow’s classrooms. Some argue, including here on Civics of Technology, that educators should actively resist AI systems like MagicSchool, that using them amounts to a kind of capitulation to the forces degrading the profession. I get that, but I often feel a bit more ambivalent. Having talked to teachers about this—many of my graduate students are working teachers and tell me about using MagicSchool and similar platforms, like Brisk Teaching—I can sympathize with those who reach for whatever helps them survive what often seems like impossible conditions. That said, my sympathy is not my endorsement. The question I keep coming back to isn’t whether individual teachers are wrong to use these tools, but what it means that such tools now exist in the first place, that they’re being marketed as solutions to problems they can’t solve, and what it means to flatten the difference between feeling that circulates through human bodies and relationships and synthetic feeling that circulates through platforms.
My former student’s email reached me through a Google search (algorithms all the way down, amirite?). But no platform generated the impulse to look, or the evening spent telling his partner about our CD and book exchanges in freshman English. That came from somewhere else. And I’d like to think that difference still matters, especially in classrooms, where so much depends on whether feeling is real enough to be trusted. But I do wonder how long we’ll keep noticing it.