Holding EdTech Accountable

Civics of Technology Announcements

Next Tech Talk: The next Tech Talk will be held on September 2 at 8:00 Eastern Time. Register on our events page. Come join our community in an informal discussion about tech, education, the world, whatever is on your mind!

Conference Recordings: Missed a session at our 2025 conference? Head to our conference page to see if a recording is available. Please note that not all sessions were recorded. Thank you to all who made this year’s conference a rousing success!

Latest Book Review: The Mechanic and the Luddite, by Jathan Sadowski (2025).


Holding EdTech Accountable: An Interview with the EdTech Law Center’s Andy Liddell


This blog post comes from Celeste Barber, a high school student who participated in the Digital Civics Club, which we introduced recently.

Celeste is a high school student from Austin, Texas. She loves studying politics and how it shapes the future for young people. She hopes this interview clarifies how education companies influence students’ futures, from a very young age all the way to college and job opportunities.

For education researchers:

The EdTech Law Center can use your help! Contact them if you are able to provide:

  • Expert witness services: This is a paid opportunity for education researchers who are willing to work with Andy and Julie to testify in court about the harms of educational technologies. It is a great way to put your research into practice.

  • Research: Andy and Julie need to be able to submit research to court to back their claims. They are particularly interested in research that demonstrates educational technologies’ developmental harms to children, or that shows the advantages of de-adoption. 

Scroll to the last question for more context or reach out to us (Civics of Tech) or them (EdTech Law Center) if you are interested in supporting their important legal work!

Celeste: Well, welcome. It's great to meet you. I'm Celeste. I'm part of Ms. Thrall's Digital Civics Club. Not only that, but I've been interested in your work recently. So, can you tell us a little about the EdTech Law Center, just to start with?

Andy: Well, thank you for having me, and it's exciting to meet a young person interested in this stuff, so just thanks for the opportunity to talk with you. 

I'm Andy Liddell with the EdTech Law Center. My wife, Julie, and I founded this law firm to solve problems where school meets technology. We've been activists in this area for a long time, since 2018, really. I can go into our backgrounds, but at a top level, I've been a federal court litigator my entire career. My wife was an appellate attorney until she was asked to join a project in the state of Texas around improving access to the judicial system for people with mental illness.

It was around 2017, when she was with the Judicial Commission of Mental Health, that we were first exposed, through her work, to the early studies linking the amount of time kids spend on phones to worse mental health. The evidence was correlative and early, but we were new parents. And so, we started paying attention to these issues way earlier than most people did, just through happenstance.

We really got started doing this work when a first-grader in a neighboring school district was being repeatedly exposed to pornography on his school-issued iPad at school. This was on the school network during the school day, in his first-grade class. There was a big uproar among parents, and then backlash from the district against the family. They asked for our help in negotiations with the district. We were shocked that this was even possible because, first, what does a first-grader need an iPad for in class, anyway? And second, if they're going to have one, shouldn't it be safe?

And so, we tried to help this family in their talks with their school district. I reached out to a bunch of activist organizations, the ACLU, just to try to get them a lawyer. This was not the type of case I could take on, but the school district was essentially trying to force this family out, and ultimately it succeeded. Through this work and through talking to this family and community, I understood that things like this were happening very frequently – that the technology being deployed in schools was not being deployed safely or with kids' best interests in mind. And then, any squeaky wheels were being kicked out of the district.

I became an activist around these issues beginning in 2019 with a group called Fairplay. It's a nationwide grassroots organization, and we've done a lot of great work. We have had a ton of success this year in getting some phone policy laws passed and opposing the AI regulation moratorium that was just killed in the dead of night last night, which is a huge victory for us. But it wasn't happening fast enough. I kept seeing the exact same stories play out and kids getting hurt in real material ways by the technology at their school, on top of the privacy violations, on top of the erosion of the quality of their education. And so, Julie and I decided to start the EdTech Law Center to hold the truly responsible parties accountable, which are the tech companies.

Celeste: You talk about the educational side of technology and how it can expose kids to things they don't want to see, such as pornography. I was wondering why you would attack the educational side of it. As I understand it, there is a lot of content out there in general that is just harmful for kids. So what makes the educational side a more valuable target?

Andy: So it's hard to go after content alone because we have very strong First Amendment protections in the United States. And then on top of that, the technology industry has enjoyed an immunity from accountability, under Section 230 of the Communications Decency Act, that others don't have. They are less accountable than normal, non-internet businesses for the speech they host on their platforms. It's very difficult to sue a tech company for publishing protected speech, and even if it's awful speech, it can be protected. So that is the difference. Not to say that people haven't tried – there are these great social media lawsuits going on. The Social Media Victims Law Center is doing amazing work, but they're not necessarily going after the speech. They're going after product features and product design, and how these things are designed to foist content on people.

School is different because the terrain is so much more favorable. You have to go to school, and your school has an obligation to keep you safe. 

Celeste: So, it's more regulated in educational spaces than in more public media spaces?

Andy: Exactly. So from a legal perspective, it's a lot more favorable terrain to take these things on, and I think it's every bit as problematic in the social media space as it is at school. It's just easier to find accountability when these companies are selling into schools.

Celeste: I'm wondering about how commercialized it all is for these technology companies integrating into government school systems. On your website, you mention that you want to limit the commercialization of students' lives. So how would you do that effectively?

Andy: So when I say commercialization of students' lives, I really mean that every interaction a kid has at school through their computer is mediated by another company for that company's profit. And that's really dangerous because their need to make money is not aligned with the student's need to learn or a teacher's need to teach. That misalignment is really at the root of these problems with edtech companies. It'd be one thing if they competed on who legitimately helped kids learn the best, or who legitimately kept them safest online. That's really not how they do it. They compete on who can capture the biggest share of the pie, who can lock themselves into school districts, and who can integrate themselves into the teaching process.

Celeste: In your legal cases, are you looking for these companies to be completely out of education, or do you want them to be more regulated?

Andy: From a philosophical standpoint and in line with emerging research, I think we need to have a lot less instruction through computers. Computers are good at things only computers can do, but the things only computers can do that are relevant to education are pretty limited. Basically, letting kids build up the fundamentals in analog space, through reading books and working together in the physical world, will, I think, get us most of the way there. Kids need to learn about computers, and Civics of Technology is helping to do that. I do think there is room for education about computers, but that's not necessarily education with computers. It's really education about how these things work: What is the internet? What does that even mean when we talk about the internet? What is the business model of the internet? What is surveillance capitalism? How does that work? What are the incentives under that paradigm to manipulate user behavior and to trick people?

So all that to say, a lot fewer computers in the classroom. There obviously is a world where the back-end stuff can still be computerized, but that's it. And so, the first cases that we've brought are data privacy cases where we're suing companies for taking kids' information without their consent. I think most people assume that if an edtech company is involved on the administrative side of a school district, it's basically a digital filing cabinet, a Fort Knox for everybody's records. They keep the records there, keep them safe, keep them segregated, and don't do anything else with that information.

That is the opposite of what's happening. These companies hold this information and claim rights to do whatever they want with it. They're running unregulated experiments with it. They're making predictions about student behavior with it. They're selling that information to marketers, and even if they're marketers at colleges, they're still marketers. And so, there's this very unfair and deceptive industry that has arisen around student records management, which is not a place most people would think to look for evildoers, but they're there.

Celeste: You talk about how, when they're collecting data, it's almost like they're building a profile of the student for marketers so that they can tailor advertisements to the student. On your website, you mention the “cradle to career” pipeline. How do companies try to pass off the collection of this data as a net positive for students?

Andy: Well, it's rare that they're talking to students. They're usually talking to people in administrative capacities or state agencies, colleges, people who have to make many decisions at a very high level affecting a lot of people. Your school's the one that decides what student information system is used or what college application platform is used. When they do talk about it and try to at least pitch to teachers and occasionally to students, they talk about the benefits of personalized education. And if we know anything about personalization, it's really just shorthand for surveillance — that all they're doing is looking at every single thing that you do on your computer and using that to make predictions about your behavior.

It's unfair, and it's wrong. It also has the danger of being incorrect. If you get flagged as having a behavioral problem that you don't have, or as being likely to commit a crime that you're not likely to commit, or even if they scoop up the wrong test score, the wrong ACT or SAT score, which happens, then everything you've worked for, everything you've done, could be for naught. And then there's the idea that you have no right to even know this information exists, to know what it says about you, much less how it's being used. It's not legal, it's not lawful, and we're challenging those practices in court, but more to the point, it's not right. It's really offensive.

Celeste: What I'm getting is that a lot of these companies are trying to prevent students from seeing what's really going to influence their future lives in college and jobs. I remember reading about the prison system and how companies can “predict” whether a person is more likely to go to jail, which can influence the opportunities they're offered. So, is it interconnected? To what extent do these educational companies connect to other big institutions outside of the school system?

Andy: I think you're asking to what extent the edtech ecosystem interacts with the rest of the world. My understanding is: completely – these prediction analytics make their way out into the wider world. And even if it's "only to marketers at colleges, or only to recruiters, or only to workforce planners, or only to your local government," that's still a lot of people making really important decisions about you. That is problematic. And I think it is likely that it's going much further than that.

There's a great organization called Internet Safety Labs, and they did a study where they surveyed 600 school districts, looked at all the edtech products they used, traced the data flow in every one of them, and found that 96% share student personal information with commercial third parties. There is a whole industry involved in this. There are data brokers, and then there are also companies called identity resolution companies. Their whole thing is to take anonymized information and de-anonymize it – basically, to take these little bits about you that are floating around the internet and tie them back to you, ostensibly for marketing. We know that information can be bought and sold by and to anybody.

And so, it's not just marketers trying to sell you a cool mattress in a box. It's all kinds of people making all kinds of decisions. I'm quite confident of that. I don't have the evidence yet, but just knowing how these companies work, I would be much more surprised if none of this were making its way into that ecosystem.

Celeste: I remember reading on your website that you prioritize the idea of parents having control over what their kids consume and making sure it aligns with the values of the parents. So, to what extent do you think parents should have control over what kids see in the classroom?

Andy: It's a really interesting question. I think right now, especially, it's a hot-button issue, because you see families pushing back on content, like educational curriculum, that doesn't align with their own family values.

I view my work as primarily for youth digital rights – that your civil rights are being invaded at school by being manipulated for corporate profit, by being spied on all the time by your school. It's less about parent control over what book you might read or what the school might assign; it’s more about not wanting my kid, at any age, going to school and looking at porn. Nor do I want my kid going to school and looking at YouTube, even if it's totally benign stuff. I don't like how it affects people's attention. I don't like how it profits from monetizing people's attention. I'm morally opposed to using YouTube for entertainment, so I don't want my kid watching that.

I do think parents can have some say, especially when these companies are collecting information about kids that will be used against them later. But I totally take the point of your question, which is that there is a limit to that. And I think that for young people, as you go from being a child in elementary school to a young adult, you have greater and greater rights and self-determination. Your mom and dad shouldn't be able to come in and prevent you from learning things that are against their values.

Celeste: You talked about how a first-grader is exposed to pornography on school grounds, through the school Wi-Fi, through a school-issued iPad. I'm wondering how educational companies allow it to get into the hands of a first-grader in the first place. How does that even work?

Andy: We've actually filed some new lawsuits that aren't even up on the website yet; they're product defect lawsuits. Essentially, what we're saying is that, in this case to Google, the intended audience for this device is a child in elementary school, and there is an obligation to design products that are safe for that child. What is happening is that they're not doing that. They have products that aren't selling, or aren't selling like they want them to, the iPad in 2012 and the Chromebook in 2011 being examples. They didn't quite take off. So, they were suddenly slotted into the educational market by clever people who work at these companies.

They didn't sit down with a blank page and say, "Gosh, what would a five-year-old need from a computer? What would really make them safe and help them learn? What would be appropriate for them developmentally? How would that look different for a third-grader? How would that look different for an eighth-grader? How would that look different for someone in high school?" They didn't do any of that. They had a bunch of computers that didn't sell, and then they dumped them on schools and said, "Good luck, guys. Figure it out."

And so, that's really the problem. It is your obligation to ensure the safety of these things for these little kids. Outsourcing that responsibility is not an option. You can't put it on schools, you can't put it on parents, you can't put it on six-year-olds. You need to make these things safe, like everything else that goes into a school.

If you really take a step back and think about how every single thing, every object, every physical part of a school is designed for the age of the people who are using it and experiencing it, and then how that's just not the case at all for technology, it's really shocking. Why are we applying that double standard? It's just that no one's thought to fight it yet.

The analogies that I use are, if you go into a kindergarten classroom or if you go into a senior AP chemistry lab, those things look completely different. The physical environments are completely different. A chem lab is very appropriate for a high school junior or senior who's been trained and who's being supervised, and who's being taught how to use that environment safely. But, it would be inappropriate to just turn the kindergarten class loose in there and then just let them go nuts with the Bunsen burners, sharp glass and toxic chemicals.

What's happening now with computers is that everybody is getting the chemistry lab, and the companies are saying, "Well, it's up to you all to make it safe. It's not on us. We gave you the labs; they're yours." And now you have to spend all this time and money and resources to make it safe. If a company delivered a playground for kids to assemble themselves, and the first-grade class went out there and built the slide, and it collapsed and killed them, we wouldn't hold the kids responsible. We'd say, "Well, what are you doing? This is not a reasonable thing to do." But companies tell kids, "Hey, be Internet Awesome. Keep yourself safe online." That's literally what Google tells kids. It's totally outrageous. We're basically here to end the double standard that's allowed these companies to get off scot-free for going on 15 years now.

Celeste: I'm thinking of algorithms. You talk about how you don't exactly know how they work, because I'm guessing the companies won't disclose that information. I'm wondering, if schools knew exactly how these algorithms worked, would your focus shift more toward making sure the algorithms themselves are regulated? Or would you keep focusing on the specific data breaches and the specific cases where students' rights are being undermined? Would you attack the algorithmic side, or would you continue to attack it on a case-by-case basis?

Andy: It's all of a piece. The data business model, surveillance capitalism, is in my mind the heart of the problem, because companies make money by collecting data on people and then finding ways to monetize that data. So they only ever want more and more data, and their products are only ever designed to do that. These predictions are an outgrowth of that. Programs and algorithms designed to optimize for engagement rather than time on task are also outgrowths of it, as are the algorithms that push kids from an innocent search about Pokemon to hardcore pornography. They're all an outgrowth of this business model.

Legally, we have significantly more privacy rights than rights related to unfair competition, particularly in schools, because it's argued that we didn't purchase the item; the school did. And so, it's the school that has the claim about deceptive trade practices. That's where, from a legal standpoint, it's a little harder. But I do think regulation is necessary here, period. I mean, we need total clarity about what these companies are taking and what they're doing with it, and we just don't have it. In the absence of regulation, I'm doing literally the next best thing to regulation, which is these class-action lawsuits, and it's a very poor substitute.

Regulation by lawsuit is the worst way to regulate, except for no regulation at all, because you want informed policymakers. You want these things to be disclosed and batted around. You see this repeatedly, and we looked at this as we were starting our law firm. If you look back through history at tobacco, automobiles, asbestos, and OxyContin, they all follow the same trajectory: they come on strong, and the industry gets built up. They promote the false myth: "Well, you can't make a safe car. Cigarettes are cool; they're not going to hurt you. How else are you going to keep your buildings cold? Isn't this pain medication working really well?" And they flood the market with these positive messages while hiding what they know are usually very harmful side effects.

Lawyers like me get hold of it, and we start suing these companies, holding them accountable under the same laws we've had for a long time. And then, eventually, regulation happens. Eventually, after years and years of fighting and losing these lawsuits, they realize that maybe regulation isn't so bad, because it's at least predictable. Lawsuits are not predictable. You can't really plan for them, and you can't know when they're coming. There's a reason our logo is the black swan: you can't see us coming; you can't predict it. And so, I would tend to agree with everyone who says regulation by class action is a bad way to go, but it's better than nothing. This is how change has started for going on 80 years in this country, through lawsuits like ours.

Celeste: Continuing to talk about your practice and your activism through lawsuits and others, how can the community of Civics of Tech help you and support you?

Andy: Oh, man. I mean, this is wonderful. Just having a platform, being able to talk to you all – I love the framing of Civics of Technology. I really think this is a huge missing piece in our democracy: understanding how these tools work and what they are, and understanding how a handful of companies in Silicon Valley have accrued so much power and come to mediate all our lives in such a profound way. So, keep having these conversations, keep speaking out and speaking up on behalf of yourselves. I am moved, honestly, by young people taking this on themselves, because my kids are little. Well, they're not that little, they're 10 and 8, but this is affecting them too. When I see young people in high school taking this on, taking it seriously, it's so powerful.

For researchers in the Civics of Tech community who would like to support our work: we will need expert witnesses for our lawsuits. Whether or not you're in the United States, if you are especially versed in privacy issues arising from internet and technology use at school, please reach out. That would be very helpful. Experts in lawsuits are paid for their time; this would not be free work. It's good work, it's exciting, it's interesting, and it's a way to put your research into practice, as long as you don't mind going up on the stand and being beat up by the bad guys for a little while. So that's one way that would be immediately, materially helpful to us.

Also, anyone who has studied the developmental harms, what happens when a kid goes and watches 13,000 YouTube videos in three months instead of learning, would be very helpful. I mean, that sounds like a crazy thing, but this is a lot of my cases. We know it's bad, but having an expert say why it's bad and how it's bad could also be very helpful.

From more of a policy perspective, maybe putting my activism hat back on, we have seen de-adoption in places that are backing out of edtech. I think most notably of Sweden, which was a very early adopter of edtech and then, in 2023, announced that it's not doing it anymore. It would be wonderful to find those studies, have them in English, and basically have a white paper about what they did: this is how they realized it was wrong, this is how they built consensus about exactly why it was wrong, here are the changes they made, and here's how you can follow that same blueprint here. That would be really interesting.

It would also be really interesting to understand the ESSER [Elementary and Secondary School Emergency Relief] funds and how those were used to hook schools on technology. I would love to see that kind of research. And then there's the ESSER cliff, which I would love to know more about, because one thing you hear often from school districts is, “We can pass a bond for technology; we can't pass a bond for teachers.” They talk about this as a funding issue. So what are the opportunities for funding reform? Is this really, truly something that just can't be helped? If we look at history, we understand that automation only ever means de-professionalization and disempowerment. Teachers are professionals. I want teachers to be secure and well-paid. I want a lot of teachers for smaller class sizes. That's vibrant, that's good. That's the country I want to live in. Not only do I think it's morally right, but I also think it's the key to an actually educated, healthy, happy populace.

When we're trying to automate the work of human relationships for someone else's profit, it's a really awful system to find ourselves in. We need an understanding not only of the mechanisms by which tech spending is overshadowing spending on people, but also of what's lost when we try to use these surveillance programs to prevent school shootings. I think a teacher with a really close relationship with his or her students could do just as good a job if they had the time and the resources for genuine human connection. One might think that if a child had that deep connection at school, they might be less inclined to commit that sort of act.

