Conduct Technology Audits
Making decisions about educational technologies is not a neutral or purely technical task. Technologies shape classroom practices, student relationships, and forms of participation, often in ways that are opaque to educators and students alike. Conducting an edtech audit provides a structured way to surface the ethical dimensions of these tools, including their collateral, unintended, and disproportionate effects on learners, communities, and democratic life. Drawing on four analytic approaches developed through the Civics of Technology project, educators and students can ask disciplined, critical questions that move beyond whether a tool “works” to whether it aligns with their educational values and responsibilities. An audit supports informed judgment about whether to adopt a technology as designed, modify its settings or uses, or reject it altogether. Importantly, this process also positions teachers and students as civic actors who can advocate for more responsible technology practices within classrooms, schools, districts, and communities.
Steps for Conducting a Tech Audit
Choose a technology audit approach from below.
Determine whether students will complete this activity as individuals or in small groups. If students need support in research skills, teachers might conduct a “think aloud” where they model the research steps below.
Teach students how to identify credible sources for conducting their audit. We recommend the SIFT approach in the “Research Instructions” below, but there are many approaches to teaching students to conduct research online.
Students should then complete the audit by filling in a worksheet below and prepare to share their findings with the larger class (one possible digital worksheet format is sketched after this list).
Students should then participate in a discussion about whether—or under what conditions—the technology is ethical to use. Our worksheets use elements of both a Structured Academic Controversy (SAC), to encourage students to dig deeper and learn more, and an Inquiry Design Model (IDM), to encourage students to answer a compelling question, communicate conclusions, and take informed action.
Students should determine what actions they will take as individuals (e.g., will they continue using this technology or seek out an alternative?) and as members of a community (e.g., should our class, school, or district continue using this technology?). Note: Sometimes it can be hard to imagine abandoning popular technologies, so students might learn about teens who created a “Luddite Club” or read about one person’s attempt to leave U.S. tech.
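For classes that keep their findings in a shared digital document, the worksheet can also be represented as a simple structured record. The sketch below is one hypothetical format in Python; the class and field names are our own illustration, not an official Civics of Technology worksheet, and a paper worksheet works just as well.

from dataclasses import dataclass, field

@dataclass
class AuditEntry:
    """One audit question, the group's answer, and the sources behind it."""
    question: str
    answer: str = ""
    sources: list[str] = field(default_factory=list)  # URLs vetted with SIFT

@dataclass
class TechAuditWorksheet:
    """A minimal digital worksheet for recording a technology audit."""
    technology: str                     # e.g., "Google Classroom"
    approach: str                       # e.g., "Technoethical Audit"
    entries: list[AuditEntry] = field(default_factory=list)
    individual_action: str = ""         # what each student decides to do personally
    community_recommendation: str = ""  # what the class, school, or district should do

    def unanswered(self) -> list[str]:
        """Questions that still need research before the class discussion."""
        return [entry.question for entry in self.entries if not entry.answer.strip()]

# Example: starting a discriminatory design audit of a videoconferencing tool
worksheet = TechAuditWorksheet(
    technology="Zoom",
    approach="Discriminatory Design Audit",
    entries=[AuditEntry("Are social biases engineered into the technology?")],
)
print(worksheet.unanswered())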
Technoethical Audit
One set of questions for conducting a technology audit comes from an article by Krutka, Heath, and Staudt Willet (2019). We have simplified the questions here. Students and teachers may adapt the questions as needed.
How is the environment affected by this technology?
How does the design of this technology impact people?
What are the company’s business practices (ex: labor, profits)?
What laws/policies apply to this technology?
What are the intended effects of this technology?
What are the unintended, unobvious, or disproportionate effects of this technology?
Is the creation, design, and use of this technology just, particularly for minoritized or vulnerable groups?
In what ways does this technology encourage and discourage learning?
How would your experience change if you did not use this technology?
Considering your answers above, should we use this technology? If not, what are the alternatives?
Discriminatory Design Audit
The following four discriminatory design audit questions were adapted from Ruha Benjamin’s 2019 book, Race After Technology: Abolitionist Tools for the New Jim Code. We recommend reading this book or watching documentaries like Coded Bias (2020) for examples of discriminatory design. Krutka, Seitz, and Hadi (2020) conducted an example audit of Zoom at the beginning of the pandemic as schools turned to the service for remote teaching and learning. Teachers and students may use, modify, and adapt these questions as needed.
Are social biases engineered into the technology?
Do default settings allow for discrimination against more vulnerable groups?
Does the technology recognize or treat groups differently in ways that cause disproportionate harm to vulnerable groups?
Does the technology reinforce social biases even though it purports to fix problems?
Five Critical Questions About Tech
Humans tend to be optimistic about technologies because immediate benefits are often obvious. These five critical questions can be used to inquire critically into the collateral, unintended, and disproportionate effects of technologies, including educational technologies.
These questions were adapted by Dan Krutka and Scott Metzger from a 1998 talk by Neil Postman (referenced below). You can find more information about these questions on the Curriculum page of this site. Teachers and students may use, modify, and adapt these questions for their own contexts.
What does society give up for the benefits of this technology?
Who is harmed and who benefits from this technology?
What does this technology need?
What are the unintended or unexpected changes caused by this technology?
Why is it difficult to imagine our world without this technology?
Baldwin Test
The Center on Privacy and Technology at Georgetown Law uses the Baldwin Test, named after James Baldwin, to encourage transparency and something closer to truth when describing technology. It includes 4 commitments:
1) Be as specific as possible about what the technology in question is and how it works.
2) Identify any obstacles to our own understanding of a technology that result from failures of corporate or government transparency.
3) Name the corporations responsible for creating and spreading the technological product.
4) Attribute agency to the human actors building and using the technology, never to the technology itself.
Charles Logan added 3 elements:
5) Name the technology’s theory (or theories) of learning.
6) Describe the technology’s effects on pedagogy.
7) Highlight the technology’s impacts on the environment.
Students could then write a transparent press release or annotate an educational technology company’s press release or website. You can read more explanation and examples from Charles in this blog post.
Research Instructions
When using a search engine to research a topic, students can apply the SIFT method to decide whether a website or news story is credible (a simple way to log these checks is sketched after these instructions).
Stop before clicking or trusting the result. Look closely at the site name, headline, and summary. Ask whether you recognize the source and whether the claim seems designed to provoke a reaction.
Investigate the source by opening a new tab and searching the organization or author to see what reliable third-party sources say about them. Wikipedia can be a quality source for learning more about organizations or people. Do not rely on the site’s “About” page alone.
Find better coverage by searching the same topic using different keywords and comparing how multiple reputable outlets report the information. Credible claims usually appear across several trustworthy sources.
Trace claims and evidence by clicking through links to original studies, documents, or data and checking that quotes, statistics, and images are presented in context. If you cannot confirm who produced the information, how it was reported elsewhere, or where the evidence comes from, do not use the source for academic research.
Students might even consider using Google’s “AI Mode” (not Google AI Overviews) to start their search. This can generate initial answers to audit questions more quickly, which helps ensure students don’t get stuck on the initial part of the search process. Students can start their search by writing a neutral query that prioritizes expertise (e.g., “What are environmental concerns about ChatGPT according to experts?”). They can then read the generated response, click on links to determine source credibility, and choose 2-3 sites to source their answer in the audit.
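For students who want a running log of the sources they vet, the hypothetical sketch below records the four SIFT checks for each source and flags whether the source is safe to cite in the audit. The names and structure are our own illustration, not part of the SIFT method itself.

from dataclasses import dataclass

@dataclass
class SourceCheck:
    """A SIFT log entry for one website or news story."""
    url: str
    stopped: bool         # Stop: paused and looked closely before trusting the result
    investigated: bool    # Investigate: checked what reliable third parties say about the source
    found_coverage: bool  # Find: confirmed the claim across multiple reputable outlets
    traced: bool          # Trace: followed quotes, statistics, and images to the original evidence

    def usable(self) -> bool:
        """Only cite a source in the audit if every SIFT check passed."""
        return all([self.stopped, self.investigated, self.found_coverage, self.traced])

# Example: a source whose claims were never traced back to original evidence
check = SourceCheck("https://example.com/article", True, True, True, False)
print(check.usable())  # False: keep researching before citing this source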
Tech Audit Examples

Coded Bias documentary
Coded Bias is a U.S. documentary film directed by Shalini Kantayya that features Dr. Joy Buolamwini’s algorithmic audit of racial bias in facial recognition systems.

Unmasking AI book
In her book, Dr. Buolamwini describes both an algorithmic audit (see Coded Bias) and an evocative audit, which is “an approach to humanizing the negative impacts that result from algorithmic systems” (p. 2).

Asking Technoskeptical Questions About ChatGPT
Technology and education scholars collectively apply a technoskeptical audit to ChatGPT.

Conducting a Technoethical Audit of ChatGPT
Drs. Logan and Vakil share how they used a technoethical audit in their course. They generously include their resources!

Foregrounding technoethics
Introduces the technoethical audit in a discussion of the recently introduced Teacher Educator Technology Competencies (TETCs).

Example of technoethical audit of Google Classroom.
Evaluates Google Classroom and Google Meet with a technoethical audit.

Don't be evil: Should we use Google in schools?
Example of a technoethical audit of a range of Google services.

A discriminatory design technology audit.
Example of discriminatory design audit of Zoom from the beginning of the COVID-19 pandemic.

See Results Anyway: Auditing Social Media as Educational Technology
Evaluates social media using a combination of technoskeptical and discriminatory design audits.

Uncovering the Hidden Curriculum in Generative AI: A Reflective Technology Audit
Adapts questions of hidden curriculum (Apple, 1975) to audit LLMs in education.