Asking Technoskeptical Questions about ChatGPT

Announcements

  1. “Talking Tech” Monthly Meeting this Tuesday, 04/04!: We hold a “Talking Tech” event on the first Tuesday of every month from 8-9pm EST/7-8pm CST/6-7pm MST/5-6pm PST. These Zoom meetings include discussions of current events, new books or articles, and more. Participants can bring topics, articles, or ideas to discuss. Join us this Tuesday on April 4th for our third Talking Tech. Register on our Events page or click here.

  2. AERA Meet-Up: We are planning to hold an in-person Civics of Technology meet-up at the upcoming American Educational Research Association (AERA) Annual Meeting in Chicago on Friday, April 14th, 2023 from 5-7pm CST. Please save the date on your calendar and check back to our blog posts, Events page, and Twitter account for updates on the venue.

by Daniel G. Krutka & Marie K. Heath

While we write this blog post for everyone in our Civics of Technology community, we are writing specifically for the Technology as an Agent of Change (TACTL) Special Interest Group (SIG) of the American Educational Research Association (AERA). The SIG leadership (shout-out to Lindsay Woodward of Drake University and Jeff Carpenter of Elon University) invited us to facilitate an interactive conversation on ChatGPT at their business meeting, to be held on the Sunday morning of April 16th in the Intercontinental Chicago Magnificent Mile Hotel.

As everyone knows by this point, there’s been a lot written about ChatGPT… probably too much. There are probably a thousand blogs with a post about ChatGPT that cleverly says at the end, “this post was written by ChatGPT.” At Civics of Tech, we contributed several posts to the deluge. Autumm Caines challenged the ethics of ChatGPT and offered ways to teach students about ChatGPT without turning over their data by signing up, Chris Clark and Cathryn van Kessel tested ChatGPT’s lesson planning abilities, and Jacob Pleasants and Dan even tested its ability to answer our technoethical questions.

While technological change has sped up rapidly in recent years, technological change is nothing new. Technology has always disrupted societies and ways of living. In the social studies curriculum, technological progress is often intricately linked to social progress. Railroads increased mobility and led to new towns, including ones in which many of us live. However, we contend that this is an unhealthy way to view technological change in a world where capitalists and militarists rarely ask the public whether they would like to live with nuclear weapons, with the constant communication and distraction of smartphones, or in a metaverse. There is lots of talk of the intended effects of technological innovations, and less attention to the unintended, collateral, and disproportionate effects. Railroads led to increased relocation within families. As Sigmund Freud complained in Civilization and Its Discontents (1930), “If there had been no railway to conquer distance, my child would never have left town and I should need no telephone to hear his voice.” Railroads also accelerated the invasion of Indigenous homelands. How often do we challenge students to consider the downsides of railroads?

We cannot say that technology simply results in social progress because that is a value judgment. It is worthy of informed debate. It’s more accurate to say technological progress results in social change. There are trade-offs. Our job as educators and citizens is to determine which changes we want and which we don’t. We recommend that educators challenge students to take technological questions seriously, but how do we do it when technological change—like ChatGPT—arrives so abruptly?

Theory to Practice

In edtech research, scholars often theorize the phenomenon they are studying, but technology itself tends to go undertheorized. This can result in practice that presumes technology to be a tool or intervention in the classroom, divorced from psycho-social and political contexts. Theories of technology help educators consider the embedded nature of technologies, as well as critique their potential downsides.

In the Civics of Technology project, we have consistently used two broad theoretical approaches—ecological and critical lenses—which we have found helpful to think about the varied ways technologies affect our lives and our schools. Our shorthand for this is, “technologies are not neutral, and neither are the societies into which they are introduced.”

Taking this theoretical stance is not simply an academic exercise; it helps us better teach about technology. We cannot summarize all the ecological and critical literature that has helped us think about technology, but we offer a short talk as a starting point. In 1998, Neil Postman gave a talk to religious leaders based on his “thirty years of studying the history of technological change.” We have turned his five points into five technoskeptical questions about technology that you can find on our curriculum page:

  1. What does society give up for the benefits of the technology?: All technological change is a trade-off: While it may seem obvious that there are advantages and disadvantages to any technology, Postman contended that there are technologies which people view as “unmixed blessings” and this creates a “dangerous imbalance.” He argued that “we always pay a price for technology.” Moreover, a new technology can displace older technologies and their benefits, even though some people still prefer the older ones. 

  2. Who is harmed and who benefits from the technology?: Every new technology benefits some and harms others: Put another way, Postman said “there are always winners and losers in technological change.” We detail Postman’s insight by pointing out how differential outcomes can target group identity (e.g., race, religion), organizational type (e.g., small- vs. large-scale business interests), or ideology (e.g., democracy, authoritarianism). 

  3. What does the technology need?: In every technology there is a powerful idea: All technologies carry a bias or belief about the world that impacts people and their lives. Technologies can convey intellectual, emotional, political, sensory, social, or content biases. Postman represented these ideas by quoting the old saying “to a person with a hammer, everything looks like a nail” and referencing Marshall McLuhan’s famous phrase “the medium is the message.” Postman explained that the “telegraphic person values speed, not introspection”; the “television person values immediacy, not history”; and the “computer person values information, not knowledge, certainly not wisdom.” In other words, technologies need humans to think or behave in certain ways to fulfill their function and spread.

  4. What are the unintended or unexpected changes caused by the technology?: Technological change is not additive; it is ecological: Like a drop of dye in water, new technologies are not just additions to the world, they change many other things too. The changes can be hard to predict and impossible to take back. For example, the invention of standardized tests “redefined what we mean by learning, and have resulted in our reorganizing the curriculum to accommodate the tests.” Standardized tests were not simply added to schools; they made schools different.

  5. Why is it difficult to imagine our world without the technology?: Technology tends to become mythic: We get so used to older technologies that we start to see them as part of the natural world. Postman argued we should view technologies we are used to, such as the alphabet (writing) or airplanes, as “a strange intruder.” This means becoming more aware of what technology does to us and for us.

We believe these questions—and the theory that underlies them—can push students to think more critically about technology. For deeper reading, consider Postman’s 1992 book, Technopoly, a work of technological grumpiness. As you might guess, Postman’s strength (like most white men who wrote about technology in the 20th century) was not Black feminist visions of a just world. Thus we recommend that educators use question 2 to delve deeper into recent critical work, particularly by Black women such as Ruha Benjamin (2019), Safiya Noble (2018), Meredith Broussard (2023; book club coming!), Timnit Gebru, and Joy Buolamwini (2017), among others. The Coded Bias (Kentayya, 2020) documentary delves into discriminatory design and is an excellent addition to many edtech classes.

As we have worked to encourage technoskeptical pedagogies, Jacob Pleasants has made the point that students have difficulty engaging in technoskeptical approaches if they are unfamiliar with the technology. For example, many students are better equipped to understand TikTok than tanks because they have spent significant time experiencing the technology. From this base familiarity, educators and students can inquire into how a technology is designed, what data it captures, and the psycho-social and political effects of the technology. Of course, technology companies intentionally obfuscate this information, hiding practices behind the impenetrable language and length of their terms of service (TOS) agreements. Further complicating an inquiry is that, in many cases of educational technologies, the only place to visit to inquire into the technology is the company’s website. So, what do we make of a technology that seemingly falls from the heavens, one with which none of us has much experience?

We believe our technoskeptical questions provide one (but certainly not the only) approach that can help educators ask questions beyond “Will students cheat using ChatGPT?” From there, educators have to provide quality sources of information if the technology is unfamiliar or new. Finding more information on a technology often means using journalistic media sources, sometimes in combination with a level of critical inference about the implications of the technology. However, as we noted above, in the case of ChatGPT, numerous pieces exist which consider the design and implications of ChatGPT for society generally, and for education specifically.

In the TACTL business meeting, we will be applying our technoskeptical questions to ChatGPT, collectively building our familiarity with the technology, and engaging in critical inquiry. We look forward to seeing many of you at the TACTL SIG meeting on April 16th. See you there!


References and Recommendations

Benjamin, R. (2019). Race after technology: Abolitionist tools for the New Jim Code. Polity.

Broussard, M. (2023). More than a glitch: Confronting race, gender, and ability bias in tech. MIT Press.

Buolamwini, J. A. (2017). Gender shades: Intersectional phenotypic and demographic evaluation of face datasets and gender classifiers [Unpublished master’s thesis]. Massachusetts Institute of Technology.

Kentayya, S. (Director). (2020). Coded Bias. 7th Empire Media.

Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York University Press.

Postman, N. (1992). Technopoly: The surrender of culture to technology. Vintage.

Postman, N. (1998, March 28). Five things we need to know about technological change [Address]. Denver, Colorado.
