Prior to (or instead of) using ChatGPT with your students

Civics of Tech announcements:

  1. Second “Talking Tech” Monthly Meeting this Tuesday!: We’ve launched a new monthly event called “Talking Tech” on the first Tuesday of every month from 8-9pm EST/7-8pm CST/6-7pm MST/5-6pm PST. These Zoom meetings include discussions of current events, new books or articles, and more. Participants can bring topics, articles, or ideas to discuss. Our first meeting blew our minds with ideas, action, and community. Join us this Tuesday on March 7th for our second Talking Tech. Register on our Events page or click here.

  2. Book Club Reminder: Our next book club will discuss the 2020 book Data Feminism by Catherine D'Ignazio and Lauren F. Klein on Thursday, March 16th, 2023 from 8-9:30pm EST/7-8:30pm CST/6-7:30pm MST/5-6:30pm PST. The book is available in multiple formats, including audiobook, and a paperback version was just released! Register on our Events page or click here.

This blog was originally posted on January 18, 2023 on Autumm Caines’ blog: Is a Liminal Space. It is cross-posted here with permission. Visit Autumm’s site for more thoughtful posts. You can also listen to Autumm discuss the ideas behind the post on the ChatGPT and Good Intentions episode of the Teaching in Higher Ed podcast.

by Autumm Caines

I have been thinking, reading, and writing a lot about OpenAI’s ChatGPT product over the last month. I’ve been writing from the perspective of instructional design/faculty development/edtech mostly in higher education, though I did dive into a bit of K-12 (which is totally out of my element).

I understand the allure of the tool and the temptation to have students use it. It is new and shiny and everyone is talking about it. It is also scary, and sometimes we can assuage our fears by taking them on directly. 

But I suggested across two other posts that educators might not want to have students work directly with ChatGPT by signing up for a free OpenAI account, for the following reasons:

  • Student data acquisition by OpenAI

    • Anytime you use a tool that requires an account, the company has an identifier with which it can tie your use of the site back to your identity

    • You need to provide personally identifiable information, such as an email address, phone number, or Google account, to create your OpenAI account

    • Their terms are quite clear about collecting and using data themselves as well as sharing/selling to third parties

  • Labor Issues 

    • Using ChatGPT provides free labor to OpenAI for their product development. They are clear about this in their terms and on their FAQ page.

    • I don’t want to go down the “robots are coming for our jobs” path, but many people (including the people building these tools) do envision AI having major impacts on the job market. Is it okay to ask students to help train the very thing that might take opportunities from them? It could create opportunities too, but shouldn’t they understand that?

    • And I didn’t mention this in the other posts, but the AI industry has horrible labor practices, exploiting global workers to train these systems. Do we want students to be part of that? Shouldn’t they at least know?

  • ChatGPT is not a stable release; it could change or go away at any point. It is estimated to cost $3 million USD every month to keep running. What happens to your assignments if it is down or gone?

    • ChatGPT has been released as a “Research Preview,” and no one really knows what that is

      • It might be similar to a “Public Beta” or a “Developer’s Beta,” but both of those come with an assumption of an eventual public release, which we do not have with ChatGPT

    • It is often down or slow because of the large number of users

    • Features are changing all the time (for instance, chat histories have disappeared and reappeared a few times already)

After suggesting this I got a good bit of pushback. “But AI is such a big deal, Autumm, and it is going to change the world, and students need to be prepared … and… digital literacy and… and… and…”

I hear you, my well-intentioned pedagogue. And yet I still have these concerns. So here are some ideas for things you may want to do with your students prior to having them directly use the ChatGPT product with a free OpenAI account – and (I’m kind of hoping) maybe you want to have them do these things instead of using ChatGPT with a free OpenAI account.

Socially annotate OpenAI’s privacy policy and terms of service

Wouldn’t it be great if students better understood what they were getting themselves into by creating that account with OpenAI? A social annotation activity, using a tool like Hypothesis, on OpenAI’s privacy policy and terms of service (TOS) can start building that understanding. I’ve done this several times out on the open web with various collaborators. TOS and privacy policies are dense technical and legal reading, so working through them as a group with inline comments really helps. If you can invite a guest annotator with a background in law or policy, great; if not, consider assigning a reading before the annotation about what to look for in a privacy policy and how to read a TOS.

*Note – This one can be somewhat problematic if your school does not provide a social annotation tool, since students would likely need to create an account with a social annotation provider that has no agreement with your school – which could be the same problem you are trying to avoid. I do feel better about Hypothesis because they are a non-profit, but you could also get around this by copying the terms/policy into your school-supported cloud word processor (Google Docs, MS365, etc.) and just using the commenting feature.

Play the Data, Privacy, and Identity game with your students

Instead of “playing” with ChatGPT (cough, not a toy, cough) in your class, you could play the Data, Privacy, and Identity game developed by Jeannie Crowley, Ed Saber, and Kenny Graves. The game was first developed as an in-person activity; read Jeannie’s blog post for an overview, then check out the resource page where you can read the instructions and print off cards. Looking for an online version? Since the team published the game under a CC 4.0 license, I adapted it into an online version on a simple WordPress site using H5P that requires no login and collects no data.

Discuss big issues around AI like labor and climate

Have a discussion with students about the big issues with AI that are likely to affect them. A good overview of the issues with large language models can be found in Bender, Gebru, McMillan-Major, and Shmitchell’s 2021 paper On the Dangers of Stochastic Parrots: Can Language Models Be Too Big? A discussion of this paper will set you up to dive deeper into the issues.

The impacts of artificial intelligence on labor speak directly to the world of work that students will graduate into. This report from the US-EU Trade and Technology Council about the impact on future workforces can be a starting point. You may want to break it into sections, and keep in mind that it is US/EU centric. Follow up with (or start with, depending on your context) a more global perspective. You could check out MIT Technology Review’s whole series of articles on AI Colonialism, or the recent reporting from Time about OpenAI paying workers in Kenya less than $2 a day for grueling work training the model (you will need a content warning for SA and will have to figure out how to get around the paywall for the Time exclusive, but other great articles about this reporting exist, like this one from Chloe Xiang at Motherboard).

Large language models like ChatGPT take a lot of computing power to run, and all of that electricity has a carbon footprint that we are still trying to figure out how to measure. Discussing this with students helps them understand these impacts. Maybe start with a discussion around this MIT Tech Review article on how Hugging Face is attempting to measure things better.

Conduct a technoethical audit 

If you don’t know about all the resources on the Civics of Technology site, you are in for a treat. Here I’m specifically going to recommend their EdTech Audit resources, but the site has a great larger curriculum with all kinds of resources. I’m not sure that ChatGPT is really “EdTech,” but if you are thinking of having students use it, then you are using it as EdTech. I think the questions, handouts, and examples provided there will help you get your students to analyze some of the implications raised by the articles and activities listed above.

Analyze your data collected from other social media platforms

Check out HestiaLabs’ Digipower Academy. They have several tools, which run in the browser and collect no data, that let you examine and better understand the way social media platforms use your data for targeted advertising. The tools do require that you request a data export from the various platforms, but they provide instructions for how to do that for each platform. After the tools analyze your data, they give you dashboards and metrics to help you better understand why you are being targeted the way you are (because we are all being targeted in some way). Don’t feel comfortable having students download their own data (can they really secure it)? They have sample data you can run too.

Work Through The People’s Guide to AI

What even is an “algorithm”? What is the difference between AI and machine learning? The People’s Guide to AI is a workbook that helps you answer these questions. It is filled with relatable descriptions, activities, prompts, and so much more! You could spend the whole term working through this thing! Written by Mimi Onuoha and Diana Nucera, a.k.a. Mother Cyborg, with design and illustration by And Also Too. Licensed CC BY-NC-SA 4.0, the workbook is also available in print for the affordable price of just $7 USD – and you will want to write in it, so paper copies are not a bad idea.

Learning objectives

These are just some of my ideas for activities and assignments. You can come up with your own, but perhaps you might consider the following learning objectives (or something like them) to guide you.

Prior to creating an account with OpenAI students will:

  • Discuss the value that their personal data holds with various actors (themselves, friends/family, school, corporate, government) 

  • Demonstrate an understanding of typical tech product cycles and compare them to non-typical ones

  • Compare how power is held by various actors (themselves, friends/family, school, corporate, government) 

  • Analyze workforce implications of AI at home and globally  

  • Create a personal data security plan 

These are just some ideas; I’m sure they are flawed in various ways, I’m sure they won’t work for every course, and I’m sure some folks are already doing something similar or even better. But the message I’m trying to send is this: think about some of the larger picture of AI, and have students think about it, before you have them sign up and start “playing” with something they don’t understand.

~~~~

** No ChatGPT was used in composing this post
