Your Health Data Are at Risk, and It Isn’t the First Time

Civics of Technology Announcements

Next Tech Talk: The next Tech Talk will be held on October 7 at 8:00 Eastern Time. Register here or on our events page. Come join our community in an informal discussion about tech, education, the world, whatever is on your mind!

Next Book Club: We’re reading Culpability by Bruce Holsinger. Join us for a discussion on Tuesday, October 14, 2025, at 8pm Eastern Time. Be sure to register on our events page!

Latest Book Review: The Mechanic and the Luddite, by Jathan Sadowski (2025).


Post by Morgan Banville

On July 30, 2025, The Associated Press reported on the Trump administration’s partnership with Big Technology (Big Tech) to launch a private health tracking system (Brown et al., 2025). While the system is positioned to monitor wellness and labelled a “helpful” tool for managing conditions such as diabetes, health tracking raises a number of surveillance concerns and harmful consequences; among them are patients forgoing medical treatment and violations of basic human rights. The administration is also seeking, through this system, to “Make America Healthy Again”; its vision of “health” in this context means using Big Tech to push Americans to “manage their weight” and implementing “conversational artificial intelligence.” Yep, nothing to be concerned about here.

It isn’t the first time (nor will it be the last) that health data have been at risk within the United States. For example, the Trump administration gave the United States Immigration and Customs Enforcement (ICE) access to Medicaid recipients’ personal data, including addresses (Kindy and Seitz, 2025), further targeting marginalized and economically disadvantaged folx. And of course, we cannot forget the National HIV/AIDS Surveillance System, which was positioned as crucial for informing the public and preventing disease but was very much used to target and attack members of the queer community, intravenous drug users, people of color, sex workers, and more. The ACLU commented on the CDC guidelines for the surveillance system in 1999, highlighting the dangers of the name-based HIV-testing program and its adverse consequences.

Though reports have stated that patients will need to “opt in” to this new health tracking system, Big Tech is notorious for requiring users to jump through hoops to “opt out” instead. Fortunately, the Opt Out Project (2022) has outlined some ways the public can opt out of technology. As the group notes, opting out isn’t just about “leaving systems we don’t like -- it’s about opting in to systems and communities we value instead.” The opt-in/opt-out dilemma is one avenue instructors might use to explore with students the current tensions between Big Tech on one side and surveillance and privacy advocates on the other. While such topics certainly impact everyone, I most often discuss them in my technical and professional communication courses. So, how can we share this important information with our students?

First, it is important to connect the current event or scenario to the course. Why does it matter? How does it impact the student body? What should students do with this information? Unfortunately, our health data are not just at risk with this new system. Across the globe, there has been an increase in people using chatbots to discuss mental health concerns, leading to significant risks and harm (see Stanford, 2025, and Payne, 2024). Relatedly, Meta announced that it is bringing ads to its WhatsApp platform; the popular messaging app has claimed to keep personal messages private, but the data used for ad targeting could still put user privacy at risk, particularly if any sensitive health data is discussed through text (see Morris, 2025). And we can presume that our students are sharing their health information through text; I am sure many of us have received sensitive information such as doctors’ notes from students (without asking, of course).

The world we reside in has demonstrated that, more often than not, an individual must already be informed in order to opt out; even then, opting out is challenging depending on positionality, privilege, and power (a nod to Walton, Moore, and Jones, 2019). In a forthcoming article in the Journal of Technical Writing and Communication’s special issue on Security Logics, I write with Kimberlyn Harrison about the challenges placed on individuals seeking to opt out of pervasive government surveillance; in particular, we discuss opting out of facial recognition in airports across the United States. In our article, we note that many people are unsure how to opt out or believe it is too difficult or inconvenient. Such opt-out logics extend to the newly proposed system launched to surveil our “health” and bodies.

In the forthcoming article, Harrison and I write about how, for some bodies, it is too difficult to opt out: there is an attack on immigrant, LGBTQIA+, Black, brown, disabled, and Indigenous bodies in the United States. U.S. Immigration and Customs Enforcement, for example, has been raiding businesses, campuses, and homes across the country, quite literally eroding the border between private and public and further stripping immigrants of their rights (forthcoming work). Such invasion, though largely physical, has extended to digital privacy as well. ICE has stated that it has hired an additional contractor to identify and report “previous social media activity which would indicate any additional threats to ICE,” as well as any information indicating that individuals or groups “making threats have a proclivity for violence” and anything “indicating a potential for carrying out a threat” (see Wilkins, 2025, and Biddle, 2025). Essentially, what is posted on social media can now be enough cause for ICE, and even Customs and Border Protection, to detain or deport, and to cast additional suspicion on persons who attempt to opt out of systems (see Wong, 2025). And because many of our students are on social media, these current events can inform larger assignments that explore how to handle and/or respond to new surveillant developments.

In terms of our health, everyone is at risk. More than 60 companies, including major tech companies like Google, Amazon, and Apple as well as health care giants like UnitedHealth Group and CVS Health, have agreed to share patient data in the new private system. Again, without our consent. This information is especially crucial to share with students, as we are instructing the next generation of people who will participate in and influence such systems. Further, instructors might discuss opt-out logics in educational settings to address the complexities of what it means to opt out, who can opt out, and how. Using the case example of the private health tracking system, or even the airport security example, can be one way to ground these terms and complexities for students.

According to The Associated Press article, the weight loss and fitness subscription service Noom will be able to pull medical records after the system’s expected launch early next year (that is, early 2026). Noom will also be able to access Apple Health information. Those with pre-existing conditions, weights outside the range determined by the outdated BMI system, and additional vulnerable populations will be the most at risk. It may be time to strongly consider (or reconsider) the health technologies we incorporate into our lives, such as wearable technologies.

Using these current events as examples, instructors, especially those in higher education, can use the scenarios provided as a way to discuss consent, surveillance, and advocacy/rights. Because one of our roles as instructors is to teach students how to be civic and global intellectuals and critical thinkers, a lesson that focuses on the dimensions of what it means to opt out in the world would be of interest to students. For example, many of our students show up to class with some sort of device, ranging from an Apple Watch to the Health app on their phones. Creating a lesson or assignment that explores the range of surveillant impacts of health technologies guides students through topics that are related and relevant to them: from compliance concerns to privacy and surveillance, navigating personal and ethical tensions, design processes, and more. I envision many of these conversations happening in writing and communication classes; however, such topics are relevant and related to any course or major.

In this health scenario, amongst many others, instructors can create activities with students that contribute to “opt out logics” (see Banville and Harrison, forthcoming 2025), such as:

  1. “revising language to use in signage” or communication materials,

  2. “creating scenarios and pedagogical prompts with students,” and

  3. exploring the impacts and consequences for specific groups and people when they exercise their right to opt out.

As Jeffrey Chester at the Center for Digital Democracy stated, “This scheme [the private health system] is an open door for the further use and monetization of sensitive and personal health information.”

And money they will make. 

For more ideas about how to incorporate themes/topics of surveillance in the classroom, see: https://programmaticperspectives.cptsc.org/index.php/jpp/article/view/64


Morgan C. Banville (she, her) is an Assistant Professor of Humanities at Massachusetts Maritime Academy. Her research and teaching sit at the intersection of technical communication and surveillance studies, often informed by feminist methodologies. She examines how biometric technologies are implemented in diverse contexts. Her research received the 2024 CCCC Outstanding Dissertation Award in Technical Communication and the 2024 Best Research Article Award from the Council for Programs in Technical and Scientific Communication (CPTSC). You can find her recent work in the Routledge Handbook of Social Justice in Technical & Professional Communication, Surveillance & Society, Programmatic Perspectives, and more.

You can follow Morgan on Bluesky: @banvillemorgan.bsky.social


References not linked:

Walton, Rebecca, Kristen R. Moore, and Natasha N. Jones. 2019. Technical Communication after the Social Justice Turn: Building Coalitions for Action. Routledge.
