Building Resistance in the Career Development Space

This bonus contribution to our blog reflects and reinforces our conference theme of communal resistance to artificial systems.

Join the Resistance on July 31 & Aug 1

Are you registered yet? If not, use this link. On Wednesday (July 30) you’ll receive instructions on how to access the conference sessions. But only if you’ve registered!

Check out our conference schedule! Head over to our conference page to preview the schedule.

And check out our conference preview! In case you missed our blog post a couple weeks back, we gave an overview of our keynote speakers and some of our great sessions.

There’s also a book club! We’re reading Empire of AI, by Karen Hao, and we’ll be holding our book club conversation at the end of our first day of our conference: Thursday, July 31 at 3:00 PM Eastern Time. See more on our events page.


Post Contributors:

Chris Miciek, Director, Center for Career Success, Thomas Jefferson University

Amy Smith, Past President, Eastern Association of Colleges & Employers

Monique Sample, Student Programs & Employer Partnerships Specialist, Virginia Commonwealth University


In alignment with this year’s Civics of Technology Conference theme, we chose to highlight how three career development professionals have actively cultivated communities of resistance and reflection within university career services, spaces often overlooked in broader critiques of technology’s impact on equity and labor. Over the past year, the National Association of Colleges and Employers (NACE), the leading U.S. organization for higher education career services and employer relations, has increasingly positioned itself as a champion of generative AI, offering a stream of monetized workshops and resources that promote its use in career development without meaningful critique. While AI is positioned as an inevitable next step, little space has been carved out to explore its social, ethical, and structural consequences, especially the ways algorithmic tools can reinforce inequities that contradict higher education’s stated commitments to equity and access.

Chris Miciek’s experience in the field illustrates how small but consistent actions can begin to push back against these inevitability narratives. For Chris, it started with a book review. (Well, it started 20 years earlier, building the first completely online college career center, but for the story now, it starts with a book review.) Chris has been reflecting on decades of work leading digital initiatives in career services. But the book review of Amy Webb’s The Big Nine, published in the NACE Journal in 2019, became a strategic entry point to start conversations that other avenues, like conference proposals, had long resisted. By sharing ideas through journals, blog posts, webinars, and committee work across both NACE and the Eastern Association of Colleges and Employers (EACE), Chris built relationships and visibility that allowed questions about automation, algorithmic bias, and professional ethics to surface more broadly. This work was not flashy and often went unnoticed, but it created enough of a foothold for like‑minded colleagues to find one another. This is how Chris and Monique encountered each other, through a chance dinner conversation at the 2023 EACE annual conference. In the second half of 2023, fresh off being the only gen‑AI skeptic in NACE’s four‑webinar series on AI, Chris was invited to sit on a panel and create a 3–4 hour workshop on AI for the Metropolitan New York College Career Planning Officers Association. That was when the push for conversations shifted to coalition building. At that point, Chris connected more deeply with Amy Smith, who joined him for that panel and workshop, where they in part led colleagues through understanding and asking questions to address three major concerns with gen‑AI: bias, privacy, and the question of where, exactly, are the humans?
Amy and Chris stressed to their colleagues that it is the responsibility of institutions and their leaders to ensure that all of these questions can be answered, and, beyond that, it is the responsibility of all to ensure not only that the answers are upheld but that the questions continue to be asked with each iteration of a system or process. These questions, and their combined desire for reform, evolved into ongoing collaborations between the three to design workshops for Young, Smart and Local, NACE’s Annual Conference, and NACE’s HBCU & Inclusion Summit, as well as collective strategies for workforce development spaces more broadly.

Amy often explains that coalition building is only the beginning. Resistance had to become actionable. Like many middle managers, she experienced firsthand how institutions often push new technologies without transparency. Logging into her work email one day, she was met with (and ignored) a ‘Try Copilot!’ prompt. After she continuously ignored it, a few months later Copilot loaded on launch and could not be removed, with any individualized settings greyed out as ‘set by your IT administrator.’ This loss of agency became the catalyst for Amy to craft a set of questions that could guide ethical engagement with AI tools. Again, these questions focused on three key areas: bias, privacy, and the presence of human judgment. “What data was used to inform this system?” “Who built it?” “How will you mitigate bias and discrimination?” “Is your data safe?” “Who is responsible for protecting data, and for what reasons was that decision made?” “Are we doing right by those to whom we are responsible (students, partners, consumers)?” “Is using this system taking away a human experience?”

Amy emphasizes that, at a minimum, institutions must be able to answer these questions and continue asking them as technologies evolve. The stakes go beyond operational efficiency; they define whether higher education fulfills its responsibility to students, colleagues, and the public. Ultimately, gen‑AI has rapidly become a lens through which organizations and institutions will be viewed. Their answers to these questions will reveal how much of their shift toward gen‑AI has been done with intentionality versus a knee‑jerk need to keep up with the technological Joneses. The challenge of resisting and working to change these very systems has often become the focus of conversation between Amy, Chris, and Monique.

For Monique, the urgency of the questions she, Amy, and Chris pose is grounded in equity. Career services, she observes, are often left out of institutional AI conversations or treated as enthusiastic adopters whose only role is to prepare students for a future of work defined by others. But when the future being built is one we haven’t collectively interrogated, and one that continues to marginalize the very communities we’re fighting for, then silence and neutrality are not an option. Monique, sitting at the intersection of student identity development, workforce preparation, and institutional culture, has been warning colleagues that career centers, particularly those serving Black, Brown, disabled, first‑gen, queer students, and any other marginalized group, cannot afford to adopt AI tools without examining who these tools were built for and who they exclude on the backend (those doing the building). Scholars like Dr. Safiya Noble have long warned us that technology is not neutral. Her book, Algorithms of Oppression, reveals how search engines and algorithmic systems reinforce racism and sexism. Dr. Ruha Benjamin, through Race After Technology, reminds us that the technologies we call “innovative” often automate the very inequities we say we want to disrupt. When these warnings go unheeded, AI tools can harm the students career services professionals most need to support. Monique has seen it happen: automated resume screeners filtering out community‑based work experience, AI tutors “correcting” dialects rooted in culture, mock interview tools rewarding only narrow expressions of “professionalism.” These systems, built under the banner of “efficiency,” continue to perpetuate harm, automate inequality, and replicate colonial structures.

Amy, Chris, and Monique are clear that they are not anti‑technology or anti‑AI. They are anti‑harm. Monique’s work focuses on building programming and career development experiences that foreground identity and lived experience, ensuring that marginalized voices are not erased in the rush to adopt new tools. Reflecting on the work she, Amy, and Chris are doing, she notes that they position themselves not as tech experts but as stewards of student futures. Faculty may not always see us as educators, but we are, and we are guiding students through some of their most formative and vulnerable moments of identity, purpose, and decision‑making. If we aren’t at the table shaping how AI is integrated with and for our students, we’ll be forced to clean up its consequences, often alone and often with downsized teams, because our institutions thought AI technologies could replace human support roles.

Taken together, these experiences show that resistance does not always look like refusal. Sometimes it looks like slowing down, asking better questions, and holding the line until the tools and the institutions catch up with their values. Chris demonstrates how to open conversations even when formal structures close them off. Amy provides a framework for ensuring institutions remain accountable. Monique reminds us that equity must anchor every choice we make. University career centers sit at the nexus of education and work for our students; they cannot afford to be silent. By connecting across campuses and professional associations, we can push back on the inevitability narrative and ensure that technology adoption is thoughtful, ethical, and human‑centered. Career services can be more than a channel for labor‑market compliance. It can be a place where students learn to interrogate systems, reclaim silenced voices, and shape futures on their own terms.
