Career Services Working Group Addresses AI at Colorado Boulder

January 16, 2024 | By Kevin Gray


TAGS: ai, best practices, career development, nace insights, technology

When Dylan Mark returned to the University of Colorado Boulder Career Services Office at the conclusion of paternity leave, AI had gained a solid foothold in campus life.

“I felt like I was coming out from underneath a rock,” recalls Mark, the professional experiences program manager in the university’s career services office.

“Finding out about the power of ChatGPT and the ways it was being deployed on campuses and throughout the country was mind-blowing, and I'm someone who’s always interested in tech and data and how all that works. It was fascinating to see how quickly ChatGPT took off and I immediately found uses for it in my personal life.”

Mark also envisioned uses for generative AI in his work with college students. To share his experience and learn from those of others, he suggested that the career services office develop an Artificial Intelligence Working Group.

“We needed to keep up with the trends of where it was going because it was taking off so fast,” Mark explains.

“You could see its rapid progression and wide applications. It made sense to have a team of us looking at it and assessing where we are; whether we’re keeping up with it; how it’s impacting our office, our students, and the workplace; and how it could impact us going forward.”

The working group is composed of up to 10 staff members at different levels from Colorado Boulder’s career services office. They meet to review AI in terms of current trends and how it’s being discussed and addressed in the career services space, in the professional workplace, and more.

“It’s important to talk about AI from different lenses,” Mark explains.

“Our working group has members from our employer team and our student advising team. I'm on our internship side and we have leadership in our office involved, so we have folks representing basically all corners and facets of our office to provide their own input and experience.”

The group meets several times a semester. The work it has done includes:

  • Soliciting input from campus faculty or staff who are experts in AI;
  • Conducting research online on how AI could be applied to career development;
  • Informing the broader office about their discussions and findings;
  • Sharing AI professional development opportunities and helpful resources; and
  • Writing an article about the effective use of AI.

“We’ve had discussions with several experts in AI and done some reading on the topic. One recurring theme is that if you are not using AI correctly and effectively, then you’re falling behind,” Mark notes.

“The people who do use it will be advanced among their coworkers and in their careers because of what they will be able to accomplish in a short amount of time.”

For example, Mark’s intern—who he points out is an “extremely brilliant” computer science major—was not well-versed in Excel. Together, they were working on a project for the career services office that involved pulling information and data to create a more seamless report.  

“I have some Excel skills, but I'm not equipped to do more challenging equations, coding, and formulas,” Mark admits.

“[The intern] went into ChatGPT and, within 20 minutes, produced a formula that could handle the kind of work that would have taken us many, many hours to do. It was an eye-opening moment of realization that harnessing this technology is a skill for the future, even if you're not actively using it every moment of your day. Having that kind of skill going forward is going to be critical for students professionally because it will elevate their efficiency.”

Even in the classroom, some professors at colleges and universities across the country encourage students to use AI to produce results and then critically analyze those results.

“The critical analysis and communication skills still come into play as students are encouraged to use AI because these professors see value in having access to this tool, whereas other professors are banning it from courses,” Mark says.

“This isn’t going away, so there are benefits to teaching students to leverage AI, but to remain authentic and true to themselves.”

To do so, Mark envisions students plugging information into an AI tool, critically analyzing the output, and reworking it into their own words.

He says: “I do think it is important to use it, because the more you use it, the better you will become at using it and the more efficient you will become in transforming it to be your own creative and productive tool. This is the way work will be performed in the future, so students who learn how to leverage AI now will have the advantage when they enter the workplace.”

Mark cites authenticity, along with privacy, ethical concerns, and accuracy, as the biggest challenges at this early juncture in the evolution and adoption of AI.

“How do you use an AI tool in a way that still portrays your authentic human self?” he asks.

“And how do you use it in an ethical way? You could apply to 30 jobs in an hour if you used AI to write all of the cover letters, but there are limitations and pitfalls to avoid, such as typos, misspellings, and false narratives incorporated into a response. How do we communicate to students that this is a tool we want you to use, but also that we're concerned about your use of it? That has been one of our challenges.”

Last spring, Colorado Boulder career services informally experimented to see if the college community could identify a cover letter written by a human versus one generated by ChatGPT. A student created three cover letters: one completely written by him, a second fully generated by ChatGPT, and a hybrid one combining the cover letter he wrote with elements of the one created by ChatGPT.

Mark and his career services colleague sent the letters out to people on campus, including some in the career community, to see if they could pick out the one fully written by the student.

“When people voted on which was the human-created cover letter, there was a little bit of a lean toward the real one, but it was surprisingly close to a one-third split on all the versions,” he says.

“A lot of people were thrown off. Ironically, the ChatGPT version had some more typos, but maybe that lent itself to people believing that it might have been his because he was a strong writer.”

Mark has several recommendations for addressing AI, including:  

  • Realizing that this is not just a fad—Although it can seem daunting because it’s so new and is moving so rapidly, AI is a technology that will not be going away. It is best to address and approach it strategically, incorporating a range of viewpoints and functions into the process to maximize its capabilities in on-campus work and for students entering the workforce.
  • Being careful about what you share—Privacy is a very real concern with AI. If you are going to use it for resumes, limit the types of personal information included and the way it is included; for instance, don’t give ChatGPT information you wouldn’t want shared publicly.

“I also encourage others to explore AI’s capabilities,” Mark adds.

“AI doesn't show up only in generative AI tools like ChatGPT. It shows up in many ways, such as within resume-building tools, tools that combine regular photos and selfies to create professional portraits, and many more. What we can imagine today will soon be reality, so be ready for it.”

Kevin Gray is an associate editor at NACE. He can be reached at kgray@naceweb.org.
