
AI-generated image based on prompts “diverse group of college students sitting in classroom writing” and “students sitting in classroom of the future writing,” blended in Photoshop. Credit: Microsoft Bing Image Creator, prompted by Everett Hogrefe and Jayme DeLoss  


What does AI mean for higher ed? It’s complicated

story by Jayme DeLoss
published Aug. 31, 2023

How will AI affect the way we learn and the way we interact with one another, and how can it be used in the classroom to benefit students? These are questions educators everywhere are grappling with – along with concerns about students relying too heavily on tools like generative AI. In the face-off between artificial intelligence and academic integrity, which “AI” will come out ahead, or is it possible to have both?

As CSU’s director of academic integrity, Joseph Brown has spent a lot of time thinking about how AI will impact higher education.

“Students are beginning to integrate AI into every facet of their lives, and of course, that includes how they learn,” he said. “The conversation among higher education institutions is moving toward, ‘How can we harness what is clearly a monumental technology to accentuate and improve student learning?’”

Brown is based in The Institute for Learning and Teaching, called TILT for short, which has put together an “AI Survival Toolkit” to help guide faculty on how they might handle AI in their classrooms. It provides information on the issue and tips for how to talk to students about it.

Brown said many students may be uncertain about when it’s OK to use generative AI technology for class, which is why he recommends that faculty set expectations around AI use at the beginning of the semester.

“If we’re clear about what our expectations are, I’m hopeful that the majority of our students will step back from using it in inappropriate ways,” he said.

While making sure student work is authentic is important to the University’s mission, policing student assignments generated by programs like ChatGPT has proven difficult.

“The output from GPT-4, the most advanced version, is undetectable by plagiarism detection software,” Brown said. “However, there are common-sense things that we have noticed when students use it in their writing, such as so-called hallucinations, or moments where the program gives you content that we know is demonstrably false.”

Brown said norms and expectations for how student work should be completed are changing with this new technology, and he expects they will be very different in just a few years.

“What we need to do is figure out how we can coexist with this technology without losing the core value of academic integrity.”


Same old problem, shiny new technology

The ethical questions raised by AI are not new, observes Matt Hickey, professor, University Distinguished Teaching Scholar and associate dean for research and graduate studies in the College of Health and Human Sciences. In fact, he points out, the technology isn’t even new. Artificial intelligence has been around in some form since the 1950s.

“The issue of integrity about passing off work that’s not my intellectual property is not new,” Hickey said. “That’s an eternal challenge for higher education. ChatGPT makes it a bit more interesting, but it’s the same old problem wearing a new outfit of shiny technology.”

Hickey, who has moderated two installments of the Provost’s Ethics Colloquium on the subject over the past year, suggests that the way we respond to the questions and problems brought about by AI also doesn’t need to be completely novel. “The ethics perhaps don’t need to be radically modified.”


But he recognizes that the educational mission is becoming trickier in the presence of generative AI tools – and not because of cheating or plagiarism.

“There’s so much information out there, and we don’t want to just consume it,” Hickey said. “If we’re going to do well both in class and as professionals, we have to be discerning readers and interpreters, and then we have to be able to synthesize it and put it in context. We can’t simply regurgitate something that an AI device happened to share.”

CSU has been sensible in its approach to AI in the classroom, Hickey said, by offering resources and guidelines to faculty instead of asserting policies that would challenge academic freedom.

Hickey notes that technology has changed considerably in his 26 years on campus. “We’ve adapted to technology as ways to foster better learning environments for students,” he said. “I’m cautiously optimistic that we can continue to do so with ChatGPT.”

Janice Nerger, CSU’s interim provost and vice president for academic affairs, added: “I am excited about the future of AI, despite the potential challenges. Think of all the technology shifts over the past century alone and how they have revolutionized how we teach and learn, indeed, how they have revolutionized the world. For better or worse, humankind has learned to adapt and mitigate the consequences of emerging technologies. We are obligated as educators to embrace and implement new technologies in an informed and cautious manner while making sure our students are prepared for careers where AI will be an inevitable factor of how they live and work.”


Identity in the era of AI

Rosa Martey, a professor of Journalism and Media Communication, studies identity and social interaction in digital spaces. Martey said generative AI could go in both exciting and disturbing directions.

“I’m trying to be excited about this because the idea of having computers that can interact with us using regular language has been part of our imaginings of what technological systems would eventually be able to do for a very long time,” she said. “For things like games and other places where you know you’re talking to a computer, but it’s more fun and engaging when it sounds like a human, it’s going to be amazing.”

But she has reservations, especially about programs like ChatGPT and others that can generate accurate-sounding information that might be false.

“If you think of it as a search engine, you will be in big trouble because it is not looking up information,” she said. “It is predicting what text is most likely, in terms of mathematical probabilities, to be seen as the right thing to say in response to a specific query.”
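Her point can be illustrated with a minimal sketch. The Python example below uses made-up scores for a handful of candidate words – it is not output from ChatGPT or any real model – to show how a system that ranks continuations purely by probability can produce a fluent answer that is factually wrong.

```python
# A toy illustration (hypothetical numbers, not a real language model):
# a generative system does not look facts up; it scores possible next
# words and emits whichever continuation is most probable.

import math

def softmax(scores):
    """Turn raw scores into probabilities that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

prompt = "The capital of Australia is"

# Made-up scores for candidate next words, chosen only for illustration.
candidates = {"Sydney": 4.0, "Canberra": 3.2, "Melbourne": 1.5, "a": 0.3}

probabilities = softmax(list(candidates.values()))
ranked = sorted(zip(candidates, probabilities), key=lambda pair: pair[1], reverse=True)

for word, p in ranked:
    print(f'{prompt} "{word}"  (probability {p:.2f})')

# With these invented scores the model would confidently complete the
# sentence with "Sydney" – fluent and plausible-sounding, but false.
# It optimizes for likely-sounding text, not verified fact.
```

In these invented numbers, the likeliest word is also the wrong one – the same mechanism behind the “hallucinations” Brown described.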

AI and robots on a college campus

Martey said it’s important to distinguish ChatGPT from earlier tools that also seemed to undermine educational fundamentals. Decades ago, calculators and spellcheck were viewed as potential threats to education.

“Generative AI is different because it is creating output that is meant to replace the person’s act of creation,” she said. “Other tools, like spellcheck and Photoshop, are editors, in a sense. They’re helping you refine what you’ve created.”

Martey is concerned that generative AI programs will reduce students’ motivation to take intellectual risks and learn to use their own judgment. She worries that if students rely on ChatGPT, they will miss out on opportunities to develop their own voice.

She added that educators will have to rethink the work they give students in order to instill critical thinking skills.

“It’s going to take some thoughtful reflection on why we assign what we assign, how much we assign and where we can make adjustments,” Martey said. “Maybe there are ways other than writing to make sure students have the opportunity to think and practice and try out ideas, because ultimately these are very important foundations of learning.”


AI Research

Artificial intelligence isn’t science fiction anymore. This special report from SOURCE explores the importance of artificial intelligence research and what you really need to know about the potential and impact of this empowering, disruptive and complicated technology.