
Five key insights into ChatGPT students need to know

by Brendan Henry
published April 16, 2024

ChatGPT has taken the world by storm in a relatively short time. OpenAI, the trending chatbot's creator, released ChatGPT as a public demo in late 2022. Since then, educators have been forced to discuss the artificially intelligent elephant in the room.

Discussions have centered on finding a balance between the fear of compromising academic integrity and the potential of artificial intelligence as an instructional tool.


Editor’s note

This story was produced in partnership with the Department of Journalism and Media Communication as part of a special class in which students get hands-on experiences developing and producing content with the Division of Marketing and Communications for SOURCE, the news website of CSU.

In February 2023, the Provost’s Ethics Colloquium held a public discussion at the Lory Student Center at Colorado State University regarding the impact of ChatGPT on academics. Matthew Hickey, the associate dean for research and graduate studies in the College of Health and Human Sciences, moderated the discussion and provided his own insights and advice for the ever-evolving chatbot.


1. Understanding the potency of ChatGPT and AI

Hickey notes that it did not take long for ChatGPT to reach its current level of proficiency. Since its early beta days at the end of 2022, the chatbot has advanced to the point where it can crank out an entire essay in less than a minute, carry on a comprehensive and intelligent-sounding conversation with the user and even write code. Given how quickly the program has developed, Hickey cites the uncertainty that he and his colleagues have faced, especially around academic accountability mechanisms.

Not only is ChatGPT a concern academically, but it has already resulted in job losses, particularly in writing-centric industries. ChatGPT and similar programs have become concerning enough that leaders at OpenAI and other AI experts have signed a statement on mitigating the risk of human extinction from AI. The statement places that risk on par with pandemics and nuclear war.

2. ChatGPT should be a tool, not a ghostwriter

While using ChatGPT to write an essay for a class is not exactly comparable to nuclear war, it still takes a toll on the student. Hickey says that copying and pasting from ChatGPT can cause intellectual harm.

“We’ve got to create a climate on campus where the students themselves can realize that, ‘I’m actually doing myself harm because I’m not learning, and that’s gonna hurt me when I’m going out looking for a job, etc.,’” Hickey said. “Because we’re not going to be able to rely on these large language models and in every setting we find ourselves in for the rest of our lives.”

3. Transparent and mindful use of ChatGPT

Hickey does not feel the need to ban ChatGPT from his classroom but instead expects transparency from his students if they use it as a tool. He wants to be sure that his students are turning in their own writing and not dragging and dropping entire sentences or sections from the program, something he considers a violation of academic integrity. Hickey says it can be beneficial to probe the chatbot as if it were another conversational partner in a classroom setting.

While Hickey allows the use of ChatGPT in his classes so long as his students communicate their use to him, he knows that some educators on campus may not share this policy and may disallow the chatbot altogether. In those cases, he feels students should respect the rule and refrain from using it.

4. ChatGPT runs on an illusion of intelligence

A glaring issue raised by Hickey and many other users is how good ChatGPT is at making things up while sounding confident about them.

“ChatGPT and other large language models are trained to generate plausible texts,” Hickey said. “It’s not trained to deliver truthful text. It’s just plausible.”

Hickey cited a physician's use of ChatGPT to model pulmonary blood flow responses to particular stressors. What the chatbot generated was completely implausible for humans, but for medical students who may not know any better, the results could be misleading and problematic. That is why Hickey stresses that while ChatGPT can be used as a tool, it is still full of flaws, and its output requires fact-checking.


5. Artificial intelligence is here to stay

Technology continues to grow and develop, and AI along with it. There is no avoiding it, and ignoring that reality, as Hickey suggests, will not be beneficial. Educators will likely implement AI tools like ChatGPT into their courses, as Hickey and other instructors already have.

This goes beyond CSU. Abram Anders, the associate director of the Student Innovation Center at Iowa State University, has also been working on AI tools for the classroom. Anders was a keynote speaker at the Provost's Ethics Colloquium last November.

“You don’t have to be a technical optimist or an AI optimist to engage these tools,” Anders said at the Provost’s Ethics Colloquium. Anders referred to a student of his pursuing a career in creative writing who initially held a pessimistic view of AI, fearing it would hinder their career prospects. Over the course of the class, however, the student realized that AI still requires human guidance, easing their concerns about its impact on their career.

Regardless of the future steps universities and educators take, ChatGPT and AI programs are here to stay.