Academic papers written by AI get a solid B—but is it cheating?

 

The rise of ChatGPT is challenging our approaches to teaching and assessing writing in the liberal arts and sciences, writes SRI Faculty Affiliate Rhonda McEwen, who argues that generative AI technologies are here to stay and should be part of the world for which we prepare our students. This article appeared in The Toronto Star on January 22, 2023, and is republished with permission.


On the heels of major universities and colleges banning the use of a new, extraordinary artificial intelligence tool called ChatGPT for academic papers, we should challenge ourselves to reconsider the implications of, and our responses to, this technology.

I propose that it changes the game and that we need to redesign our play. My social feeds exploded over the holiday break as my academic colleagues faced the real-life implications while grading papers. ChatGPT had been launched a month earlier, and it was good, very good, at writing student essays. Students asked their computer questions about their essay subject matter and received answers that resulted in a paper with a solid B mark.

The panic and outcry are predictable. For many academics, the sophistication and accuracy of the essays made it nearly impossible to identify cheating students and threatened to undermine the foundational principles of our education system.

But many students don’t consider using AI to write their papers to be cheating. I understand the ambiguity. That is probably not the anticipated response from the president of a university; however, as a leading STEM researcher, I think it’s worth considering the extent to which this AI is fundamentally changing the game and how we could adapt.

AI will draft legal briefs, business proposals, contracts, procedure manuals, and perhaps stories in this very newspaper. What are the implications if and when AIs like ChatGPT become as ubiquitous as spell check? We will still need the analysis, scrutiny, context-awareness, and humanity of people in creating these materials, but the technology is now at a stage where it will make this work faster and increasingly useful across many domains.

In academia, instead of focusing on how to distinguish AI-generated papers from those written by traditional means, we all need to take a seismic leap into a future that will demand an even more creative and analytical approach to information. Writing competencies will continue to be a key curriculum focus; however, teaching new digital literacies, such as taking an AI draft, locating its deficiencies, and improving on it, may be a necessary additional competency. Moving the AI draft from a B to an A-plus can be a goal.

When digital calculators hit the market, there was an outcry from mathematicians. Like ChatGPT, they were designed for the wider world but impacted classroom teaching.

Over the past decades, computer scientists, linguists, mathematicians, philosophers, and engineers have worked across domains to improve the computational prowess of machines. From banking to law enforcement, the capacity of algorithms to predict—and importantly, to learn how to refine their predictions—has shattered expectations.

 

The rousing call to build tools that root out the AI in an essay and penalize the student for augmented work leans on the historical tenet that we protect the rights of authors by requiring attribution through citation. I agree that originality should be rewarded and that citation is a fundamental principle. We need to teach skills to assess where the data comes from, recognize the authors, and question who or what is missing from the data. We can consider the chatbot a compiler of data that prepares its output in a naturalistic format, something like a first-draft engine.

I offer a provocative suggestion: if tertiary education is about developing thoughtful, creative, and analytical people for any later pursuit, then these types of technologies, which are here to stay, should be part of the world for which we prepare our students. Technologies make some things obsolete and drive the development of new skills.

In programming, we teach students not only how to write code but also how to reuse code that has already been written. This is a moment to challenge our approaches to teaching and assessing writing in the liberal arts and sciences, but it is also a moment to reflect on how this game-changing technology will impact all of our lives.


About the author

Rhonda McEwen is president and vice-chancellor of Victoria University in the University of Toronto. She holds the Canada Research Chair in tactile interfaces, communication, and cognition and is leading a research project on AI language models using a forerunner of ChatGPT.

