September 18, 2023

ChatGPT does take-home assignments – but can it do citations?

Posted by Laura Guertin

I have never seen a topic generate such a flurry of discussion and faculty development workshops as the arrival of ChatGPT. Despite all the energy and engagement around the topic, faculty reactions have been wide-ranging. Some faculty are dismissive of the impact ChatGPT will have, while others have been quick to change their assessment strategies, as detailed in this Chronicle column on August 24, 2023:



The article describes how an instructor of political science had his high-school daughter enter several of his take-home essay questions into ChatGPT. She quickly figured out that by refining her inputs, she could get the machine to produce an essay that was narrower in scope and more accurate for the original prompt. Some of the golden lines from this article for me include:

  • Academic writing has never simply been about producing good papers.
  • If the students are cheating, it’s because we’ve failed. The solution must be that we need to work harder.
  • At a certain point, it [responding to take-home essays] risks demanding too much of the students: expecting a superhuman effort on their part, just for the sake of proving their humanity.
  • The issue is not punishment but pedagogy.

In the end, the author decided to have all writing, including for mid-terms and final exams, completed in class.

The Chronicle has continued publishing articles relating to ChatGPT. A September 12 essay by Michael Clune argued that AI Means Professors Need to Raise Their Grading Standards. I was struck by this quote:

“…I believe that two related pre-existing problems in higher education have made a technology that ought to be a useful tool appear to many instructors as an existential threat. The first is the phenomenon of grade inflation. The second is a lack of clarity about what we want from student writing.”  —  Clune (September 12, 2023)

But what really struck me (and was perhaps my wake-up call about ChatGPT, since I have students engage in writing science narratives and science storytelling) was an email from one of my campus faculty librarians. She was also sharing a Chronicle article, and her email said: “What happens when ChatGPT is asked to find sources on a particular topic? Spoiler: it makes them up.” To me, this article is a must-read: No, ChatGPT Can’t Be Your New Research Assistant by Maggie Hicks (August 23, 2023). I learned that ChatGPT can perhaps write, but it can’t find sources. It certainly (at this time) can’t access paywalled articles. The article gives an example of what ChatGPT produces when asked to provide three academic articles on great white sharks. The citations may list real scientists as authors and real journal names, but read carefully – and check out those DOI numbers…
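That advice to check the DOI numbers can be acted on directly. As a minimal sketch (this pattern and approach are my own illustration, not something from the Hicks article), a short Python script can first check whether a cited DOI is even well-formed, and the commented-out portion shows how one could go further and test whether the DOI actually resolves at doi.org:

```python
import re

# A syntactically valid DOI begins with "10.", a registrant code of
# four or more digits, a slash, and a non-empty suffix.
DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

def looks_like_doi(doi: str) -> bool:
    """Return True if the string matches basic DOI syntax."""
    return bool(DOI_PATTERN.match(doi.strip()))

# Syntax alone cannot catch fabrication -- a made-up DOI can still be
# well-formed. To test whether a DOI actually points to a real record,
# one could send a request to https://doi.org/ (requires network access,
# so it is left commented out here):
#
# from urllib.request import Request, urlopen
# from urllib.error import HTTPError
#
# def doi_resolves(doi: str) -> bool:
#     req = Request(f"https://doi.org/{doi}", method="HEAD")
#     try:
#         urlopen(req, timeout=10)
#         return True
#     except HTTPError:
#         return False

print(looks_like_doi("10.1126/science.aaw9670"))  # well-formed, prints True
print(looks_like_doi("not-a-doi"))                # prints False
```

A syntax check like this catches only the sloppiest fabrications; the resolution step is the real test, since a DOI that fails to resolve at doi.org is a strong sign the citation was invented.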

I don’t have any solutions to suggest for addressing ChatGPT with students or for designing assignments that outsmart it. Duke University Libraries has a blog post on ChatGPT and Fake Citations that can serve as a model for us to look at. If anything, the value of information literacy has been elevated even further, underscoring how critical it is to help students learn content and develop skills around information.