We are anxiously in between the release of ChatGPT and its full effects on education, unable to imagine academia without it and unable to fully envision academia with it. And it is exciting. Artificial intelligence will prove transformational: a tool both students and professors must embrace.
But conversation and collaboration must remain at the forefront of how we learn. We must not rely on simply being told answers and seemingly correct information; as Harvard Professor Anna Wilson reminds us about ChatGPT, "it's not actually intelligent."
"You have to stop thinking that you can teach exactly the way you used to teach when the basic medium has changed," stated Houman Harouni, Harvard Graduate School of Education lecturer, in a GSE article written by Elizabeth M. Ross. While most Harvard professors agree that AI is here to stay and can benefit teaching, they disagree on what role it should play in the classroom. Artificial intelligence has already been integrated into our courses in ways that have rendered certain traditional methods obsolete, changing both which skills are emphasized in the classroom and how students demonstrate their knowledge. The future of our Harvard education and courses remains uncertain. In this uncertainty, we must continue to learn from other living humans, not our screens.
We should all be immersing ourselves in the benefits of ChatGPT. Immediate access to knowledge is at our fingertips, and that accessibility transforms productivity. It can be a wonderful aid for students, answering their questions in layman's terms and effortlessly scouring the web for evidence and sources. It can take on a persona to challenge our thoughts, ask us questions, and even incentivize us to 'think' deeper than it does.
But ChatGPT does not really think. It simply scans and regurgitates information from the internet, unable to provide an original idea or be imaginative, and, much of the time, it is wrong or spits out unintelligible responses. Even when it does give perfect answers to college-level math problems or produce a concise, factual essay draft, students are told how to do something rather than learning why. It's a stark contrast to the inherent curiosity of humankind.
Indeed, generative AI confronts higher education with a question: which skills are worth learning, and which can be automated? Harvard Dean of Undergraduate Education Amanda Claybaugh remarked, "I think the answer is going to vary by discipline. For instance, it may well be fine for students in science courses to do the labs themselves then rely on generative AI to write up the results. But in a literature course, the writing is inextricable from the thinking and probably shouldn't be automated."
Unlike Google, ChatGPT understands user input, blurring the line between clarifying questions and finalized responses. Moreover, it is nearly impossible to detect when AI has been used, redefining what cheating is: current technology is only 26 percent accurate at detecting AI-written text. Harvard's new guidelines for ethically using generative AI gave professors free rein; course syllabi had to make their policies clear. Informational sessions held last August provided some structure for instructor use in STEM and writing courses.
The Economics 10 series, taught by Jason Furman and David Laibson, dropped an essay assignment after an experiment showed that ChatGPT's work could pass Harvard classes. Furman tweeted, "Sadly, we are planning to drop the essay this coming year, in part because ChatGPT has reduced the marginal net benefit that comes from this essay." But where do we draw the line? Do we take away all the minor assignments that previously forced students to think?
Head section leader David Martin affirmed, "adding ChatGPT into the equation on that essay … tipped the balance of the cost-benefit analysis. If there is one part of the course that we are least confident is going to be good for student learning, it would be that part." To preserve academic integrity, assignments must push students one step beyond what generative AI can produce in a second. Unlike writing, problem sets demand that students work through the process themselves; in a time crunch, generative AI lets them skip that productive struggle with quick answers and thorough explanations.
ChatGPT posed a seemingly immediate threat to writing. At Harvard, the first-year Expository Writing course requirement fully banned the use of any artificial intelligence at any step of the writing process to emphasize its challenges. But for English students, Wilson explained, "they want to go on to be creative writers. I think that a lot of English teachers at Harvard are sort of not really actually that concerned about students somehow using AI to take shortcuts because they know that our students do actually want to learn those skills." While the curiosity and academic integrity of Harvard students may reduce the outright use of ChatGPT for writing essays, students are inevitably using and relying on it.
Other professors integrated and encouraged AI use. Harvard Professor David Atherton's General Education 1067 class explored artificial intelligence's capacity for creativity in one unit even before the release of ChatGPT, and easily expanded to integrate it. Regarding his other courses, Atherton says, "I think I will try to incorporate it a little later in the semester around Japanese poetry and trying to understand how Japanese poetry works. The students in that class actually produce their own poetry anthologies as one of the assignments. And I'm gonna make it very open to them if they want to use AI as part of that process there." Artificial intelligence can be a learning tool and a way to teach students how to think, but it will also impact whom students turn to for help, and whom they do not.
The CS50 AI Duck is a great example. Released in the spring of 2023, the on-demand duck poses as a virtual teaching assistant, guiding students through understanding and debugging code specific to the course. Instead of producing code as ChatGPT would, the duck attempts to guide students to the right logic. ChatGPT's ability to reproduce basic code parallels its dangerous ability to craft essays.
While turning toward a computer for help might seem to disrupt student-teacher relationships, it might just transform conventional learning methods into something new altogether. "Whether AI replaces human teachers is ultimately up to us humans," Harvard Computer Science Professor David Malan said, "but I do think AI is poised to amplify the impact of individual teachers. If so trained, AI could effectively enable teachers to help all the more students in parallel, at all the more hours, in all the more places."
In their learning, students crave efficiency and maximized productivity. Generative AI offers a wonderful tool for saving students time; however, it must not come at the expense of displacing conversation or threatening academic integrity. We must remember, and fight to protect, the most valuable aspect of education: discussion and collaboration with other people.
*Quotes have been adjusted to account for filler words and grammatical correctness.
Meena Behringer ’27 (meenabehringer@college.harvard.edu) loves to play games with ChatGPT.