Since the release of ChatGPT in November 2022, large language models and other generative AI (gAI) tools have surged in use and popularity. Generative AI has become particularly relevant in the field of education. Now that students have access to a system that can write essays in minutes and condense books into a few short paragraphs, the question of gAI’s place in the classroom is one that cannot be avoided.
Here at EMU, faculty are figuring out what gAI means for their own classes. Daniel Showalter is the program director for Math and Computer Science; he also teaches math, computer science, statistics, and a section of this year’s first-year CORE 103 class. He says his policies around AI are evolving, but he doesn’t want to ban AI outright in his classes. “There’s been some research that shows … when it’s banned, the professor takes more and more of a policing role,” he said, adding that this can erode the trust between students and professors. Showalter also sees the potential benefits of AI as a learning tool: “In my math course I do want to, with students, figure out what’s the best relationship [with AI].” He was careful to note, however, that his stance isn’t simply pro-AI, and that he is aware of the dangers. “I think AI can be used very irresponsibly and in damaging ways,” he said, but “AI literacy [is] what I care about.”
This position is echoed by Carey Cole, an adjunct professor new this year to EMU’s School of Sciences, Engineering, Art, and Nursing. Big Data Analytics is the first class he is teaching at EMU, with Software Engineering to follow in the second session. Cole believes teaching AI literacy is important, remarking, “I don’t see AI going away … When you graduate you’re gonna be expected to use AI to make yourself more efficient.” He encourages responsible use of AI on his assignments: “I still need [students] to have a good comprehension of the material so that [they] still can have that expert, final say when [they] use AI.” He does not allow gAI on tests and has decided to give his tests on paper this year because he knows “the temptation might just be too great for some students.” Fundamentally, Cole believes that “it’s really important to be familiar with [AI] and how to use it in a good way.”
Associate Professor of Religious Studies Heike Peckruhn has a different mindset when it comes to gAI in the classes she teaches. Peckruhn does not allow students to use AI in her religion and ethics classes, in which much of the work is reading and responding to texts. She wants students engaging directly with readings rather than having AI summarize them. “If you don’t know how to think through something, you won’t know if AI is giving a good answer or is wrong.” One metaphor she used: it’s like “[going] to the gym and [using] a forklift. You moved the weights but didn’t build any muscle.” She clarifies that she thinks there is “room for AI” in certain courses, but that she “[doesn’t] teach those courses.” She is “more interested in [students’] personal reflections.” Although her classroom policies may differ from Showalter’s and Cole’s, she seems to agree that understanding AI, and how and when it is useful, is important for professors and students alike.
The nuances of AI are not lost on students. Willem Hedrick, a junior Digital Media and Communications major, sees both the benefits and drawbacks of gAI in his work. He says that “especially in the digital media space, I’m reluctant to use AI,” stating, “My concern with the advancement of [AI] video is it taking the place of video production.” Hedrick does believe gAI can be useful in his workflow: “When it comes to checking over [essays] for errors, or just helping rephrase specific things, I think that’s a good way of using [AI].” His current internship in Washington, D.C. even encourages him to use AI tools to be more efficient. The “ethical dilemma,” he says, would come if an employer wanted him to use AI to create videos.
Senior Art and Psychology major Daisy Buller has her own concerns about gAI. “I think it can be useful,” she states, but she worries that many will simply use it as a shortcut to quick results, and that its popularity may lead to a loss of important critical thinking skills. “In certain fields it makes more sense to use it than others … [but] it makes less sense in mine, art and psychology.” She also mentioned that she thinks “there’s not enough emphasis on how bad it is for the environment and how much water it takes to run all those servers.” Sustainability is an important part of the conversation, especially if AI usage becomes standard industry practice. When it comes to the classroom, the sentiment seems to be that AI can be a tool, not a shortcut to the right answers.