
Why AI Policies Differ Across Syllabi: Faculty Explain Their Classroom Approaches to Generative Technology

Published by Ashley Vanderhoff on March 3, 2026

By the fourth week of an eight-week networking project, a few students were already finished — not haphazardly, but with full understanding. They had built and designed an entire network from scratch in half the time, using artificial intelligence as an assistant. 

Other students in the class stared at work they could not explain, their uncritical use of shortcuts leaving them struggling and leaving their professor to untangle answers they had never fully learned.

The assignment was part of lecturer Kip Carlson's networking class, where students were encouraged to use AI to assist with configurations, mirroring how AI is used in the networking industry.

But that divide among students, which echoes the broader question of whether AI will enhance or hinder learning, extends beyond one classroom.

AI policies for students differ across courses and departments at Elmhurst University. These policies have always been left to individual faculty members. However, since J-term, the university has provided three supported policy options that professors may adopt when creating their syllabi, ranging from prohibiting generative AI to allowing much broader integration.

Professor and Associate Dean of Faculty Kimberly Lawler-Sagarin helped author the policy language after faculty requested standardized statements. 

“It turns out that people were very receptive to having some written out policies to choose from,” said Sagarin. 

“The idea that we could have one size fits all was not really going to happen,” she added. “I know AI is very pervasive and is changing a lot of things. But, faculty do decide on the parameters for their other assignments — on whether it’s an open book, open note, or take home test. We’ve always made those rules around our assignments and assessments.”

Carlson, who teaches graduate courses on AI and hopes to bring the subject to more undergraduates, understands both the opportunities and the concerns the new technology poses.

“The real challenge is helping students learn to think with AI — to use it to accomplish more — without surrendering their cognitive skills,” said Carlson. “If they give up the thinking, AI does more damage than good. Used properly, however, it becomes a powerful amplifier of learning and creativity.”

“I spend a lot of time working with the students trying to point out that they still need to keep themselves in the loop, and they need to think about what they’re doing,” he added.

Carlson believes that EU should take on more of a leadership role when it comes to AI.

“I’ve been screaming at the top of my lungs that Elmhurst needs to get more involved with this, and they’re moving at a snail’s pace,” said Carlson. “I think we’re doing the students a disservice by not being more forefront in showing them how to use this.”

Sagarin explained that opinions on AI use vary across departments, which is why classroom policies, and how faculty themselves use AI, differ.

Carrie Hewitt, a professor of psychology and dean of the School of Graduate Studies, allows students to use AI on certain assignments.

“I do want to encourage students to use it, but I also want students to use it appropriately, right?” Hewitt said.

Hewitt has found AI helpful for a range of student projects, from generating interview questions based on job descriptions to summarizing or formatting information. She also has her students use the library’s APA academic writing tool, which provides prompts and suggestions while they write. Because students submit multiple drafts, she becomes familiar with their voices, which makes it easier to identify any overuse of AI.

“The one thing though, that I do caution students with AI is that they do have to double-check things. AI is not 100% yet,” said Hewitt. “There are things being pulled from different sources on the Internet that might not be correct.”

Hewitt explained how AI skills are increasingly relevant in the workplace.

“I thought about workplaces and a lot of my classes. I mean, AI may take your job,” said Hewitt. “But really it’s going to be someone using AI that’s going to take your job.”

Many professors, however, remain skeptical of AI’s place in the classroom. 

Assistant professor and program director of digital media John Klein discourages AI use in his classes. 

“Our goal as professors is to teach students to learn how to be creative, to learn how to think for themselves, to analyze media and stories and what they are trying to say,” said Klein. “If students are resorting to using AI to give them those things without having to do the legwork…they’re not going to know if what AI is giving them is any good or not, or if it’s useful, or if it’s flat-out lying to them — because they haven’t built that muscle yet. Our job is to build that muscle, rather than train them how to write a prompt.”

Still, Klein believes that AI can be used sparingly.

“I’m not somebody who is 100% negative on AI all the time,” Klein said. “I think there are places in which AI is a time saver that allows people to be more creative, that allows people to find ways to contribute beyond simply the task at hand.”

Students continue to encounter classes that both integrate and prohibit AI as professors navigate its evolving capabilities.

Sophomore Grace Volkmar was allowed to use AI in her math class last year — this semester, all of her professors have banned it.

“I find more often that AI is not allowed,” said Volkmar. “I kind of understand why you wouldn’t use AI for writing, because you want to know your own thoughts and not the thoughts of computers. AI is likely to get stuff wrong too when you ask a question. So, I personally don’t really use it.”

Kip Carlson’s networking project reflects the range of approaches described by faculty. With policies shaped at the course level, students continue to encounter differing expectations as instructors determine how, or if, AI fits into their classrooms. 

“We just have to keep doing what we do as professors,” said Klein. “We have to experiment. We have to try new things. We have to care about our students enough to give them the patience to try and find that for themselves. There’s no easy answer to it.”

© 2026 The Leader. All Rights Reserved.