AI in education: Moving beyond academic dishonesty

Artificial intelligence (AI) has quickly become a hot topic in conversations about education policies. When it first started gaining attention in academic circles, many of us were concerned about how it might be misused — especially in terms of cheating. But does AI automatically lead to academic dishonesty, or can we approach it in a more balanced way?

At our college, Pikes Peak State College, we initially took a cautious approach, treating AI like a potential threat to academic integrity. This perspective is clear in our current policy, which states:

“The use of artificial intelligence (AI) for any written or spoken graded content is strictly prohibited unless explicitly permitted in writing by the instructor. Students must demonstrate knowledge, understanding, independence, and integrity in their academic work.”

This article is part of a monthly column provided by the Instructional Technology Council (ITC), an affiliated council of the American Association of Community Colleges.

However, as AI has become more integrated into everyday life, it became clear that this one-size-fits-all policy wasn’t cutting it. As co-chair of the ITC AI Affinity Group, and as a member of the Pikes Peak State College Technology Committee and its AI subcommittee, I have delved into the complex world of AI. What became clear is that there are countless ways people can use AI without simply asking it to write an essay. For example, what about students who use AI to help them communicate or generate HTML code? Or students who use it to organize their thoughts and brainstorm ideas? These uses don’t replace learning — they enhance it.

This realization led us to form an AI focus group at Pikes Peak State College, with the goal of exploring how to use AI as a tool rather than a threat. We want to shift the conversation from AI being something to avoid to something that can actually support student learning. As a result, we’ve proposed changes to the college’s AI policies that move away from a strict ban and instead explore how AI can be used ethically in education.

  • AI as a learning tool: AI should be used to help students think, not think for them. It’s great for brainstorming, outlining, coming up with ideas and making sense of complex problems — but the student still has to do the heavy lifting.
  • Responsible and ethical use: When using AI, students need to be upfront about it. AI’s contributions to academic work should be clear and shared responsibly.
  • Ownership and accountability: Students are fully responsible for their final work. If AI generates a mistake or provides incorrect information, it’s on the student to fix it. This means facts should be verified, sources checked and ideas attributed properly.
  • Transparency: Students should always be open about using AI in their work, citing it just like any other source.
  • Cheating: Using AI to complete assignments, projects, tests or any other academic tasks without clear permission from the instructor is considered academic dishonesty.
  • Violating privacy: AI should never be used to monitor or track people without their consent, or in ways that invade their privacy.
  • Bias and discrimination: AI tools must be reviewed regularly to make sure they aren’t biased or discriminatory. Any use of AI that reinforces unfairness or inequality won’t be allowed.

Embracing AI as a learning tool

AI is here to stay, and its role in education will continue to grow. Instead of focusing on what could go wrong, we want to embrace the potential AI has to support students, help them think critically and deepen their learning. By updating our policies, we aim to create an academic environment where AI is seen as a helpful tool, not a shortcut, and where integrity and transparency remain central to everything we do. I encourage you to do the same, whatever that may look like at your institution.

About the Author

Cynthia Krutsinger
Cynthia Krutsinger is dean of online learning at Pikes Peak State College in Colorado Springs, Colorado. She is a board member of the Instructional Technology Council and serves as co-chair of its AI Affinity Group.