As faculty grapple with ChatGPT’s emergence in student assignments and administrators field questions about AI policies, community colleges find themselves at a critical juncture. While some institutions have responded with restrictive policies and AI detection software, Oregon’s Lane Community College has taken a different approach. Through its presidential task force on AI, Lane is exploring how artificial intelligence might address persistent challenges in higher education: equitable access, student retention and the increasing demands on faculty time.
Two key figures in Lane’s AI initiative sat down to discuss their evolving perspective on AI’s role in community college education. Ian Coronado, dean of academic support and innovation, brings more than two decades of experience in educational technology implementation. Kevin Steeves, faculty instructional designer and co-chair of Lane’s Presidential AI Task Force, has been working directly with faculty to navigate AI integration in their courses. Their conversation reveals the practical implications and philosophical questions facing community colleges in this rapidly shifting landscape.
This article is part of a monthly column provided by the Instructional Technology Council, an affiliated council of the American Association of Community Colleges.
Navigating institutional response
The halls of community colleges often echo with heated debates about AI, from faculty council or senate meetings to informal discussions in department offices. At Lane, these conversations have moved beyond simple prohibition versus permission.
Coronado reflects on his cautious optimism toward AI.
“I approach AI with a mix of excitement and caution. Like any technology, it has the potential to be both a force for good and a source of harm. Take drones, for example — they revolutionized agriculture and search-and-rescue operations but have also been weaponized. The same is true for AI: its ethical use is paramount. I see incredible opportunities for AI to support traditionally underserved students, but we must ensure those opportunities are equitably distributed.”
Steeves’ response reflects the pragmatic approach many faculty have adopted after the initial AI panic subsided.
“I’ve been reflecting on AI and heard someone use a ‘fist to five’ scale to rate their feelings about it — fist being ‘worst thing ever,’ five being ‘best thing since sliced bread.’ Day to day, I range between a fist and a five, but on average, I’m a three. I’m on the fence; AI’s impact really depends on the context. It has incredible potential but also significant risks. For every positive, there seems to be an equal and opposite negative.”
From theory to practice: AI in daily academic work
While theoretical discussions about AI’s impact abound in academic journals, community college educators face immediate practical questions: How do you maintain academic integrity when AI writing tools are freely available? What happens when students with limited English proficiency use AI for translation? These daily challenges require immediate solutions.
“AI can take over some of the repetitive aspects of teaching, like lesson planning or curating resources,” says Coronado. “For instance, I’ve used AI to relearn math concepts while tutoring my son. It’s been decades since I studied those topics, and AI gave me the tools to connect with him in a way I otherwise couldn’t.”
This practical application exemplifies how AI tools can supplement rather than replace faculty expertise. In developmental education especially, where students often arrive with varying levels of preparation, AI offers new possibilities for individualized support.
Steeves says, “Efficiency is one of AI’s greatest strengths, but the key is using that saved time for something meaningful. If AI can help me manage a classroom of 40 students by helping me identify those who need differentiated support, I can focus my energy on the personal interactions that matter. Building empathy and understanding of each student’s unique needs makes personalized instruction possible, which is essential for impactful teaching.”
The equity imperative
As open-access institutions, community colleges serve a uniquely diverse student population. For first-generation students, English language learners, and students with disabilities, AI tools could help level the playing field – or widen existing gaps.
“AI has incredible promise in creating more inclusive learning environments,” Coronado explains. “It can generate alt text for images, transcripts, and even audio descriptions for videos, improving access for students with disabilities. Down the line, these tools could create richer, universally designed learning experiences for all students.”
The equity implications extend beyond accessibility. In career technical education programs, where industry rapidly adopts AI tools, students need exposure to these technologies as part of their workforce preparation.
“Universal design benefits everyone, not just those with accommodations,” Steeves adds. “AI can present materials in multiple formats, meeting diverse learning needs. If implemented thoughtfully, it can break down barriers to access and promote equity.”
Professional development in the AI era
As institutions develop AI policies, the need for faculty professional development becomes increasingly apparent. Lane’s approach focuses on peer-led exploration rather than top-down mandates.
“AI can support social interaction by helping students practice conversations or scenarios,” Coronado says. “It could even foster empathy by deepening understanding among students and educators. But transparency is key. Framing AI as a practice tool, rather than a grading solution, ensures that students see it as a resource, not a substitute for learning.”
Looking ahead: Strategic considerations
For community college administrators, AI presents both budgetary and strategic challenges. How do you fund AI tools and training while maintaining essential services? How do you ensure ethical implementation across diverse academic programs?
As institutions navigate these questions, some are taking bold steps toward local AI solutions. “I was speaking with a colleague at another institution this morning about how they were setting up Llama at their institution,” Coronado says. “The amount of funds they’re putting into it is significant, but it’s not unobtainable.”
This shift toward institutional AI infrastructure represents a critical strategic choice. As Steeves observes, “There’s a lot of potential in on-device or small language models that we don’t have to call to the cloud every single time.” Such approaches offer greater control over training data and bias management but require significant institutional capacity. “It has to have the digital literacy skills in the team around us that can make that work,” he adds.
Beyond infrastructure decisions, Coronado emphasizes the importance of ethical implementation. His vision for institutional AI includes transparency about model development, appropriate licensing of training data, and fair compensation for those involved in development – what he characteristically calls “sustainable, farm-to-table AI.”
Conclusion
Rather than presenting AI as either a savior or a threat to community college education, Lane’s experience suggests a more nuanced approach. The key lies not in the technology itself but in how institutions adapt their policies, practices, and pedagogies to leverage AI’s potential while preserving the essential human elements of teaching and learning.
The questions facing community colleges aren’t just about whether to use AI, but how to use it to advance their core mission of providing accessible, high-quality education to their communities. As Lane’s experience demonstrates, finding answers requires ongoing dialogue between administrators, faculty, staff and students about the role of technology in achieving educational equity and excellence.
“My hope is AI helps bring us closer as humans by giving us more time, broadening our perspectives, and enhancing our ability to truly see and hear one another,” Steeves says. “By freeing up time for meaningful interactions and offering tools to support personalized learning, AI can help educators focus on what matters most: understanding and inspiring their students.”
* * *
Kevin Steeves is an instructional designer at Lane Community College (LCC) in Eugene, Oregon, and co-chair of the Presidential AI Task Force. With over 22 years in education, he leads discussions on the ethical integration of AI, fostering collaboration through initiatives like the Fusion Lab and a Community of Practice on Generative AI. His work emphasizes leveraging AI as a tool to enhance learning while preserving the human connection in education.
Ian Coronado is dean of academic support and innovation at LCC where he oversees departments and initiatives focused on digital and online learning, educational technology, information literacy and open educational resources. He serves on the boards of Oregon WIN and the Instructional Technology Council and is the operations workgroup chair of the Oregon Community College Distance Learning Association, supporting statewide efforts to reduce barriers to education.