The 2025 Online Teaching Conference (OTC), held June 16 to 18, brought together educators, technologists, and administrators to explore how artificial intelligence is reshaping the educational landscape. Conversations centered on the urgent challenges and opportunities AI presents for teaching and learning, especially in California's community colleges. What emerged was a shared sense of both disruption and possibility: faculty are adapting to classrooms where students rely heavily on AI, often without fully understanding it, while institutional policies and support systems struggle to keep pace.
Here are the key insights and takeaways from the frontlines of online education.
Policy Isn’t Keeping Up
Most schools still lack clear, modern policies around AI use. Where policies do exist, they often defer to the instructor. As Sarah Lisha, English Faculty at De Anza College, put it, "Our policy says it's up to the teacher to define a policy. If they do, we'll back them up."
This decentralized approach results in a patchwork of micro-policies, with different rules for every assignment or class. While this can provide flexibility, it also creates confusion for both students and instructors.
Bill Moseley, Computer Science Professor at Bakersfield College, emphasized, "Pedagogy is contextual. Most of us weren't trained in teaching; we were trained in our subject. We need tools that support different approaches, not force conformity."
Takeaway: Institutions must establish clear, cohesive policies that support diverse pedagogical approaches and reduce confusion for students and faculty alike.
When Cheating Signals a Design Problem
A key theme was that cheating often reflects course design flaws rather than student ethics alone.
This is not to excuse dishonest behavior; students must still take responsibility. But when assignments feel disconnected or irrelevant, students are more likely to seek shortcuts. Moseley shared research showing a direct link: the less personal, authentic, and creative a learning experience, the more likely students are to cheat.
Effective design makes a difference. Amy Leonard, English Faculty and AI Subcommittee Chair at De Anza, redesigned a tutor training assignment to involve students in co-creating AI usage guides. Engagement increased dramatically. When assignments are practical, relevant, and collaborative, dishonesty goes down.
Takeaway: Authentic, engaging course design can significantly reduce academic dishonesty and enhance learning outcomes.
Students Lack AI Literacy
AI literacy has become the new digital literacy. Students aren't just using these tools; they're relying on them, often without understanding the implications.
Leonard described students who let bots manage everything from assignments to due dates, sometimes following incorrect bot-generated timelines instead of the official course calendar. This overreliance reflects a gap in both skill and critical thinking.
As one STEM faculty member noted, "Just because it speaks confidently doesn’t mean it’s right."
Takeaway: Students need structured support and instruction to develop AI literacy and use these tools critically and responsibly.
AI Isn’t Neutral
AI tools are not objective. They reflect the biases of their creators.
Safiya Noble, Professor of Gender Studies at UCLA, cautioned that generative AI can erase student identity and experience. "Students who use AI to 'sound more academic' may unknowingly erase their own voice. That's not equity."
Access also matters. Better tools yield better results, but those tools often come with a cost. Alison Gurganus, Emerging Tech Specialist at San Diego CCD, pointed out that most students can’t afford premium AI subscriptions. "We say students can use AI in class, but do they all have equal access to those tools? Probably not."
Takeaway: Ensuring equitable access to AI tools is essential for fostering inclusive and fair learning environments.
Students Aren’t Thinking About Privacy
Data privacy is largely off students' radar.
Gurganus shared examples of students casually inputting sensitive information like student IDs or personal preferences into AI platforms. They often don't realize that data may be stored, sold, or reused.
One striking example: the viral Barbie Box Challenge, where students unknowingly fed personal traits and photos into a chatbot, contributing to its training model.
Faculty have a role in raising awareness. This isn't just about privacy; it's about agency.
Takeaway: Faculty must guide students in understanding and protecting their data privacy in an AI-driven world.
AI Can Help, But Only With Guidance
AI isn’t the enemy. But without human oversight, it can become a trap.
Leonard summed it up: "Students expect us to teach them how to use AI ethically. If we don't, we're not just failing them; we're making ourselves irrelevant."
Integration is key. AI tools should be part of a broader learning model, not add-ons. Moseley reminded us that meaningful learning is still active, personal, and creative.
Noble offered a final warning: technology should support education, not replace the educators who make it meaningful.
Takeaway: AI must be thoughtfully integrated into pedagogy with strong ethical guidance and human oversight.
Redefining Learning Integrity
A bigger issue emerged: many faculty and students don't fully understand what learning integrity means in an AI-enhanced environment. The concept is evolving, and the lines are becoming increasingly blurred. This makes updated, clear, and shared policies all the more essential.
Takeaway: Institutions must redefine and clearly communicate learning integrity standards to reflect new challenges in the AI era.
The Takeaway
What students want isn’t complicated: relevant assignments, real world context, effective tools, and instructors who explain the "why," not just the rules.
Faculty need flexible tools, clear policies, institutional support, and ways to teach with integrity in an AI-enabled world.
The 2025 Online Teaching Conference made one thing clear: the future of education isn't about fighting AI. It's about shaping it responsibly, equitably, and together.