TRENDS & INSIGHTS
Kevin Rockmael
July 28, 2025
At this year’s Anthology Together conference, artificial intelligence wasn’t just a topic on the agenda. It was the conversation. Whether in keynote halls, breakout rooms, or impromptu chats between sessions, everyone was asking some version of the same question: What do we do now that AI is changing everything?
After listening closely to speakers and participants alike, three major themes emerged. These takeaways reflect not just what the experts are saying, but also what’s on the minds of educators, administrators, and innovators facing these shifts in real time.
Here’s how it all came together.
1. Nobody Fully Understands AI, and That’s OK
One of the most honest moments came from Ethan Mollick, Associate Professor at the Wharton School and Co-Director of its Generative AI Lab. He set the tone with this simple statement:
“Nobody knows anything… We are all figuring this out as we go along. There’s no secret manual.”
— Ethan Mollick, Anthology Together 2025
Hearing this from one of TIME's 100 Most Influential People in AI for 2024 was refreshing.
The theme of ambiguity was echoed again and again, both on stage and in side conversations. From experienced technologists to classroom faculty, there was a shared sense of learning together.
Michelle Singh, Vice President for Strategic Educational Alliances at the University of North Texas, reinforced this during her session with Dr. Justin Louder. She acknowledged how quickly expectations are shifting:
“We’re expected to lead others… for a subject we’re still getting acclimated with ourselves.”
This sense of shared uncertainty may sound intimidating, but it actually creates space for experimentation and collaboration. It removes the pressure to have all the answers and puts the focus where it belongs: on action and learning.
2. AI Adoption Is Already Everywhere, and It’s Accelerating
Many educators are still figuring out how AI fits into their work. Meanwhile, students are already using it daily.
Multiple speakers shared data points on how widely students have already adopted these tools, and the numbers left little room for debate.
This is not a future issue. AI is already part of how students learn, how knowledge is created, and how work gets done. Pretending otherwise just puts institutions further behind.
Justin Louder, Associate Vice President of Academic Innovation at Anthology, summed up the challenge clearly:
“We’re flying the plane while building it… but the key is to do something. Start somewhere. Start experimenting.”
In short, you don’t need to have it all figured out. But you do need to get involved.
3. You Can’t Ignore Integrity or Trust, But You Can’t Avoid Technology Either
As AI use becomes more common, institutions are being pulled in two directions. One group is trying to push back, requiring handwritten essays, in-class exams, and stricter bans. Another group is diving in, exploring AI-enhanced instruction, updating syllabi, and rethinking assessment.
Each approach reveals something important.
Going back to the basics offers short-term control. But it can also distance students from the skills they’ll need after graduation. On the other hand, diving in without guidance risks confusion or inconsistent use.
That’s why the best strategy is to meet AI head-on with intention. Support your faculty. Talk with students about responsible use. Build clear policies. And most importantly, choose tools that help you balance innovation with integrity.
This is where quiet but effective tools make a difference. Platforms like Proctorio, for example, help instructors deliver secure, flexible exams without adding friction for teachers or students. Rather than getting in the way, Proctorio provides a framework that supports academic honesty while still allowing students and instructors to use AI responsibly. Tools like this help institutions evolve without losing trust.
Bonus Insight: The Future Belongs to People and Machines Working Together
Throughout the conference, a core message kept resurfacing. It’s not humans versus AI. It’s humans plus AI. Mollick calls this co-intelligence, the idea that the most powerful outcomes happen when people and machines collaborate.
“Co-intelligence is the future… where humans plus AI do more together than either could alone.”
This philosophy applies across the board. Teachers using AI to personalize feedback. Students using AI to explore ideas. Administrators using AI to improve services. TAs using AI to help identify high-risk test-taking behavior.
But in every case, the value comes from people driving the process, guided by purpose and supported by the right tools.
What Should Institutions Do Next?
The takeaway is not to wait. It’s to start small and move forward with curiosity.
Begin by using AI for low-stakes tasks: summarizing a lesson plan, drafting a policy update, generating quiz questions, or piloting an AI-powered tool on a small job. Talk to colleagues. Ask students how they're using these tools. Audit your current tech stack and look for opportunities to improve agility and trust.
And don’t forget to collaborate with your technology partners. You might even co-build solutions that better meet your evolving needs.
As Mollick said:
“This is the fastest adoption of any technology in history… There’s no going back.”
That doesn’t mean rushing blindly. It means stepping forward with purpose, clarity, and a willingness to learn.
We recommend diving in.