Six Months Later: The Evolution of AI in Education Continues
The story of artificial intelligence (AI) in education is evolving rapidly. When I wrote about AI last summer, I purposefully called it a “snapshot in current time” because things were moving fast. In just six short months, the landscape has shifted dramatically.
As a school technology director, I am genuinely excited about what AI can do for education. In the second half of 2025, I attended three events that highlighted AI’s growing impact on educational institutions and how Blackbaud, our school’s primary software provider, is rising to meet the challenge.

Stop #1: The Conversation Gets Real at bbcon (10/6–8/2025)
If you attended bbcon in October, you witnessed the AI conversation shift from “Should we use AI?” to “How do we use AI responsibly?” Christina Lewellen, CEO of ATLIS, hosted a luminary session. The expert panel she moderated—featuring an independent school leader, an alumni-based software provider, and a lawyer specializing in data privacy—didn’t shy away from the hard questions:
- How do we maximize AI’s benefits while mitigating genuine risks?
- How do we coach students in an age where AI is freely accessible yet often misunderstood?
- And perhaps most critically: what are the legal landmines we need to avoid?
The session delivered on what educational leaders desperately need right now: practical strategies, proven frameworks, and the clarity to lead confidently in the age of AI. But one theme emerged loud and clear: Not all AI solutions are created equal.
Intelligence for Good: Where Responsibility Meets Innovation
Nearly 50% of teacher time is spent on non-student-related activities. Let that sink in. Half of our educators’ days are consumed by administrative tasks instead of actually teaching. AI promises to change that equation—but only if we can trust it with our data. When you’re dealing with student information, family financial information, and sensitive health records, “move fast and break things” isn’t just reckless—it can be catastrophic.
When Blackbaud announced over 70 embedded, sector-specific AI capabilities at bbcon, they weren’t just talking about features—they were making a statement about how AI should be built for education.
As Blackbaud Chief Data and AI Officer Carrie Cobb emphasized: “AI isn’t just transforming technology—it’s transforming responsibility. Because AI isn’t just code. It’s decisions. It’s influence.”
A rush to market with flashy features can’t gloss over critical questions like data privacy, FERPA compliance, and institutional liability. AI solutions need to offer the same level of protection and security controls that our education platforms currently provide. I’m reassured knowing Blackbaud builds AI directly into products our school already uses and trusts—like our SIS and fundraising CRM—with data security and privacy controls baked into every implementation.
From Tools to Partners: Welcome to the Age of Agents
What got me genuinely excited? Blackbaud’s announcement about Agents for Good™—proactive, virtual team members that can autonomously manage entire workflows, make contextual decisions, and handle processes from start to finish while humans maintain control of the decisions that truly matter.
The first Agent for Good is Blackbaud’s Development Agent, which proactively manages donor portfolios and systematic communications. This agentic AI frees up time for development directors bogged down with routine follow-ups, cultivation plans, and donor prospecting, allowing them to focus on the high-touch relationships that truly need the human element.
Unlike other “AI assistants” on the market, these agents are seamlessly embedded in the trusted Blackbaud environment, not bolted-on tools that require constant imports, exports, and data management. They work securely, efficiently, and in sync with our school’s existing data and workflows.
The long-term vision is for Blackbaud AI not just to coordinate its own agents, but to orchestrate agents from across the entire partner landscape. Think about what this could mean for schools: instead of managing multiple AI tools that don’t talk to each other, you’ll have one intelligent system that knows when to deploy the admissions agent, the financial aid agent, the communications agent—whatever’s needed to accomplish your goal.
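To make the orchestration idea concrete, here is a minimal thought-experiment sketch in Python. This is not Blackbaud's actual implementation; the agent names, keywords, and keyword-overlap routing are invented purely to illustrate how one coordinating system might pick the right specialist agent for a goal:

```python
# Hypothetical sketch of agent orchestration -- NOT Blackbaud's actual
# implementation. An orchestrator inspects a goal and delegates it to
# the specialist agent best suited to handle it.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    keywords: set[str]          # topics this agent handles
    run: Callable[[str], str]   # executes the goal, returns a summary

class Orchestrator:
    def __init__(self, agents: list[Agent]):
        self.agents = agents

    def dispatch(self, goal: str) -> str:
        words = set(goal.lower().split())
        # Pick the agent whose keywords overlap the goal the most.
        best = max(self.agents, key=lambda a: len(a.keywords & words))
        if not best.keywords & words:
            return "No suitable agent found; escalate to a human."
        return best.run(goal)

agents = [
    Agent("admissions", {"applicant", "enrollment", "tour"},
          lambda g: f"Admissions agent handling: {g}"),
    Agent("financial-aid", {"aid", "tuition", "waiver"},
          lambda g: f"Financial aid agent handling: {g}"),
    Agent("communications", {"newsletter", "email", "announcement"},
          lambda g: f"Communications agent handling: {g}"),
]

orchestrator = Orchestrator(agents)
print(orchestrator.dispatch("Schedule a campus tour for a new applicant"))
```

A real system would route on intent models rather than keyword overlap, but the shape is the same: one front door, many specialists, and a human escalation path when no agent fits.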
Stop #2: Seeing Is Believing at the D.C. Meetup (10/23/2025)
Here’s what moved this from abstract announcements to concrete reality for me. Shortly after bbcon, I attended a meetup in Washington, D.C., where Blackbaud product managers Jacqueline Koca (Jax) and Jessi Walters delivered hands-on demonstrations of new AI features to about 60 educators and school administrators. The format was intimate, allowing for real questions and honest reactions.
This wasn’t a sales pitch or a polished keynote—it was real users getting their hands on actual tools, asking tough questions, and seeing how these capabilities work in practice.
The demonstrations were eye-opening:
- Reporting a student conduct violation—Blackbaud AI Chat understood context, asked clarifying questions, and properly documented everything in the student record. This was a collaborative conversation with a system that understood the workflow, the compliance requirements, and the sensitivity of the task.
- A side-by-side analysis of user permissions—It’s the kind of administrative work that normally requires someone to dig through screens, compare settings manually, and hope they didn’t miss anything. Blackbaud AI Chat handled it conversationally and comprehensively.
Educators’ skeptical curiosity turned to genuine excitement as they interacted with these features, testing the boundaries and imagining applications for their own schools. The questions weren’t “Can AI really do this?” but “When can we get this in our school?”
The contrast between this intimate, hands-on experience and the broader bbcon stage was striking. At bbcon, we heard about the vision. At the D.C. meetup, we saw the execution—and wow. Just wow.
Stop #3: Shaping the Future at the Advisory Retreat (12/9–12/2025)

Less than two months after the D.C. meetup, I headed to Charleston, South Carolina, for Blackbaud’s annual K–12 Executive Advisory Board retreat. This wasn’t about demonstrations anymore. This was about rolling up our sleeves and helping shape what comes next.
Andy Snyder (SVP of Platform), Paul Finch (Senior Manager of Product Development), Corey Eck (Director of Product Management), and Brandon Stirnaman (AI Lead) walked us through Blackbaud’s AI roadmap with a level of transparency that’s rare in the tech world. But what made this session valuable wasn’t just what they shared—it was that they genuinely wanted our input, our concerns, and our real-world perspective on what they were building. Some highlights include:
Blackbaud AI Chat is already in an early adopter program (EAP) for Blackbaud Raiser’s Edge NXT. The same AI tools designed and developed within that powerful fundraising CRM will be universal across all Blackbaud platforms. No waiting for custom versions to be built for the remainder of the portfolio!
Blackbaud’s partnership with Anthropic creates a secure connection through which Anthropic’s AI assistant, Claude, can interact with your Blackbaud data. During the demo, we watched Claude generate a personalized stewardship email for a $100 million donor by pulling real-time gift data from Raiser’s Edge NXT. It was seamless, secure, and sophisticated. The audience immediately started dreaming up what they would do with the tool next.
- Envision a school leader asking Claude for a summary of current enrollment goals, or a quick accounts receivable analysis on tuition to compare against the current capital campaign.
- Imagine a powerful cross-platform dashboard without running a single list, query, or report. The possibilities are endless!
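To illustrate what that stewardship demo might look like under the hood, here is a hedged sketch using the official Anthropic Python SDK. The donor record, field names, and helper functions are invented for illustration; in the real integration, Claude pulls live gift data from Raiser’s Edge NXT rather than from a hand-built dictionary:

```python
# Hypothetical sketch of drafting a stewardship email with Claude.
# The donor record and field names below are invented; the real
# connection pulls live data from Raiser's Edge NXT.

def build_prompt(donor: dict) -> str:
    """Assemble a stewardship prompt from a donor record."""
    gifts = "; ".join(
        f"${g['amount']:,} on {g['date']} to {g['fund']}" for g in donor["gifts"]
    )
    return (
        f"Draft a warm, personalized stewardship email to {donor['name']}, "
        f"referencing these recent gifts: {gifts}. "
        "Keep it under 150 words and sign it from the Development Office."
    )

def draft_stewardship_email(client, donor: dict) -> str:
    """Send the prompt to Claude and return the drafted email text."""
    message = client.messages.create(
        model="claude-sonnet-4-20250514",  # model name is an assumption
        max_tokens=500,
        messages=[{"role": "user", "content": build_prompt(donor)}],
    )
    return message.content[0].text

donor = {
    "name": "Alex Rivera",
    "gifts": [
        {"amount": 100_000_000, "date": "2025-11-01", "fund": "Capital Campaign"},
    ],
}
print(build_prompt(donor))

# With an API key configured, you would pass a real client:
#   from anthropic import Anthropic
#   email = draft_stewardship_email(Anthropic(), donor)
```

The point of the pattern: the AI never sees your whole database, only the structured, scoped context you hand it for one task.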
Then there was “Mary.” Mary is a conceptual admissions AI agent that could answer parent questions, schedule open house visits, and even translate for international applicants. The potential is significant—imagine scaling a small admissions office’s responsiveness without scaling headcount.
Addressing Privacy and Transparency Concerns
The advisory board didn’t just nod along enthusiastically. We pushed back. And hard.
- What about the “uncanny valley”—the risk of parents or students feeling deceived if Mary was a little too human?
- How do we ensure transparency—that every interaction is clearly identified as AI?
- What happens if students or even staff start forming bonds with AI agents that feel like real people?
These aren’t theoretical concerns. They’re the kind of questions that only people who actually work in education every day would think to ask. And this is exactly why Blackbaud embraces advisory boards.
The product team didn’t dismiss these concerns—they leaned in. We talked about evaluation systems to let school administrators review AI interactions for safety concerns like bullying or inappropriate use. We brainstormed use cases for potential student retention agents. We dove deep into the parent contact center concept, exploring how AI could handle the most common billing questions (payment inquiries, login issues, fee waivers) to dramatically reduce call volume. Offloading those routine cases could give families support without requiring awkward phone calls during challenging times.
Blackbaud launched a Trust Center at trust.blackbaud.com as a single source for AI compliance, security documentation, and audit reports. It’s the kind of resource that sounds like a formality until you’re the one who must answer questions from your board of directors about data security and AI governance. I quickly shared this newfound resource with a hawkish peer, and his immediate response was “That’s awesome. I wish every vendor would adopt Blackbaud’s trust center model.”
This is how responsible AI development should work: not in a vacuum, but in partnership with the people who will actually use it and live with the consequences.

The Bottom Line: Forging Ahead with Trust in Mind
Six months ago, I wrote that AI in education “isn’t the story of robots replacing teachers—it’s the story of teachers getting their weekends back.” That’s still true, but now I’d add a critical caveat: AI solutions should be built with your unique needs, constraints, and responsibilities in mind.
As Mike Gianoni, Blackbaud’s president and CEO, put it: “To truly unlock AI’s promise for the millions of people and organizations across the social impact sector, we need more than generic tools—we need solutions designed for their unique business operations that deliver immediate results without adding complexity.”
The AI revolution is accelerating, and schools can’t afford to sit on the sidelines. But they also can’t afford to rush into solutions that prioritize innovation over security, or convenience over compliance.
Last summer I surveyed educators who were cautiously optimistic about AI’s potential. A short six months later, the ones partnering with platforms that take both capability and responsibility seriously are delivering on that potential—reclaiming time, improving student experiences, and maintaining the trust their communities expect from them.
To join me and other education leaders in the AI conversation, register for Blackbaud’s K–12 User Conference in Boston July 15–17, 2026.
This is the fourth installment in a series about AI in education. Check out the three previous posts here:
- Transforming K–12 Education: Insights on AI from Industry Leaders
- Blackbaud Is Reimagining the Future of Education with AI
- How Blackbaud AI Empowers Private Schools to Transform Fundraising Through Innovation
