All In on AI: What K-12 Leaders Are Really Doing With Artificial Intelligence

“AI will save you time” is the promise everyone’s heard.

But in K-12, the better questions are:

  • What will we do with that time?
  • Can we trust these tools with our data and our families?
  • And will AI widen gaps—or actually help us close them?

In our recent webinar, “Voices of Innovation: All In on AI for K-12 Professionals,” moderator Bryan Phillips—a former Chief Technology Officer and now a ParentSquare team member—sat down with two K-12 leaders who are wrestling with those questions every day:

Dr. Michael Lubelfeld, Superintendent of North Shore School District 112 (IL), nationally recognized speaker and co-author of several books on leadership and student voice

Alana Winnick, Educational Technology Director & Data Protection Officer at Pocantico Hills CSD (NY), author of The Generative Age, and founder of Students for Innovation

The conversation wasn’t about shiny tools as much as it was about students, families, and trust. Here’s a look at what they shared—and what it means for districts trying to use AI responsibly.

It’s not about “saving time.” It’s about buying back human moments.

Alana was blunt: she’s tired of vendors promising that AI will “save teachers time” without talking about what happens next.

If AI helps write the lesson, students use AI to do the assignment, and then AI offers to grade the work…what are we really accomplishing?

Her point: the goal isn’t just efficiency. The goal is to buy back time for the work only humans can do—building relationships, sitting next to a student who’s stuck, calling a family that’s gone quiet, noticing the kid who’s about to slip through the cracks.

Michael has seen something similar from the superintendent’s chair. In his district, AI is less about shaving minutes off a task and more about changing how time gets used:

  • Principals are using AI to tighten and modernize presentations to staff, students, and families so they can focus more on the conversation and less on the formatting.
  • District leaders are using AI to quickly sift through PDFs and spreadsheets, not to replace their judgment, but to get to the “interesting questions” faster.

As Michael put it, the “human in the loop” isn’t optional. It’s the whole point.

What AI looks like in real classrooms and district offices

Throughout the webinar, the examples stayed very grounded.

Alana shared a story about running student transcripts through AI to flag graduation issues. Humans still reviewed everything, but the AI surfaced one student who would have been missed. That’s not a theoretical efficiency gain; that’s a real student whose path changed.

On the instructional side, she uses AI with teachers who feel like their tried-and-true lesson “just isn’t landing anymore.” Instead of starting from scratch, AI helps:

  • Find new entry points
  • Adjust for different reading levels
  • Personalize examples for the specific group of students in the room

Michael talked about teachers using AI to co-write feedback on student writing. They still use their own rubrics. They still decide on the grade. But instead of writing 25 lengthy comments from scratch, teachers can ask AI to generate a first draft of feedback on one or two specific rubric criteria—then revise it in their own voice.

On the communication front, AI has become part of a weekly rhythm. In Alana’s district, teachers are expected to communicate with families at least once a week. Long newsletters can be a significant burden, especially for early adopters who are already juggling a multitude of tasks. Now, teachers drop the key points into AI and let it help shape a clear, friendly message. Many have shifted from long weekly newsletters to shorter, more frequent posts with photos, which parents actually prefer. ParentSquare is at the center of that flow in her district—making it easy to send those updates, translate them, and keep everything in one place.

AI as an equity tool: language, access, and the new digital divide

Both districts serve diverse communities, and that shaped how they think about AI.

Alana’s district sits in Sleepy Hollow, NY, where low-income housing and generational wealth share the same zip code. Her community is only about 38% white, with a large multilingual population. Many students “graduate” from English learner status, but their parents do not.

Historically, non-English-speaking families often stayed away. If you don’t understand what’s being said at back-to-school night, it’s difficult to participate in the conversation. 

Now, tools like ParentSquare and other AI-powered translation and speech-to-text tools are changing that dynamic:

  • Families receive messages in their home language.
  • They can respond in that language and be understood.
  • At events, headsets and translation tools make it possible to actually follow what the teacher or principal is saying.

Engagement among multilingual families has climbed because the barrier isn’t language anymore.

She told another story about two students who arrived from Pakistan speaking only Urdu. Within days, the older student was engaging in science and social studies content in Urdu, focusing on learning the concepts instead of decoding an unfamiliar language. For students with dyslexia or dysgraphia, AI is doing similar work in reading and writing.

Michael framed it with a simple metaphor: equity is the right pair of shoes. Right size, right type, right purpose. For him, generative AI is one more way to give students and teachers the “shoes” they need—whether that’s translation, extra practice, enrichment, or personalized feedback.

But both leaders also warned about a new kind of digital divide: not just who has devices or home internet, but who has access to AI skills and experiences.

Students in one grade might have a teacher who embraces AI and uses it thoughtfully. The next year, they might land in a classroom where AI is banned or ignored. Students notice that zigzag, and they’re already telling Alana it doesn’t feel fair.

Her worry is straightforward: graduates who never learn how to work alongside AI will be at a real disadvantage in the workforce—regardless of how smart or capable they are.

Guardrails, Governance, and Teaching AI the Right Way

Neither leader is casual about privacy, policy, or ethics.

Michael described a landscape of COPPA, SOPPA, FERPA, state data privacy laws, and board policy—all of which apply just as much to AI tools as to anything else. Vendors in his district must sign strict agreements before their products are used, even if only teachers are logging in. His team started with guidelines and is now working on formal policy and procedure, grounded in existing curriculum and technology frameworks.

Alana, as a Data Protection Officer, leans on New York State contracts that pre-vet software for compliance. She’s confident there are good, compliant AI tools available; sometimes it’s just a matter of knowing which vendors to push and what to ask for.

But governance isn’t only technical. It’s instructional.

Alana gives every AI user—student or adult—two simple “jobs”:

  1. Be a detective. Look for bias and misinformation. Assume AI can be wrong or skewed, and back it up with other sources.
  2. Be a DJ. Remix the output. Use your own ideas, your own voice, your own judgment. Never just copy-paste.

Those lessons start as early as first grade. By third grade, her students are confident enough to talk about AI bias and hallucinations on national panels.

So, where do you start?

Both speakers agreed: you don’t start with a massive initiative. You start small and human.

Alana often begins by having teachers try AI on something personal and low-stakes, such as planning a vacation, building a meal plan, or mapping out a workout. Once they feel the benefit in their own lives, they’re more open to using it professionally.

From there, she nudges them toward tweaking work they already do: translating a message for families, differentiating a reading passage, refreshing an old lesson. Michael suggests starting with one class, one assignment, one part of the workflow—always with the teacher in control.

Building-level leaders matter a lot here. If the principal is on board and talking about AI as part of the school’s core work—not “one more thing”—teachers are far more likely to lean in.

ParentSquare’s Role in Responsible AI Adoption

Threaded through all of this is the idea of infrastructure. AI by itself doesn’t build trust. The way it’s embedded into daily tools and workflows does.

ParentSquare helps districts:

  • Unify communication across district, school, and classroom so messages are consistent and easy to find
  • Reach every family with built-in two-way translation and accessibility features
  • Use AI in clear, contained ways, like generating alt text on websites or helping staff draft communications—while keeping humans fully in charge of tone, content, and context

Our goal isn’t to replace people. It’s to help them do more of the work that only they can do: building trust, strengthening relationships, and engaging every family.

Want to Hear the Full Conversation?

This recap only covers part of what Alana, Michael, and Bryan explored—from policy and board conversations to student voice, rural vs. affluent contexts, and what they hope school will look like five years from now.

🎧 Listen to the full “Voices of Innovation: All In on AI for K-12 Professionals” webinar to hear the stories and examples in their own words.
