
Steering Toward the Future: Andover’s Framework for 21st-Century Learning
A conversation with Head of School Raynard S. Kington, MD, PhD, P’24, P’27, and Nicholas Zufelt, Instructor in Mathematics, Statistics, and Computer Science
Last night, I attended an evening discussion in Boston featuring Phillips Academy Andover’s Head of School, Dr. Raynard S. Kington, and Computer Science instructor Nicholas Zufelt.
As an alumnus (class of ‘91) and an investor in AI startups, I wanted to hear more about the school’s commitment to teaching and learning in the AI era.
Their conversation about how Andover is reimagining education for the AI era—focusing on competencies over content, critical thinking over rote learning, and embracing AI as a supportive tool rather than fearing it—offered valuable insights into the future of teaching and learning.
What follows is the transcript from the evening.
Note: The hero image is AI-generated with fal’s fal-ai/qwen-image-edit image-to-image model, from a photo of the empty room/stage that I took before the talk began, using the prompt:
add ai robots in seats listening to human speakers give talk about ai in classroom. show electric waves connecting the robots and human minds like lightning glowing energy
Event Details
- Steering Toward the Future: Andover’s Framework for 21st-Century Learning
- Head of School Raynard S. Kington, MD, PhD, P’24, P’27
- Nicholas Zufelt, Instructor in Mathematics, Statistics, and Computer Science
- Date: Tuesday, October 7, 2025, 6–8 p.m.
- Location: Sheraton Boston Hotel, Commonwealth Room
A Conversation with Phillips Andover’s Head of School Raynard S. Kington, MD, PhD, P’24, P’27, and Nicholas Zufelt, Instructor in Mathematics, Statistics, and Computer Science
Kington: The Learning Steering Committee is focusing on teaching and learning activities at the school. Over the last couple of years, we’ve started a whole series of initiatives. We started departmental reviews where outside people come in and give us input about what we might be doing differently. We’ve had discussions among faculty about the competencies in each department.
So, tell us about Computer Science — what you think are the goals and the opportunities at this time when we’re looking at everything.
Zufelt: Absolutely. Hello everyone — so great to see you. This is my first alumni event that I’ve been to where I actually know people, and I know so many of you.
In Computer Science, about five or so years ago, just post-COVID, we began a change that was all about learning competencies, like you’re talking about here. We identified that our core goal was to codify what exists in the world of computer science outside of programming.
Everyone who’s spent time doing computer science would agree there’s so much more to CS than programming. But actually naming those skills — those competencies — was a monumental task because so much of it was stuff we’d just say in passing: “Here’s a trick, let me show you this trick.” When you put all those tricks together, that makes a massive difference between students who find success and those who don’t.
So, the process was really a deep introspection into who we are and why — and how we can name those skills so we can elevate them to the same level as programming lessons. Now we have fifteen competencies in computer science — one of which is programming — and the rest focus on supporting your thinking process, your collaboration, and a lot of the knowledge and “goodness” pieces that show up there as well.
Our primary focus was to level the playing field and give more people access to success in computer science. Then, to force ourselves as teachers to make changes in the classroom, we shifted the curriculum.
We had heard from students that our old course, AP Computer Science, was a kind of gatekeeper — a year-long course that didn’t fit easily into schedules until upper or senior year. Students said that was a pain point. So, we redesigned everything.
The new curriculum is a skills-first, elective model — different classes that you can take. It forces us as teachers to be responsive, because we can’t assume you took HTML or Python or any particular intro class. We have to focus on skills rather than programming. It’s a nice “forcing function” for us — but it really comes from that skills-first approach.
Kington: We hear about how the world is changing all the time, and technology permeates everything. That must make it hard to decide what teaching and learning you want in Computer Science, because you’re at the center of so much of that change. How do you respond to that in your curriculum?
Zufelt: A lot of what students do in the classroom mirrors what they might do in the tech industry — but in some classes it feels very different. The primary way we’re fueling innovation and adapting to change is by offering lots of different kinds of courses.
For example, a student might take iOS App Development, learning about the impact of their work on potential users and customers — very different from someone in Cryptography, who’s learning about mathematics and how to keep information secure and safe.
One beautiful thing about the new curriculum is that every student’s experience is unique. You might overlap with a peer in Web Help and Cryptography, but they didn’t take Robotics and you did. Everyone builds their own experience in computer science — which we think is really exciting.
Kington: I was impressed that you included in your topics those “soft skills” that everyone talks about — and I’ll admit, computer scientists have a reputation for not always having those soft skills. How do you communicate their importance to students who might not even get that?
Zufelt: One clear way we emphasize the importance of all skills, especially the non-programming ones, is that we have no placement process and don’t let students “jump ahead” just because they’ve coded before.
One of my colleagues says, “People program for fun; people do computer science for fun.” The difference is this: maybe you have a cool project you want to build, but do you rigorously test it? Do you do market analysis? Do you worry about data security? These are the real-world questions.
We’re trying to show students that this is the curriculum. The product-building happens almost accidentally while you’re working through those larger, more important questions.
Another thing we do: we give names to skills. When you name something — like “experimental design and observation” — students start recognizing it.
For example, last week during an oral exam, a student was explaining a line of code and said, “Actually, I’m not sure whether the screen starts at the top or bottom.” She realized she didn’t know and panicked for a second. I asked, “Okay, how are you going to figure it out right now?” She paused and said, “Wait — this is experimental design and observation, right?” Yes!
Then she said, “I’ll change this value, see if the ball bounces off the ceiling or floor.” She ran the code, discovered the answer, and learned something during an assessment. That’s the kind of creative problem-solving we’re aiming for.
Kington: Everyone’s talking about AI — and you’re one of our go-to people. How should we approach AI and its use in the classroom?
Zufelt: I spend a lot of time with educators, on and off campus, thinking about AI. Most of the conversation revolves around fear — fear that AI will replace our thinking, whether you’re a student, teacher, or graduate.
There’s already plenty of research suggesting that AI use can degrade our thinking. So yes, people worry that AI will replace your brain. But I try to push people to think of AI as supporting your brain.
For example, I do one-on-one oral exams as assessments. I tell students: use AI while studying; I’ll coach you on how to use it well. But when we sit together, you need to look me in the eyes, talk, share your code — all those soft skills come in.
Students are nervous — I tell them that’s fine, that’s human.
To support them, I even created an AI prompt that simulates me. It says something like: “I’m a student in a computer science class. I’ve written this code. In a week, my teacher will quiz me on whether I understand each line. Please pick a line, ask me to explain, then give me feedback.”
This is AI as a simulation tool. It forces thinking, not replacement.
When I work with teachers, I use this framework: “AI as ___.” AI as a simulator, tutor, student, manager, etc. For example, AI as a manager could help you prepare for a tough conversation with an employee. Describe the scenario to AI, brainstorm how it might go wrong, and plan your approach.
We need to be creative with AI — not fearful. Think of it as a supportive tool.
Kington: I know that in a lot of educational settings, they’re moving toward “curated” AI spaces to reduce hallucinations. Can we talk about hallucinations?
Zufelt: Please — I can’t have a discussion on AI without hallucinations!
Here’s a hot take: I love that AI hallucinates.
If AI didn’t hallucinate, we wouldn’t need to exist. It forces students to stay critical. In any industry, critical thinking is crucial.
At Andover, every department claims to teach critical thinking — and rightly so. It’s one of our most important goals.
So I tell students: AI hallucinates just like your Uncle Norm hallucinates.
Kington: (Laughs) I’ll write that one down.
Audience Member (Bo): You mentioned critical thinking. How do you really teach it to young students? Can you give examples? I’m struggling myself to teach my team.
Zufelt: Sure — I’ll give you a metaphor.
When you go to the gym, you can use a cable machine or free weights. Cable machines are controlled — they guide your motion. Free weights are loose — you have to stabilize yourself.
Critical thinking is like those stabilizing muscles. It’s what keeps your thinking balanced and adaptable.
So, I try to create classroom experiences that are more like free weights — less controlled, more open-ended. Students need some structure at first, but eventually, we take off the training wheels.
Every one of my classes ends with a student-designed final project. That’s where real critical thinking happens.
If you give students only worksheets, they’ll work hard — but they won’t build those “stabilizing” muscles of thought.
Q&A
Audience Member (Dorothy): Does Andover have a vision for the next five to ten years? With AI evolving, will you continue teaching programming and coding, or move on?
Zufelt: A few things to keep in mind: Andover is a high school. Our students go on to college, where they’ll focus more on job-ready skills. Andover doubles down on the liberal arts — thinking broadly and deeply.
That’s why Computer Science is thriving even as programming, in some ways, is being automated.
Teaching programming is like teaching how to lay bricks as a mason. If a tool can lay bricks, are you no longer a mason? Of course not — there’s more to it.
So yes, we’ll continue to teach programming — because it’s the medium through which we teach the real, higher-level skills. But we also work to convince students that programming isn’t the hard part. The thinking is.
Audience Member (Eric, Class of ’92): In 1992, eight students took Computer Science. What does it look like now? How many students? And is it theoretical, applied, or both?
Zufelt: We now offer about ten sections per term, three terms per year — about thirty sections total. Most are full.
Many students take multiple CS classes, so it’s not every student, but far more than in 1992.
We take a “yes, and” approach — we offer highly theoretical courses and very applied ones. Each year, one of our faculty learns a new topic over the summer and launches a new course.
We welcome students who love math and theory, those who love building things, and those who want to design for real users. We’re not taking any single lens — we’re surveying the entire discipline.
Audience Member (Kylie, Class of 2020): How are you teaching students to prompt AI?
Zufelt: There are lots of great examples across campus. In History, Dr. Keri Lambert has students work with primary sources — then prompts AI to create a fake primary source in that style. Students then compare and analyze how the AI got it wrong.
In my own field, it’s about metacognition — knowing what you know and how that shapes your interaction with AI.
One favorite example is the “seven-prompt essay.” The teacher gives a normal essay prompt. The student feeds it to AI, gets an essay, then writes six more prompts to improve it.
What does the teacher grade? Not the essay — the prompt log.
It measures the student’s ability to think critically and edit, not just write. It’s a creative, authentic way to assess learning in the AI era.
Audience Member (Koshy): What about students who arrive already using AI, probably without guidance? How can Andover address that?
Zufelt: That’s a great question. One of Andover’s strengths is that we already have structures to meet students where they are.
Yes, AI makes this more visible — but students have always struggled with critical thinking to some degree. So it’s not new.
The answer is scaffolding — giving more support early, and gradually removing it as they gain skill. We already do this in ninth-grade courses, for example.
Interestingly, AI has pushed teachers to rethink assessment. You can’t just assign essays anymore — you have to identify and measure the actual skills you care about. That’s a good thing.
Some teachers are going back to handwritten, in-class work — but even that can be creative. You can cut up your essay, rearrange it, do “unplugged” editing. There’s so much we can still do with paper, pens, and human brains.
Closing Remarks
Kington: Thank you so much, Dr. Zufelt. And thank you all for being here tonight.