The One Question Most Program Reviews Forget to Ask

Most program reviews happen on autopilot. The director looks at retention numbers, registration numbers, revenue, maybe a few coach evaluations. The review answers operational questions. Did we hit our goals? Did we stay on budget? Are we growing? Those are real questions, and they deserve attention.

What almost never gets reviewed is the question that quietly determines whether the program will exist in five years: are kids still loving this?

Joy is the leading indicator for everything else in youth sports. Kids who love the game stay in the game, recommend the program to friends, and put up with the inevitable rough patches. Kids whose joy has been slowly drained by the program leave eventually, no matter how strong the curriculum or the coaching. By the time retention numbers reflect the joy problem, the program has already spent two or three years losing the kids it most wants to keep.

The annual joy protection review fixes this. Once a year, in the offseason, the director walks the program through a structured review of whether the calendar, the offerings, the coach expectations, and the parent messaging are helping kids love the game longer or quietly turning it into work. It takes a few hours. It uncovers things no operational review will catch. And the right time to run it is right now, in the window between seasons.

When to Run It

The right window is the four to six weeks after a season ends and before the next one starts ramping. Late enough that the recent season is fresh, early enough that any changes can be implemented for the next cycle. For most programs that's late spring/early summer (after spring season) or late summer (after summer season).

Running it during an active season is the wrong call. There's no time to act on what surfaces, the staff is in execution mode, and the review becomes a complaint session about things that can't be changed mid-stream. The annual review needs the offseason calm to be useful.

Most programs benefit from putting it on the calendar as a recurring annual workflow, the same way they'd schedule a budget review or a board meeting. Once it's a system, it gets done. When it depends on the director remembering to run it, it gets pushed.

The Four Domains

The review covers four areas. Each one corresponds to a different way joy gets quietly drained from a program over time. Run them in order, with about thirty minutes spent on each.

1. The Calendar

Pull up the past season's calendar and walk through it like a parent would. How many weekends were committed? How many evenings? How many tournament weekends in a row at any point? How many weeks of "down time" actually existed where athletes had no required programming?

The questions to ask, honestly: did our calendar leave room for kids to be kids? Did athletes have unstructured weekends where the program asked nothing of them? Did we add anything this season because parents asked us to (extra clinics, optional skills sessions, weekend tournaments) that pushed the calendar toward "always on"?

Most programs find that the calendar has crept tighter year over year, usually for defensible reasons. Each individual addition felt small. Cumulatively, the season got heavier. The annual review is the one place this drift gets caught.

The fix, when this domain shows drift, is usually a calendar diet for the upcoming season. One fewer optional clinic. One fewer tournament weekend. A protected stretch in the middle of the season where the program intentionally backs off. Athletes notice the breathing room. Parents notice it too.

2. The Offerings

Walk through every program offering and ask one question: is this here because it serves athletes, or because it's been here a while?

The honest answer matters. Programs accumulate offerings the way closets accumulate clothes. The Tuesday clinic that started six years ago for one age group still runs, but for an age group it no longer fits. The optional skill session got added during a registration push three years ago and never got reviewed. The summer camp expanded by two weeks because the staff wanted more revenue.

Each of these can be valuable, and each of them can also have quietly become a load-bearing wall for the program's "we need to be doing more" anxiety. The offerings audit asks, for each item: if we were starting this program from scratch this year, would we add this? If the answer is no, it's a candidate for retirement.

The fix here is usually the courage to retire one or two offerings rather than the impulse to add new ones. The program gets simpler. The athletes get more space. The staff has more bandwidth to do the remaining offerings well.

3. The Coach Expectations

What did we ask coaches to do this season, and how did that translate to what athletes experienced? This is the domain most programs review least, and it's often where joy gets drained fastest.

The questions to ask: did our coaches have permission to let practice be light when athletes needed light? Did we measure them on outcomes that protected joy (athlete engagement, return rates, parent feedback) or on outcomes that competed with joy (game-day intensity, win rates, technical drilling minutes)? Did we communicate, explicitly, that coaching joy was part of their job?

Programs that run this part of the review honestly often find a mismatch: they've been telling coaches the right things in onboarding (we believe in player development, we want kids to love the game) and then measuring them on the wrong things in performance review (won-loss record, advanced rep volume). When the stated values and the measured outcomes don't match, coaches calibrate to the measured outcomes. Joy quietly gets sacrificed to whatever is on the scorecard.

The fix is making the joy outcomes visible in how coaches are evaluated. Athlete return rate per coach. Parent feedback on coach communication. Number of practices that ran with a "fun" element. Whatever the program can actually measure, even imperfectly, becomes part of how the coach gets reviewed. What gets measured gets coached.

4. The Parent Messaging

What did we tell parents this season, and what did the cumulative weight of those communications signal to them about what we believed?

Pull up the season's communications. The welcome materials, the registration confirmation, the mid-season updates, the end-of-season letter, the tournament emails, the parent meetings. Read them as a body of work.

The questions: did the messaging help parents stay calm and supportive of their kids, or did it subtly amplify their anxiety? Did we frequently use language about competition, ranking, level, advancement, and select-team pathways without balancing it with language about development, joy, and the long arc of the kid's experience? Did our communications model the calm, low-anxiety approach we want parents to bring to the sideline?

Most programs find that their messaging skews more anxious than they realized. Each individual communication had a reason. Cumulatively, the parent group got pulled into a slightly more competitive, slightly more striving headspace than the program intended.

The fix is rebalancing the next season's communications. For every email about competition or advancement, send one about joy or long-term development. For every reminder about a tournament, send one celebrating kids having fun. The cumulative tone shifts. Parents calibrate to it. Athletes feel the difference.

The Half-Day Workshop

The most efficient way to run the review is as a single half-day session with one or two key staff members. Block four hours. Bring printed copies of the calendar, the offerings list, the coach evaluation criteria, and a representative sample of the parent communications. Walk through the four domains in order, taking notes on what surfaces. Close the workshop with a written list of the three to five changes the program is committing to for the upcoming season.

Doing it as a half-day matters. Spreading the review across four separate meetings, one per domain, almost never works. The momentum dies, the connections between domains don't get made, and the review becomes a series of isolated observations instead of a coherent picture.

The output should fit on one page. Three to five concrete changes for the next season, each one tied to a domain, each one assigned an owner and a deadline.

What Most Programs Find

Three patterns show up across most programs running this review for the first time.

The first is calendar creep. The season has gotten heavier than the director thought, usually because each individual addition felt reasonable. The cumulative weight is what surfaces in the review.

The second is the orphaned offering. There's almost always one program element that nobody would invent today but that keeps running because nobody's challenged it. The review is permission to retire it.

The third is the messaging drift. The program's communications have skewed slightly more anxious, slightly more competitive, slightly more striving than the staff intended. Re-reading the season's emails as a body almost always surprises the director.

None of these are crises, and all of them are recoverable. Catching them in an annual review is dramatically less expensive than catching them three years later when retention numbers tell the story.

The Long Game

Programs that run the joy protection review every year develop an asset most of their competitors don't have: a calibrated read on whether the program is still doing the thing it's supposed to be doing. The athletes feel it. The parents feel it. The coaches feel it. Word gets around, in the way word gets around in a parent community, that this program protects something other programs don't.

That word-of-mouth advantage is hard to manufacture by any other means. It compounds. And it's the closest thing to a moat a youth sports program can build.

Joy is the asset. The annual review is the maintenance schedule. The window to run it for next season is open right now.

Program Director's Playbook