From Compliance to Coaching: What School Leaders Should Understand About Evaluating Specialized Staff
- Josh Morgan

- Mar 21
- 6 min read
> One of the most important leadership shifts I have made is this: evaluation systems work best when they help people grow, not just prove that they did something. That matters in every role, but it matters especially for specialized staff.
School systems often do a better job evaluating traditional classroom instruction than they do evaluating the work of specialized educators whose impact happens across settings, through consultation, through problem-solving, through documentation, through coaching, and through supports that are not always visible during a single classroom visit. The Non-Instructional/Itinerant framework I have been using in my daily work is built on that exact idea. It is designed as a multiple-measure system to coach, develop, and evaluate non-instructional and itinerant teachers, using observations, professionalism, and student growth rather than relying on one narrow view of practice.
That design matters because strong leadership requires evaluation systems that fit the work.
Specialized roles are real instructional roles, even when the work looks different
One of the biggest mistakes school leaders can make is assuming that all educator impact should look the same.
Specialized staff often influence student outcomes in ways that are distributed across a school day or even across multiple sites. Their work may include direct services, consultation with staff, support for families, planning with teams, documentation, compliance monitoring, progress monitoring, professional development, and system-level problem-solving. A strong framework recognizes that these roles still require deep expertise, high expectations, and measurable impact, even if that impact is not always captured in a traditional classroom lesson snapshot.
That is one reason this framework stands out to me as a leader. It explicitly includes both observable practice and off-stage evidence. The guidance explains that observations may include live events such as small-group support, staff professional development, or team data meetings, but can also include artifact review meetings focused on records, assessment data, planning strategies, and other evidence tied to the role. It makes clear that observable and artifact-based evidence are equally important.
That is a leadership lesson in itself. If leaders only look for what is easiest to see, they can miss what matters most.

Better evaluation starts with a better understanding of the role
What I appreciate most about this kind of structure is that it asks leaders to slow down and define the work clearly.
The framework centers three broad areas of professional practice: mastery of the role, the learning environment, and planning, delivering, and monitoring services, alongside professionalism and student growth. It expects specialized staff to use evidence-based practices, reduce barriers to learning across home, school, and community settings, use multiple data sources, monitor and adjust services, and support equitable access for students.
For leaders, that is important because clarity improves coaching.
When the expectations are vague, evaluation becomes reactive. When the expectations are clear, coaching becomes more precise. Leaders can identify strong practice, name growth areas more specifically, and help staff connect their daily work to broader outcomes for students and teams.
Compliance matters, but it cannot be the whole story
Specialized roles do carry significant compliance responsibilities. That is real, and it matters. Timelines, documentation, confidentiality, communication, and legally aligned services are all essential parts of the work. The deeper guidance around the framework reflects that, including examples tied to timely services, coordination with teams, consistent monitoring of accommodations, confidentiality, meaningful goals, and comprehensive documentation.
But leaders make a mistake when they stop there.
If evaluation becomes only a check for whether paperwork exists, then schools miss the larger purpose of evaluation.
Strong evaluation should help answer deeper questions:
- Are services actually reducing barriers for students?
- Is staff collaboration improving instruction and access?
- Are supports connected to meaningful student goals?
- Is the educator helping build capacity in the adults around them?
That broader lens is built into the role expectations. The framework includes collaboration with staff and families, support across settings, use of data to design services, and adjustments to services based on student progress.
That is why I keep coming back to this idea: compliance is necessary, but growth happens when leaders coach beyond compliance.
Evidence should reflect the actual work
Another leadership lesson in this work is that evidence collection has to reflect the real daily responsibilities of the role.
The deeper framework materials provide examples of evidence that are much more connected to the actual work specialized staff do: progress monitoring data, consultation logs, IEP goals, service plans, lesson plans, meeting notes, communication logs, family outreach, collaboration records, and follow-up documentation.
That matters because specialized staff often do some of their most important work off-stage. They reduce barriers through planning, coordination, training, and communication before a student ever experiences the visible result in a classroom or service setting.
As a leader, I think that should change the way we talk about evidence. Evidence should not be treated as extra paperwork created for evaluation. It should be the natural record of thoughtful, high-impact practice.

Coaching requires shared understanding, not just a rubric
A framework alone is not enough. Leaders still have to help people make sense of it.
That is one reason I developed and used a PLC structure around this work. The purpose was to build shared understanding of expectations, strengthen evidence-based practice, and create a supportive professional community where staff could clarify questions, reflect on challenges, and plan next steps for continuous improvement.
The PLC plan focused on things leaders often overlook but staff actually need:
- making sense of how the system differs from a traditional classroom rubric
- examining strands of the framework in smaller groups
- identifying high-quality evidence and how to organize it
- discussing timelines, observations, and feedback processes
- solving practical challenges related to data collection, scheduling, and service delivery across sites
- building systems for ongoing collaboration and continuous improvement
That has been one of the strongest lessons for me as a leader. Staff do not need more vague encouragement around evaluation. They need clarity, examples, structures, and space to think together.
Strong evaluation systems should build capacity
The strongest evaluation systems do not just sort people. They build people.
One part of the framework that stands out to me is how clearly it connects teacher leadership and peer capacity-building to broader school improvement. At stronger performance levels, the guidance points to building communities of practice, supporting colleagues in reaching shared goals, using inquiry cycles, collaborating with school leadership teams, and designing or improving systems that affect school change.
That is the part that feels most aligned with my leadership values.
I do not want evaluation to be something that happens to staff. I want it to be part of how we strengthen adult practice, clarify impact, and build better systems for students. When evaluation is done well, it becomes one more way a school develops capacity.
What this means for my leadership
I am using this work daily, and that is exactly why it has shaped my thinking so much.
It has reinforced for me that leadership is not just about holding people accountable to expectations. It is also about making those expectations understandable, evidence-based, role-appropriate, and useful for growth. It has reminded me that specialized staff need leadership systems that honor both the visible and invisible parts of their work. And it has made me think more deeply about how leaders create structures for coaching, reflection, collaboration, and continuous improvement.
For me, that is the real shift from compliance to coaching.
Compliance asks whether the form is done. Coaching asks whether the work is getting stronger. Leadership should care about both, but it should never stop at the first one.

> Specialized staff are often evaluated through systems that do not fully reflect the breadth of their work, especially the consultation, collaboration, planning, documentation, and off-stage service design that shape student access and staff capacity.

> A stronger evaluation approach uses multiple measures, clear role-specific expectations, observable and artifact-based evidence, and collaborative structures that help staff understand the framework, organize evidence, reflect on practice, and grow through coaching.

> When leaders evaluate specialized staff through a more accurate and growth-oriented lens, schools gain better clarity, stronger coaching conversations, more meaningful evidence of impact, improved adult capacity, and a more useful connection between evaluation and student support. This helps move evaluation from a compliance exercise toward a system for professional growth.