[00:01] SPEAKER_02:
Welcome to Principal Center Radio, bringing you the best in professional practice.
[00:06] Announcer:
Here's your host, director of the Principal Center and champion of high-performance instructional leadership, Justin Baeder. Welcome, everyone, to Principal Center Radio.
[00:15] SPEAKER_01:
I'm your host, Justin Baeder, and I'm honored to be joined today by Victoria Bernhardt. Victoria is executive director of Education for the Future and Professor Emeritus at California State University, Chico. She is passionate about her mission of helping educators continuously improve teaching and learning by gathering, analyzing, and using data.
[00:39]
And we're here today to talk about her new book, Measuring What We Do in Schools: How to Know If What We Are Doing Is Making a Difference.
[00:48] Announcer:
And now, our feature presentation.
[00:50] SPEAKER_01:
Dr. Bernhardt, welcome to Principal Center Radio.
[00:52] SPEAKER_00:
Thank you, Justin. It's my honor to be here.
[00:54] SPEAKER_01:
Let's jump right into the core message of the book. Obviously, we're no strangers to measurement in education. We want to know if what we're doing is making a difference. But what's the core of your message in the book, Measuring What We Do in Schools?
[01:08] SPEAKER_00:
Well, my core message to educators is this: all of us are capable of using data to improve the work we do every day and to know if what we're doing is making a difference. In fact, if we take the time to really look at the data and continuously improve on that basis, the results can be profound. Teachers can become more effective in their teaching, students can become more engaged in their learning, and the school can make a collective impact on both teaching and learning.
[01:36] SPEAKER_01:
Well, Victoria, I know a couple of weeks ago on the podcast we had John Hattie, who's known for saying, know thy impact, and who's known for doing meta-analysis studies of different instructional strategies, different approaches to teaching, and knowing the impact that those strategies have, and who really has the message that we as educators should know the impact we're having. But I think a different lens on that topic of knowing your impact is knowing the impact of the programs that you're putting in place in your school. And I know personally, as a principal, there were probably half a dozen different programs that we adopted or implemented to varying degrees of fidelity during my time as principal, and often the implementation keeps us so busy that we don't stop and take the time to assess how we're doing with that implementation and really evaluate the program.
[02:27]
And a lot of my academic background, my higher ed training, is in program evaluation and understanding how organizations can assess programs. Often that's done through outside partners. But I'm interested in this idea of internal evaluation, that we, within a school, can look at a program that we've put in place or that we're working to implement and really collect some good data and ask ourselves some good questions about how that program is working. So I wonder if you could frame that issue of program evaluation within the school context for us a little bit and talk about what schools can do to assess the programs that they're implementing.
[03:05] SPEAKER_00:
I think, Justin, you hit the nail on the head with one of the challenges staff face in looking at whether what they're doing is making a difference. And that is, a lot of staff members think, first of all, that they can't do the evaluation themselves, that you have to go outside in order to have a valid evaluation or understanding of whether what we're doing is making a difference. So in the book, Measuring What We Do in Schools, there is a program evaluation tool that can help staff look at their programs. And one of the first things it does is help you set up the program to be implemented with integrity and fidelity.
[03:44]
In other words, implemented the way it was intended to be implemented, which is very, very important because one of the first things that staff find when they try to evaluate a program is that each staff member implements it in a different way. And so then you go, I don't really know. It depends on this and that. So you cannot really evaluate a program that you cannot describe. So this program evaluation tool can help you set up a program to be implemented in the same way. And if you can set up that description to be implemented in the same way, you can also monitor that implementation.
[04:22]
and then come back and evaluate it at the same time. So the description actually sets up your evaluation. What is the purpose of this program? What are the outcomes? In other words, what should you expect if you're implementing with integrity and fidelity? And those outcomes should really lead you to the data that you would need to bring to bear to understand if it's making the difference you want to see for all students.
[04:48] SPEAKER_01:
So it kind of reminds me of the idea of beginning with the end in mind, the outcome that we're trying to create for students. And I know it's easy to get swept up in the details of implementation and the logistics of implementation, but I really appreciate your point that often it can look 25 different ways if you have 25 different people within a school implementing it. So within that program evaluation tool that you mentioned that's in the book, what are some of the key questions that you focus people's attention on? So you mentioned, you know, what is this program supposed to accomplish? And then let's talk a little bit more about the kinds of data that are useful to collect, because often we might look at the techniques that people are using as instructional leaders. We might go into classrooms and try to observe, collect evidence that people are implementing some of the strategies.
[05:32]
But at the program level, there's more happening behind the scenes. There's more that we need to be paying attention to organization-wise that can slip by unnoticed if we're not really looking for it. So what are some pointers that you have for us as to what to look for in evaluating a program?
[05:47] SPEAKER_00:
Yeah. You know what, Justin? I start out with the needs assessment. I start out with the baseline, to find out what the data are showing us about the need for the program or process. And that's the same kind of data we'll be looking at for the results as well. For that kind of data, I would be looking at the demographics.
[06:05]
You know, who do we have as students? How has that population changed? You're starting to see the needs of the people in the program. I would also look at perceptions of students, staff, parents. I would look at student learning results, and I would also look at processes. What are we doing to help these students learn?
[06:24]
Very often, evaluation means looking at student learning results only. If we only look at student learning results, the kind of program that results from that data is usually an intervention or an after-school program, a way to fix the kids. If we're looking at all of our data, all of a sudden we can see the system the student is learning in. We understand what they're thinking, who they are, what we're doing to get those results, and the impact of what we're doing. Then we would, of course, look at the purpose of the program. We find that all staff can go to the same professional development and come back and implement it in different ways.
[07:07]
Unless somebody says, this is the purpose of that program for this particular school, and these are the outcomes. And when it comes to outcomes, when we use program evaluation, I like to think of those outcomes as huge, not just tiny little outcomes. "I want to see a 2% increase" is not enough.
[07:28]
I want to see that if we're really meeting the needs of the students with this strategy for math, not only will their math scores improve; their attendance will improve, they're going to be more engaged, and their behavior is going to improve. And I think we've got to think about all of the things that could possibly happen if we really implement that program the way we're intending to implement it. Then we have a chance of seeing all of those results. If you don't lay out all those outcomes at the beginning, there's no way you're ever going to see all those extra results or the huge benefits of that program.
[08:06] SPEAKER_01:
Yeah. And starting with that vision, really getting people excited about that vision and clear on what it's supposed to accomplish. And I think it's so true that we lose sight of what specifically we're trying to accomplish in the big picture, because we do get so bogged down in some of those details. Let's talk more about process, because I know a lot of the evaluation process you talk about is about looking at processes, and in some cases, processes that are a little bit hard to measure. What are some of those hard-to-measure processes that schools might need to look at, and how do they go about looking at them?
[08:38] SPEAKER_00:
Well, a lot of schools think that you can't measure processes because they're not quantifiable, but actually everything is measurable. Those hard-to-measure processes would include curriculum, instruction, assessment, sometimes even the environment. And basically, to evaluate those kinds of processes, you just need to dig in and start thinking very logically about what it is you want to see. So with curriculum, what do I want to see? I want to see it building from grade to grade to grade. I want to see a continuum of learning that makes sense for the students. I want to know that it is aligned to standards, and so forth.
[09:21]
And so how would we measure it? Well, I would be looking for a curriculum map. I'd also look at student learning results to see if the standards are being implemented at every grade level and if students are achieving those standards for the next level, and so forth. It can be very logical. And it might not be a number, but it could be something that we've set up to ensure the implementation.
[09:47] SPEAKER_01:
So Dr. Bernhardt, this is, I believe, your 22nd book, is that correct?
[09:51] SPEAKER_00:
Yes.
[09:51] SPEAKER_01:
I have to say, there must have been a pattern that you were seeing in the schools that you were working to support, the schools that you were studying through your research. What were some of the most glaring needs that you saw that told you this was the next book you needed to write? After 21 other books, getting to this topic of measuring what we do in schools and program evaluation, what was it that really jumped out at you from your work in the field that spoke to the need for this book?
[10:15] SPEAKER_00:
That's an interesting question, because even though this is my 22nd book, I thought this topic would be the first book I ever wrote. It would be about program evaluation. Over time, I've been working with schools on looking at all of their data so that they could build a system that makes sense for the students they have, and so that they could be nimble and adjust along the way, continually meeting those needs. Unfortunately, the one thing that over time has still kind of perplexed me is that schools still want to look only at student learning results. Now, they're getting measured on that, but in some schools, something like 86% of what needs to change you can see in the demographic data.
[11:01]
Now I use very extensive demographic data, but the other types of data as well are extremely important for helping you see how you're getting your current results. And it's very obvious what needs to change to get different results. I'm a little disappointed that there aren't more systems set up to regularly review all of the data in schools.
[11:27] SPEAKER_01:
And I know we've become accustomed to reporting on data, you know, annual test scores based on, you know, different demographic groups of students. Certainly there's so much more to pay attention to, to kind of assess the overall health of the organization and how students are doing. What are some other big data sources that we might not think of? Maybe we're not required to compile a particular report on a particular type of data, but what have you found to be fruitful for analysis that might get overlooked?
[11:54] SPEAKER_00:
Well, you know, I would systematically build a data profile for each school, even just looking at the demographics. When you start looking at who the students are by gender and ethnicity over time, you start seeing that maybe the population is changing, but maybe our processes aren't. And also the number of kids in special ed, the number of kids with behavior issues, the number of kids who are in gifted and honors, and so forth. One of the things staff think about demographic data is that demographics don't change. But actually, when you look at all that demographic data, you can see that it does change.
[12:34]
And it often changes when the leader changes, when the principal changes. And so what that is, is a philosophy. You can see who's assigned to special ed. You can see who's allowed to be gifted. You can just see how we treat kids in the demographic piece. And that blows away a lot of people.
[12:54]
The other data that is often not looked at is the process data. So what I have staff do is make a list of the different programs and processes they have related to curriculum, instruction, assessment, and environment, and just getting them to agree on what those are, and then looking at the duplication. So when they make this list on, like, columned paper, they say, well, how do you evaluate it? And I say, well, don't you start evaluating it by looking at it? Like, for example, differentiated instruction. So let me ask you this.
[13:33]
Is everybody implementing it in the same way? And they'll say no. So then I have them highlight in green those programs and processes that everybody understands in the same way and is implementing in the same way. In yellow, those things that are important to the vision, but that not everybody is implementing in the same way. The things in pink could be the things that we need to discuss and possibly eliminate. And then red: we really need to delete this program.
[14:05]
And the first time a staff does that, the first question they have is, is there any school that has all green programs and processes? And even in some cases, they ask, is there a school that has any green? And green would be, this is important to our vision and everybody's implementing it in the same way. And so then they know that one of the things that would really impact their system would be if everybody would understand the programs and processes in the same way and implement them in the same way. And then the other type of data would be the perceptual data. If you see that your math scores are not improving, if you're only looking at student learning results, you might add remediation or interventions.
[14:50]
If you look at all the other types of data, including perceptions, you might find that teachers do not feel comfortable teaching math and, from their perspective, are not really qualified to teach math. Students don't like the way they're being taught. You know, you start looking at the solution a little bit differently, and how you improve takes a very different shape.
[15:14] SPEAKER_01:
I always come across this idea of implementation as just something that we do. We need to do this program, we need to do that program, we need to do the new curriculum, we need to do the training on it. And I think often the missing piece in a lot of schools is the opportunity to really learn and to inquire and to say, how does this interface with what we're already doing? How does it relate to and change what we're already doing? One interesting example from our math curriculum adoption is the idea that we needed to let go of some paradigms that were important in our old curriculum but conflicted with our new one. We had a traditional math curriculum at the elementary school, and then we switched to kind of a spiral curriculum, where topics were introduced repeatedly and layered over time. And one of the challenges we ran into right away was this idea of teaching for mastery. You know, if you grew up in the '80s, you knew to teach for mastery, right? You were supposed to teach for mastery.
[16:11]
And we discovered that we had a huge pacing problem with the new curriculum because we were still teaching to mastery. And we had to really sit down as a staff and say, wait a minute, are we teaching to mastery when we introduce this topic here and then we touch on it again nine weeks later? No. We're not. So we've got to let go of that paradigm of teaching for mastery if we're really going to be successful implementing this new curriculum the way it was designed. Because if we hold on to that old paradigm, we're never going to get through the year.
[16:38]
We're going to get stuck on unit two by the end of the year. So things like that, that perceptual data, I really appreciate the mention of that there because often we kind of gloss over teachers' questions, teachers' challenges. If there is a conflict between different things that we're asking teachers to do, often we kind of suppress that as leaders. We try to say, no, no, you can make it work. You can make it happen. And I really think that kind of perceptual data is so critical for success.
[17:04] SPEAKER_00:
Yeah, I think that perceptual data is important. But I also believe that as a staff, we have to agree on definitions like mastery, agree on purpose, and then on what it will look like when it's being implemented. I think that's really, really important. I have evaluated response to intervention, which is probably one of the most complex systems you can put into a school. And if the purpose isn't crystal clear, and what it would look like when it's implemented isn't crystal clear, it is so complex that staff will go back and start thinking that what they're doing already is going to meet the need of this program, or that it's close enough. And then you start seeing results that look like the way they would define it.
[17:55]
Like in one grade level, we could see how that grade level talked about the way they were going to implement response to intervention, and it was about eliminating those way-below-basic kids. And their results showed that. In another grade level, they talked about RTI being, let's move those kids that are just on the bubble up to proficiency. And their results showed that. And then in another grade level, it was, well, RTI is about moving all students up, and you could see that distribution move from every point along the scale. So by grade level, they made their own definition, and that definition was obvious in their results.
[18:40]
So as a school, if we could say, no, the purpose is to meet the needs of every single student, and this is the way we're going to do it, you know, and then we come back and talk about it. But we have to reach every kid no matter where they are.
[18:53] SPEAKER_01:
Some of those things, you know, it may seem like we're on the same page, like we know what we're all talking about. But I really appreciate that point that we've got to look at the definitions that are in use and see if we can get on the same page in basic terms about what we mean by those core concepts. I want to read the quote that you have in chapter one of the book from W. Edwards Deming, father of the quality improvement movement. And this is something that I've quoted often in my courses and in webinars on improvement.
[19:22]
You quote him as saying, as learning organizations, schools are perfectly designed to get the results they are getting now. If schools want different results, they must measure and then change their processes to create the results they really want. Love that quote.
[19:40] SPEAKER_00:
I did, too, because it's true. You know, you're perfectly designed to get the results that you're getting now. It's like, whoa, yeah, I didn't think about that. You know, schools have a theory of learning. This is our best guess on how we can get every single student proficient, because I don't think any school is going to say we're going to try to get 20 percent proficient or 80 percent proficient. They're saying we put this curriculum together, the instruction, the assessment and our environment.
[20:09]
together in this way, because we want 100% of our students proficient. And then our challenge is to go back and say, you know, this is what we implemented. And these are the results that we're getting with that implementation. What do we need to tweak to get better results? Because 80% of what needs to change is us.
[20:29] SPEAKER_01:
Well, Vicki, I really appreciate your perspective on those shared definitions and understanding our systems and understanding what we're trying to accomplish as a school on behalf of our students. If you could wave a magic wand and get all of us, everyone who is in the school leadership profession, to do something in particular, what would that be? What change would you make in our profession if you could wave a magic wand?
[20:52] SPEAKER_00:
Well, if I could wave a magic wand, I would make sure that every single school had a shared vision that drives everything it does. And that vision would not just be a statement. I have never yet seen a statement get implemented. A true shared vision, I believe, has to spell out the agreed-upon curriculum, instruction, assessment, and environmental factors that we believe will impact teaching and learning for our students. It's not just a statement that we agree on or put on our letterhead or bumper stickers. It's like, this is our theory of learning for kids.
[21:30]
This is what we're going to do to get all kids proficient. And then every program or process within that vision would be spelled out in the way we were talking about: the purpose, the way it should be implemented, and the outcomes that we anticipate. And if we can spell that out, we can monitor the implementation and then evaluate it to ensure that we really are getting the results for all students.
[22:01] SPEAKER_01:
So I know your organization, Education for the Future, does a lot of work with school systems, with schools around the country and around the world. So, Vicki, if people want to get in touch with you and learn more about your work at Education for the Future, where's the best place for them to find you online?
[22:17] SPEAKER_00:
The best place to find us is our website, edforthefuture.com. You can sign up for webinars, get on our newsletter mailing list. Contact us, see what we're doing, where we're doing it, and just step in and ask any questions or say hello.
[22:36] SPEAKER_01:
So again, the book is Measuring What We Do in Schools: How to Know If What We Are Doing Is Making a Difference. Dr. Victoria Bernhardt, thank you so much for joining me on Principal Center Radio.
[22:48] SPEAKER_00:
Thank you so much, Justin. My honor.
[22:50] SPEAKER_01:
And now, Justin Baeder on high-performance instructional leadership. So, high-performance instructional leaders, what did you take away from my conversation with Dr. Victoria Bernhardt? One of the big things that stands out to me is the importance of knowing what we're trying to accomplish, knowing what system we're trying to put in place for our students. As Dr. Bernhardt said, if you have one goal, one program being implemented by, say, 50 different people, you may actually have 50 different programs, 50 different models, because everyone interprets things in their own way.
[23:30]
So one of the first things we need to get clear on is: what do we expect to happen? And again, I think we tend to view initiatives as things to just do, to check off, and then to move on from to the next thing. But I want to encourage you to look at your initiatives and your programs as opportunities for continued learning, learning at the organizational level. And we have a program called the Organizational Learning Intensive that will actually walk you through some tools and processes for studying what you're doing as a school and asking yourself those Deming questions. Is the system that we have giving us the results that we want to have? Because the system that we do have is perfectly designed to give us the results that we're actually getting.
[24:15]
And if those results are not the results we want, we've got to understand the system that we have before we can change it. So you can learn more about the Organizational Learning Intensive at principalcenter.com slash intensive. This is a program designed for central office leaders, for heads of school, and for principals with leadership teams. So check that out. I don't know when you're listening to this, but we will have enrollment opening up at some point.
[24:42]
And if enrollment is not open right now, you can get on the waiting list for the Organizational Learning Intensive. And again, I want to encourage you to check out Dr. Bernhardt's work. She's got 22 books on topics related to this, and the latest is Measuring What We Do in Schools: How to Know If What We Are Doing Is Making a Difference. You can check that out at principalcenter.com slash radio.
[25:05] Announcer:
Thanks for listening to Principal Center Radio. For more great episodes, subscribe on our website at principalcenter.com slash radio.