[00:01] SPEAKER_02:
Welcome to Principal Center Radio, bringing you the best in professional practice.
[00:06] Announcer:
Here's your host, director of the Principal Center and champion of high-performance instructional leadership, Justin Bader. Welcome, everyone, to Principal Center Radio.
[00:15] SPEAKER_01:
I'm Justin Bader, and I'm thrilled to be joined today by Scott Genzer. Scott is president and CEO of Genzer Education Consulting, a firm that helps K-12 schools analyze their data and mine it for insights.
[00:29] Announcer:
And now, our feature presentation.
[00:31] SPEAKER_01:
Scott, welcome to Principal Center Radio. Thank you, Justin. It's a pleasure to be here. So, Scott, I have to ask, how did you get to be the education data mining guy? What happened in your career that brought you to that point where this was your full-time work to help schools look at their data?
[00:47] SPEAKER_00:
Well, it's been a long process. I originally graduated as an engineer many years ago, went immediately into education, and became a math teacher and a science teacher. Eventually I became head of department, got into administration, and was a school principal for quite a while. And while I was a school principal, I noticed there was a lot of data coming across my desk. As a math teacher and as an engineer, I naturally started to analyze it and found fascinating insights about the students in my school. And in 2013, when I decided to change careers, I realized that this was a very keen interest of many school administrators, and that launched my consulting business.
[01:27] SPEAKER_01:
Well, Scott, I know many principals feel that sense that we're kind of swimming in data. We have data coming to us all the time, and we're often not sure what to do with it once we have it. What do you see as some of the key opportunities that data provides to us to be kind of that genie in a bottle that we can ask things of?
[01:47] SPEAKER_00:
I think the most important thing that principals can do with their educational data is help kids learn and provide a more personalized learning experience for children. Unfortunately, the current uses of school data are not for that purpose. They're used for regulation purposes. They're used for government purposes. Unfortunately, they're even used to fire teachers. But I think the best use of data is to identify kids who are doing well and challenge them further, identify kids that are doing poorly, and ideally, identify weaknesses and strengths in your school program before they become issues on a principal's desk.
[02:23] SPEAKER_01:
Right. And it's that idea of kind of triangulation between quantitative and qualitative data. And if I hear what you're saying correctly, if there is a problem, it will eventually come to us in a qualitative way. And if we can get ahead of that with some quantitative data before people start banging on the door and saying, hey, Justin, this is a problem. If we can kind of see that coming, then we'll be in a much stronger position to respond with a more proactive stance. What are some of the things that you've seen schools do in response to data?
[02:54]
Or do you have kind of an example that comes to mind of... Maybe some data that a school wouldn't typically pay attention to or wouldn't know what to look for that ended up with some productive decision making and some productive action as a result of the insights from that data?
[03:10] SPEAKER_00:
Absolutely. One story I like to tell is about a school I was working with that asked me to look at their reading data. They had heard anecdotally, across the principal's desk and through the parents, that everybody seemed to be concerned about reading. So I did a longitudinal analysis of all the reading data and triangulated all the data: external tests, teacher grades, teacher comments. I took all those data, looked at them, and realized, first of all, that there did seem to be a reading problem. Then I dug a little further and noticed that it happened to vary by grade level.
[03:43]
There was a gap that was growing, and that gap was by gender. Once I triangulated gender in addition to grades and external testing, I noticed that the gap was widest in the high school and then narrowed down through the middle school. The real source of the reading, quote unquote, issues seemed to be in fifth grade; I was able to narrow it down to fifth grade by gender. Before fifth grade, there were no reading issues that I could see from the data. And, of course, correlation does not mean causation.
[04:19]
So at that point, I went back to the school, worked collaboratively with them, and said, hey, why don't you poke around fifth grade and see what's going on there? Well, sure enough, the superintendent walked into a series of fifth-grade classrooms. As I'm sure you're aware, elementary schools have lots of little mini libraries in these classrooms, and that's where the kids get their books. And it turns out, unbeknownst to anyone, almost all of those books were very girl-centric. The teachers were women, and they just happened to be ordering those books.
[04:50]
Most teachers have complete autonomy in what they order. I don't think it was anybody's fault, and it was nothing that anybody had noticed. But we think it's quite likely that, just interest-wise, the girls were grabbing those books and diving in, and it was affecting the reading scores. We don't know this for a fact. But let me tell you, that superintendent went out, gave a little bit of money to the fifth grade, and said, can you please buy some books about trains and cars?
[05:17]
And we're going to go and now track that and see if that actually was the cause of this issue or maybe a part of this cause. We never know for sure in data mining. All we do is see patterns. Sometimes those patterns...
[05:28]
go nowhere. But sometimes we can really affect student learning with very simple things, like buying books about trains.
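The grade-by-gender triangulation Scott describes can be sketched in a few lines. This is an illustrative example with invented scores and column names (not the school's actual data or Scott's actual tooling), assuming student results sit in a pandas-style table:

```python
# Illustrative sketch with invented data: find where a reading gap by
# gender first appears across grade levels.
import pandas as pd

# Hypothetical student records (grade, gender, reading score).
scores = pd.DataFrame({
    "grade":   [4, 4, 5, 5, 8, 8, 11, 11],
    "gender":  ["F", "M", "F", "M", "F", "M", "F", "M"],
    "reading": [210, 209, 215, 204, 228, 214, 245, 226],
})

# Mean score by grade and gender, then the F-minus-M gap per grade.
by_group = scores.groupby(["grade", "gender"])["reading"].mean().unstack()
gap = by_group["F"] - by_group["M"]
# In this toy data the gap is near zero in grade 4, opens up in grade 5,
# and widens through high school -- pointing investigation at grade 5.
```

The analysis only locates where the gap opens; as Scott stresses, the cause (here, the classroom libraries) has to be found by going and looking.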
[05:34] SPEAKER_01:
Well, Scott, I love that example that you just shared. And I feel like it's a great example of leadership being guided by data, but not kind of pushed along by data. I wonder if you see that issue playing out where rather than just getting data and being forced to do something as a result of it, leaders can actually use that data to inform their own decision-making, to guide them as leaders without putting the cart before the horse. How do you see that playing out in schools?
[06:03] SPEAKER_00:
I see this playing out all the time in schools, particularly in the United States, but in other countries as well. It's very unfortunate that school leaders are being forced to look at data in a very reactive way and to be driven by data results. Their statewide tests are below a certain threshold, and therefore they are forced to act on it. That causes a whole vicious cycle of teaching to these statewide tests, putting aside the arts curriculum, and all these other harmful effects. To me, that whole thing needs to be turned around. The questions that need to be asked need to come from the teachers, from the schools, and then the data should be used to inform student-level and school-level decision making.
[06:47]
The questions should always be coming from the educators, not from the data. And that's really the key difference between using data, as I say, as a force for good, rather than a force for evil.
[06:59] SPEAKER_01:
Well, Scott, I know we have a lot of different types of data that we can look at, both in terms of overall school performance, kind of summative student performance, student growth over time. What do you see as some of the most important types of data for us to look at, particularly in terms of individual students?
[07:18] SPEAKER_00:
I think the most important forms of data are the ones that show growth over time rather than absolute measures. Children come in lots of different shapes and sizes, and they're all wonderful, and they all have various strengths and weaknesses. As a teacher, more than anything else, I always wanted to make sure my kids grew learning-wise. I was a math teacher, and I had all these students, and I wanted to make sure that by the end of the school year they had learned more math: that if they were a super math kid, they learned even more, and if they struggled, they grew as well. So growth measures, in my opinion, are far more important than absolute measures.
[07:57]
And there are ways to do that, both with internal data in a school, looking at teacher grades and teacher comments using text mining, and with external measures. For example, one particular external assessment that I like is the NWEA Measures of Academic Progress, which actually gives you normative growth measures, which I think are far more important than normative percentile measures.
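To make the growth-versus-absolute distinction concrete, here is a minimal sketch with made-up fall and spring scores (the students, columns, and numbers are all invented, not from any real assessment):

```python
# Growth vs. absolute measures, with invented fall/spring scores.
import pandas as pd

df = pd.DataFrame({
    "student": ["A", "B", "C"],
    "fall":    [230, 180, 205],
    "spring":  [233, 195, 214],
})
df["growth"] = df["spring"] - df["fall"]

# Student A has the highest absolute spring score, but student B grew the
# most -- an absolute lens and a growth lens rank these kids differently.
top_absolute = df.loc[df["spring"].idxmax(), "student"]
top_growth = df.loc[df["growth"].idxmax(), "student"]
```

The point of the sketch: ranking by the spring score alone would celebrate student A and overlook that student B, who started furthest behind, made the most progress.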
[08:22] SPEAKER_01:
Yeah, I was going to ask about that, because I'm familiar with the MAP assessment, Measures of Academic Progress, and the RIT scores. I know we're getting into a lot of specific acronyms here, but there's that idea of growth being something we want to measure apart from the curriculum-based assessments, or apart from the other standardized assessments students are taking, just to have a sense of how much progress students are making. And those are very difficult things to determine. I know there's some controversy about MAP in particular, because the way it works is quite technical, and it's not necessarily curriculum-linked in the way that, you know, a chapter test is. Talk to us about that relationship between, say, a curriculum-based assessment.
[09:10]
Let's say I'm teaching a unit, and I'm giving assessments that are, you know, modified from what the publisher of, say, my math curriculum provides, and I'm also giving the MAP assessment, which gives us this kind of bizarre score, and we have these growth points, and it's a little bit outside of what we normally deal with on a day-to-day basis in the classroom. Could you talk to us about how those work together?
[09:33] SPEAKER_00:
Yes, absolutely. And I would say this is probably the number one complaint that teachers have with external assessments. I don't think most teachers actually have a problem with external assessments, except for really two things: they take up class time, and they don't necessarily link with what the teachers are doing in the classroom. And to me, this really boils down to the difference between accuracy and precision in measuring learning. Let me be very specific about this. Standardized tests, and MAP is a great example, are all in this category: they're very precise measurement instruments. Think of when you go to the doctor's office: they can measure your blood counts to the nearest microliter, and so on. They're very precise instruments. But they're not necessarily accurate; they're not necessarily measuring what the teacher is doing in the classroom. On the other side of that coin, you have teachers who are measuring kids' learning all the time, and they're giving assessments
[10:25]
that are measuring exactly what the child is supposed to be learning. So they're very accurate, but they're not precise. They're not well designed. The assessments that teachers generally use are not precise instruments. Their results can vary tremendously. And grading, internal grading particularly, is very subjective.
[10:44]
So you need both, in my opinion. You have external tests, which are very precise but not accurate, and you have internal measures, which are accurate but not precise. A lot of what I do is work with schools to help teachers improve their internal measurements and make them more precise, so outsiders don't have to rely so much on standardized tests. That's why stakeholders look at standardized tests: because they don't trust the internal measures that teachers are giving.
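Scott's accuracy-versus-precision distinction can be simulated directly. A sketch, under the assumption that a standardized test behaves like a low-spread but biased measurement and a teacher assessment like an unbiased but high-spread one (all numbers here are invented for illustration):

```python
# Accuracy vs. precision, simulated: a student's "true" skill is 100.
import numpy as np

rng = np.random.default_rng(0)
true_skill = 100

# Standardized test: precise (low spread) but inaccurate (biased off target).
standardized = rng.normal(loc=92, scale=1.0, size=50)
# Teacher assessment: accurate (centered on target) but imprecise (high spread).
teacher = rng.normal(loc=100, scale=10.0, size=50)

more_precise = standardized.std() < teacher.std()
less_accurate = (abs(standardized.mean() - true_skill)
                 > abs(teacher.mean() - true_skill))
```

With these assumed distributions, the standardized scores cluster tightly around the wrong value while the teacher's scores scatter widely around the right one, which is exactly why Scott argues you need both kinds of measure.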
[11:14] SPEAKER_01:
I believe that. And as a principal, I can say, you know, we always have these kind of hunches about different teachers and kind of think, all right, I see this person working just incredibly hard and doing these really innovative things and, you know, just incredible things happening in the classroom. You know, but, you know, a certain type of test isn't going to show a big difference with that. And when we started to get those map scores, and I would see in a particular classroom, we'd have, you know, 13 points worth of growth from one test to the next, which is quite a lot. That was very validating and very illuminating in a lot of ways, because I could look at that and say, this was something that we weren't going to get any other way, simply because the measurement is so difficult. So I wanted to ask a question now about the kind of cycle that you recommend and that you actually help schools with on kind of a continual basis or on a monthly basis to look at their data and make sure that they're getting the insights that they should be getting from the data that they're already going to the trouble to collect.
[12:20]
What do you do with schools as kind of an external partner to help them get more out of their data?
[12:26] SPEAKER_00:
Yeah, I work with a variety of schools. Most of them are small schools, the ones that really can't afford a full-time data coordinator or an assistant superintendent in charge of data. Big districts have these people in place; I work with a colleague in Oak Ridge, Tennessee, and that's his full-time job. So I generally work with smaller schools that don't have the resources to have somebody with my training or my colleague's training. The cycle is really very, very iterative.
[12:51]
There's a standard onboarding process where we have to literally dig through file cabinets and old hard drives and goodness knows what to find all the data we possibly can. Then we load it into a system; we have very good tools now, technology-wise, so that's not the hard part. The hard part is finding good questions. And that really starts this long, iterative process where I form a relationship with schools and ask, what are you looking at? I had a phone call yesterday with a school that I've just started working with, and they said, you know what, we're getting all this anecdotal feedback from parents about math and science.
[13:27]
They are very concerned about this. So can you look into math and science? And so I'm going to do a general data mining where I simply throw everything against the wall and see what sticks. And I'm going to go and throw back to them some correlations I find. They're going to look at those correlations and say, ah, some of those are really interesting. Some of those don't interest us at all.
[13:46]
Some of those do match what we're hearing from the ether here. Can you dig a little deeper? Then I dig a little deeper. I do more data mining. I throw it back to them. So there's this back and forth cycle then that begins.
[13:58]
And we work together, helping to understand the school and understand the kids, and we continually add data to the cycle. So to me, that's the right formula: this collaborative process that uses the data to help administrators and teachers help kids.
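The "throw everything against the wall and see what sticks" step Scott describes maps naturally onto a correlation scan. A hedged sketch with a hypothetical student-level table (column names and values invented); the output is a ranked list of leads for educators to question, not conclusions:

```python
# Correlation screening over a hypothetical student-level table.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "math_grade":    [70, 75, 80, 85, 90, 95],
    "science_grade": [68, 74, 79, 86, 88, 96],
    "absences":      [12, 10, 8, 5, 4, 1],
    "lunch_period":  [1, 2, 1, 2, 1, 2],
})

# All pairwise correlations; mask the diagonal and rank pairs by |r|.
corr = df.corr()
mask = ~np.eye(len(corr), dtype=bool)
leads = corr.where(mask).stack().abs().sort_values(ascending=False)
# Strong pairs (here, grades vs. grades and grades vs. absences) go back
# to the school to "dig a little deeper"; correlation is not causation.
```

In this toy data, math and science grades correlate strongly with each other and with absences, while lunch period correlates with nothing, which mirrors Scott's point that some surfaced patterns interest the school and some go nowhere.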
[14:16] SPEAKER_01:
It really sounds like an inquiry cycle or like a continuous learning cycle.
[14:21] SPEAKER_00:
Exactly. It's no different, really, than what happens in a classroom. I think all of us educators are teachers at the end of the day. I feel like I'm still a teacher. But I think the wonderful thing about this is that it is this...
[14:37]
collaborative process. And it's not some top-down thing that says, listen, your students are not meeting this particular bar. You must react to it.
[14:46] SPEAKER_01:
Well, Scott, if people want to find more about what you do to help schools get more out of their data, where can they find you online?
[14:52] SPEAKER_00:
My website is genzerconsulting.com, G-E-N-Z-E-R consulting.com. And there's information there. Or they can just email me directly. My email address is scott@genzerconsulting.com.
[15:04]
Fabulous.
[15:04] SPEAKER_01:
Thanks so much. And thanks for joining us on Principal Center Radio. Thank you, Justin.
[15:08] SPEAKER_02:
It's been a pleasure to be here. And now, Justin Bader on high-performance instructional leadership.
[15:14] SPEAKER_01:
So, high-performance instructional leaders, what did you take away from Scott's comments about how we use data in schools? I really appreciated the way Scott emphasized seeing data not as something that drives our decisions, but as something that informs them. And I think all too often we receive data passively and just say, OK, well, here's what the data is telling us to do.
[15:38]
When often what the data is telling us to do depends on what questions we're asking of it. So I believe high-performance instructional leadership involves asking good questions and then either interrogating the data we have, or going out and seeking the data we need to answer those questions in productive ways, giving ourselves rich data and realizing that sometimes the richest data isn't quantitative at all. We don't necessarily need to give another assessment to learn more about, say, how our students are writing; we might just need to look more closely at the data we already have. Now, one thing I want to mention as kind of a growth edge for a lot of us as instructional leaders is that data is technical.
[16:23]
We have to deal with spreadsheets and reports and statistical concepts. And there is absolutely nothing wrong with seeking out the expertise in your math department, especially if you're at the high school level, or in your central office, or, if you're in a smaller school that doesn't have a central office, seeking out someone like Scott to help with the more technical aspects of this. As leaders, we do need to keep perspective at a higher level, and we don't always have time to learn every in and out of software like Excel or data analysis software. So there is nothing wrong with going where the expertise is: bringing those leadership questions, those school-level decision-making questions, and those inquiry processes to a support provider, and getting some assistance so that we can go through a useful learning cycle and make the changes we need to make to serve our students.
[17:16] Announcer:
Thanks for listening to Principal Center Radio. For more great episodes, subscribe on our website at principalcenter.com/radio.