The Illustrated Guide to Visible Learning

About the Author

Dr. John Almarode is a bestselling author and an Associate Professor of Education at James Madison University, where he holds the Sarah Miller Luck Endowed Professorship. He received an Outstanding Faculty Award from the State Council for Higher Education in Virginia in 2021. Before his academic career, John started as a mathematics and science teacher in Augusta County, Virginia. He is the author, often with John Hattie, Doug Fisher, and Nancy Frey, of more than 29 books.

This episode of Principal Center Radio is sponsored by IXL, the most widely used online learning and teaching platform for K-12.

Discover the power of data-driven instruction in your school with IXL—it gives you everything you need to maximize learning, from a comprehensive curriculum to meaningful school-wide data.

Visit IXL.com/center to lead your school towards data-driven excellence today.   

Full Transcript

[00:01] Announcer:

Welcome to Principal Center Radio, helping you build capacity for instructional leadership. Here's your host, Director of the Principal Center, Dr. Justin Baeder. Welcome, everyone, to Principal Center Radio.

[00:13] SPEAKER_00:

I'm your host, Justin Baeder, and I'm honored to welcome to the program Dr. John Almarode. John is a best-selling author and an Associate Professor of Education at James Madison University, where he holds the Sarah Miller Luck Endowed Professorship. He received an Outstanding Faculty Award from the State Council for Higher Education in Virginia in 2021. Before his academic career, he was a math and science teacher in Augusta County, Virginia. And he is the author, often with John Hattie, Doug Fisher, and Nancy Frey, of more than 29 books, including their latest, The Illustrated Guide to Visible Learning.

[00:48] Announcer:

And now, our feature presentation.

[00:50] SPEAKER_00:

John Almarode, welcome to Principal Center Radio. Hey, thanks for having me.

[00:54] SPEAKER_01:

I am thrilled to be here.

[00:56] SPEAKER_00:

I'm excited to talk about the book because this is the latest in a series on visible learning that's intended to, I think, really build our capacity as a profession to understand so much of the research that is out there that often we don't fully act on. And I know this is a visual guide, an illustrated guide to visible learning. So I wonder if we could start just by talking a little bit about the idea of visible learning? Because I think we've all heard of John Hattie's work. We've heard of meta-analysis and effect sizes, but take us into the basic idea of visible learning. Yeah, it's a great question.

[01:28] SPEAKER_01:

Although the title itself offers a hint about the idea behind visible learning, John Hattie started it out as a hobby. And what we do now collaboratively within the team is strive to translate the incredibly large body of research on what works in teaching and learning into something that teachers can use in their classroom on Wednesday or Thursday. In other words, how do we move from research to reality, from intention to integration, from potential to powerful practice? And so meta-analysis seems like a pretty good place to start because it's a study of studies. It collects it all together. So as a classroom teacher, I don't really have to read every study there is on homework or every study there is on phonics instruction or every study there is on fill-in-the-blank of whatever influence.

[02:18]

And so pulling together meta-analysis gives us this big view. But, and I'm going to answer your question, I just have to come in and make a soft landing with this: how do you make sense of the research? That's the secret. So it's not having research, it's how do we make sense of research so that we can utilize it. And what's fascinating is when you pile together all the studies that now make up the Visible Learning database, I mean, we're talking upwards of 3,000 meta-analyses, 100,000 studies. It's just an incredibly large body of research. There is a message that comes through, and the message is this: Learning is most effective when teachers see learning through the eyes of their students and students see themselves as the drivers of their own learning. That's the bumper sticker phrase that sums up all the findings.

[03:11]

Well, what does that mean? Quite simply, it means that when we make thinking and learning visible in the classroom, then we can see it and our learners can see it. And it allows us to make better decisions as teachers, and it allows learners to make better decisions as learners. That's part one of that phrase. Part two is: how do we build capacity in our learners to drive their own learning? Because soon, in the not-too-distant future, we won't be their teachers anymore.

[03:42]

So do students know what to do when they don't know what to do and I'm not their teacher anymore? All of the findings seem to scream that message. And so John then captured it into our slogan, what we say all the time: when we see learning through the eyes of our students and students see themselves as the drivers of their own learning. That's what it means to make teaching and learning visible. That is visible learning.

[04:08] SPEAKER_00:

Well said. Thank you for that kind of summary and introduction. One other topic that I wanted to bring up early on is the idea of effect sizes, which can be positive or negative. And often there are these kind of awkward decimals that we're not really...

[04:22]

quite sure how to interpret. So take us into that idea of effect sizes. That's kind of a through line to all those meta-analyses and a lot of the practices that get a number attached to them. That number is the effect size typically, right?

[04:34] SPEAKER_01:

At the risk of oversimplifying, what I'm going to do is take a very simplified approach to effect sizes, because there are those who devote their entire careers to the statistical analysis and study of effect sizes. But essentially, for the purposes of our conversation, we can say that effect sizes measure change: change between time one and time two. So we measure learning at some time in the year, and then we measure it again and we look at the difference. It can be the difference between two interventions. We did this intervention here and we did this intervention there.

[05:10]

And this is the difference between the learning growth with those two interventions. We can do groups. We did something with this group and something with that group. And this is the difference. And so effect sizes show growth or difference. Growth in terms of time, but difference in terms of interventions and groups.

[05:28]

And what it allows us to do is make relative comparisons to other things tested at different times, with different interventions, in different groups. I'm always nervous about this next statement because it's kind of true, but it's a bit of an oversimplification. Effect sizes are based on standard deviations. And what that means is it allows us to take an apple and an orange and compare them. Because oftentimes, as you know if you're listening to this podcast, in education we compare apples and oranges. We compare two things that are in completely different contexts with completely different learners.

[06:07]

And that can get risky because our classrooms do have differences between them and amongst them. What an effect size does is allow us to mathematically compare apples and oranges and make relative comparisons across influences, interventions, and groups. And so Cohen's d is the effect size calculation used in visible learning. It's one of the ways to calculate effect size, but it's growth or difference.
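To make the arithmetic behind that concrete, here's a minimal sketch of the standard Cohen's d formula in Python. The two groups and their scores are entirely made up for illustration; this is the general statistic, not anything pulled from the Visible Learning database.

```python
import statistics

def cohens_d(group_a, group_b):
    """Cohen's d: the difference between two group means, expressed
    in units of their pooled standard deviation."""
    n_a, n_b = len(group_a), len(group_b)
    mean_a, mean_b = statistics.mean(group_a), statistics.mean(group_b)
    # Sample variances (n - 1 in the denominator).
    var_a, var_b = statistics.variance(group_a), statistics.variance(group_b)
    # Pool the variances, weighting each by its degrees of freedom.
    pooled_sd = (((n_a - 1) * var_a + (n_b - 1) * var_b) / (n_a + n_b - 2)) ** 0.5
    return (mean_a - mean_b) / pooled_sd

# Hypothetical post-test scores for two classrooms (made-up numbers).
intervention = [78, 85, 82, 90, 88, 76, 84, 91]
comparison = [72, 75, 80, 78, 70, 74, 79, 77]

d = cohens_d(intervention, comparison)
print(round(d, 2))  # a d of 0.4 would sit right at the hinge point discussed below
```

Because d is expressed in standard-deviation units, the same threshold applies whether the raw scores were percentages, scale scores, or rubric points, which is what makes the apples-to-oranges comparison possible.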

[06:35] SPEAKER_00:

There's a magic number that indicates the kind of the tipping point between something that is, you know, a quite good practice and something that's either, you know, kind of neutral or actually even harmful. Explain that number for us a bit.

[06:47] SPEAKER_01:

So we call it the hinge point in the visible learning work. Our team refers to it as the hinge point. Let me set it up before I throw the hinge point out there, because I think it's important and empowering to think about it this way. For the longest time, we used zero as the hinge point. In other words, anything that we did in our schools and classrooms, if it was positive, we celebrated it. And if it was negative, we stayed away from it.

[07:13]

And what the research has uncovered, the Visible Learning database being one of the big contributors to this, is that about 95 to 97% of everything we do is positive. There's only this small subset of influences that are negative, and we know what they are: bullying is negative. Boredom is negative. Television is negative. Drugs, negative. And so, okay, that's fine.

[07:38]

But all of these other things are positive. So maybe our threshold is too low, because if I'm a learner and I spend 180 days in a classroom, then I should expect one year's worth of growth, not just anything positive. And what we found is that, well, zero is too low a threshold, because the impact teachers have on learners is far beyond just positive. It can be quantified into a year's worth of growth. So number one, zero is too low a threshold. It doesn't represent the value added by teachers, the quality that teachers offer to learners.

[08:19]

So when you put in all the influences from visible learning, it turns out the average effect size of the influences is four-tenths of a standard deviation, or 0.4. And so in the visible learning work, what John and the rest of the team will say is that 0.4 is the average effect size of everything in the database. Anything above 0.4 has the potential to accelerate learning.

[08:46]

Anything below 0.4, it still moves learning forward, but not as much as those above 0.4. And then, of course, 0 and negative still do not accelerate learning. So that hinge point is 0.4.

[08:58]

It's the average effect size of everything in the database. So then we can talk about influences as being above average or below average, and then we can also talk about those that are negative.

[09:09] SPEAKER_00:

So it is the hinge point for that relative comparison. Having that hinge point allows us to compare what we're doing to kind of default practices or just students maturing as time passes, as the school year unfolds, with the idea that we can kind of stack the above 0.4 practices or the best practices that we could and avoid the negative practices entirely. Maybe do less of the below 0.4 practices if we have a better alternative and get dramatically better results. Is that how it works?

[09:44]

Is that a real thing that we can do? Yes and no.

[09:49] SPEAKER_01:

Certainly, there are those effect sizes that tell us, maybe you should look here in terms of interventions, approaches, strategies, or influences. But it doesn't mean we wipe out anything that's below 0.4. What it does, though, is give us information on the potential of that particular influence to move learning forward. And so the question then is: is it worth the time, investment, and energy to do that? Or is there something else I can do that has the potential to move learning forward at a much higher rate?

[10:21]

And so the message is never to go through and create a list and hang it on the wall and say, all right, in this school, we only do these top 10. No, because there are some strategies, influences and approaches that are not above average, but they're a vital part of what we do. And they're there. We just have to be aware that when it comes to accelerating student learning, they don't have the highest potential return on investment that another one does. And so yes and no. The other thing I would say, and certainly not to hijack your podcast, but you notice I've used the word potential a lot.

[10:56]

This is one of the things that we talk about in our group quite often. What gets us in trouble sometimes in classrooms is that we often treat research findings as guarantees. This researcher says this strategy showed this impact, so if I go do it on Tuesday with my literacy block or in my third-block class, then I'm guaranteed to get that result. That is not only untrue, but it sets us up for unproductive failure. What the research tells us is that this intervention, approach, or strategy has been documented as having a potential influence on learning.

[11:36]

The secret sauce is then the expertise of the teacher in implementing that strategy in their own classroom and making adaptations based on that local context. So the effect sizes tell us the potential, but the expertise of the teacher in integrating and implementing it in their classroom is what turns that potential into a powerful practice.

[11:59] SPEAKER_00:

So it's not just the kind of mindless stacking or racking up of things that are on the list. It's kind of like, you know, if I wear all of my clothes at once, I'm not going to be ready for swimming and skiing and hiking. It has to be purposeful, right? Exactly.

[12:15] SPEAKER_01:

That's a great way to think about it, so I will certainly quote you on that one. Yes, they do not stack. And if I use six of them that are above 0.4, I can't add them up and expect my exit tickets to show an effect size of 2,336.8.

[12:33]

That's just not going to happen. Yeah, that's a great way to think about it.

[12:41] SPEAKER_00:

Yeah. And I've seen some claims, not using visible learning, but using other approaches, of just unfathomable growth, like multiple years of growth in a couple of weeks. And I think we always have to be skeptical of claims like that, claims that rely on some sort of stacking or some sort of extrapolation from a little bit of rapid growth. And I'm so glad that you highlighted the intelligent use of these strategies by teachers, knowing that, you know, sometimes we're going to do things that maybe are not the best in the world, but there's a practical reason to do them. What I wanted to ask about was lecture, because lecture, if I recall, has a not-super-impressive effect size, but sometimes we do need to give lectures to some extent.

[13:29]

Tell us what you found about lecture.

[13:32] SPEAKER_01:

You're going to love this. So when we update the database, because meta-analyses are produced all the time. And so every year we update the database with the latest research publications to zero in on the effect size as it is today. What the research has said, what the research does say, but not what the research will say in the future. Lectures. All right, here we go.

[13:55]

It's negative. It's negative now. It wasn't negative before, was it? Was it positive at one point? It was positive. And part of the challenge is we often make lecture and direct instruction synonyms.

[14:10]

We make those two concepts synonymous. And that's not true. I mean, direct instruction is a very clear model of instruction. By the way, it has the highest effect size of any model of instruction on record. Direct instruction: anticipatory set, objectives, a mini-lesson, guided practice, feedback, independent practice, and then you loop through. But lecture in and of itself has a negative effect size. And that's because at no point in a lecture, remember the original quote from Visible Learning in the beginning, at no point during a lecture do we ever get to see learning through the eyes of the students, because they're sitting there passively. Nor do we give them a chance to drive their own learning, because they're sitting there taking in information. So lectures are negative. There is zero evidence out there that shows that lecture has any positive effect on learning.

[15:02]

Now that we've drilled down to definitions of lecture and can sort the studies better, we have a pretty clear picture. There's zero evidence that lectures work. What does work is maybe a three-to-five or five-to-seven-minute mini-lesson, as long as it's followed with, back to direct instruction, guided practice, cooperative learning, feedback, independent practice. But then in that case, that's not lecture. So with lecture, I would feel comfortable saying there's just, oh, this one is always dicey, because I find myself sitting at a university right now as we speak. So then they might say, well, my students learn.

[15:41]

Yeah, because they leave our lectures at the university and go back to their dorms. They go back to the library. They go back and join their study groups and they use other strategies that improve their learning. And so they learn in spite of us. They learn in spite of us, which is why one of the biggest predictors in college success is the student's study skill toolkit. Because if a student doesn't know how to learn in spite of a lecture, it's going to be a rough run.

[16:08]

Those that do, they survive it.

[16:11] SPEAKER_00:

Now, correct me if I'm wrong on this, but I want to indulge my curiosity and make sure I understand this correctly. Part of the reason that lecture has likely tipped from slightly positive into negative territory, to me, it seems like part of the reason must be that we have now found better things to do in comparison. If lecture was the only strategy we had, it couldn't have a negative effect size because that would be in comparison to no alternatives. Am I thinking about that the right way?

[16:39] SPEAKER_01:

You are. And I would add another thing to the list, and that is we got better at measuring what actually was working. Because what we found out was, oh, these learners sit through lectures, but they seem to be learning a whole bunch. Oh, my gosh, we should ask them what they do outside of the lecture. And then when we control for that, we find out that's what was having the influence, not the lecture. So we get better at methodologies.

[16:59]

We get better at the research. We get better at targeting really what we're after. I mean, it's just like in medicine or engineering or biochemistry. As technology and tools and understanding advances, we get better and better at the research. And education is no exception to that. It's the difference between biting on a piece of wood for surgery and using anesthesia to block the pain.

[17:25] SPEAKER_00:

It's just growth in the field. Yeah, interesting. And there's a moral urgency to that, right? There's an ethical obligation that we have as educators to use practices that are as good as they can be. It's not ethical to do surgery without anesthesia if you have anesthesia available, and we're in an age where there's not really an excuse for not knowing what best practices are. But I wanted to ask about a problem of definition, because I know this is a challenge both for researchers who conduct meta-analyses and in your work combining meta-analyses. If you say lecture, or direct instruction, or questioning, or any concept, there's the challenge of defining that term and potentially having to combine studies that define those practices a little bit differently. I want to ask first how that challenge shows up on the research side, but then also how it shows up on the practitioner side. Because as practitioners, one of the challenges we face is that you can say, everybody go do X, go do direct instruction, to 10 people, and those 10 people will all be doing completely different things, and they can't all have the same effect size.

[18:33]

So help us think about that issue, both in terms of definitions on the research side and understanding on the implementation side.

[18:39] SPEAKER_01:

I love that question, because there are some criticisms of the visible learning research, and some of them are rock-solid criticisms. And then some of them are misunderstandings about what is actually at our fingertips in the visible learning research. Some of those criticisms we really don't respond to, because it's someone who has grabbed hold of a single sentence or a single influence and just run with it. But there are some criticisms of it that are no different than those of any research study. No research study is perfect. No research paper is without its flaws or limitations.

[19:19]

And visible learning is, of course, no exception. One of the things that we do to counteract that, and John has done this since the very beginning, and now we do this together as a group: if you go to the Visible Learning database, which by the way is free to the public, there are a couple of things we do to try to sort through that. Number one, we provide a definition of the influence that we are using. When we say lecture, this is the definition we're using to define lecture. And so we're very transparent in how we're defining something.

[19:51]

So if you were to look up access to mobile phones, that definition is different than using mobile phones or using handheld devices in learning. We have to distinguish those because the effect sizes are very different. Access on its own is negative, but having the device available in a learning situation is positive, because I can quickly look something up. And so we are very clear in our definitions, and we publish a list of definitions of influences. And by the way, at the bottom there's a contact-us button, and people can reach out and say, hey, I don't agree with your definition, and we engage in that dialogue. So we open it up to public access and discussion.

[20:31]

The second thing we do is we provide the list of all of the meta-analyses that contributed to that effect size so that anyone... listening to this podcast, you could go on there and pull them down, do the same thing we did and see if you arrive at the same effect size. And then we invite you to hit the button, contact us and engage in that dialogue. So what am I trying to say?

[20:54]

Just like any other researcher in any field, transparency about methodology and definitions is the best we can do in the sciences. I mean, in physics, an electron is an electron is an electron. But in our world, a third grader is not a third grader is not a third grader, right? So we go for transparency, we open it up to the public, and we encourage that type of interaction. When it comes to practicality, this is the fun one. And this is one of the goals of our team.

[21:24]

We are now heavily into the translation part of it. And that is, okay, if this is the definition and this is the effect size, how do we support teachers, leaders, superintendents, and policymakers to interpret and translate it? The visual guide, or the illustrated guide, is one of our biggest steps forward in that. Because what we try to say is, look, it has to do with the way you think about what you're doing. In fact, that's another huge finding in the Visible Learning database and the research: it's not what we teach or how we teach, but how we think about our teaching and our leading and our learning.

[22:01]

And so there's a particular part in that illustrated guide called evaluative thinking. That's the type of thinking an individual engages in when they're translating research into reality. So to answer your question more succinctly: part of visible learning is building a mindset, or a mind frame, of, okay, I want to use this intervention. Jigsaw has an effect size of 1.20. But how I translate it in my classroom is going to look fundamentally different than in yours.

[22:32]

But if I think about it in the way of, all right, what are my learners ready for and what evidence tells me that? What intervention am I going to use? Jigsaw. How am I going to implement it? How am I going to evaluate to see if it worked? How am I going to collect that evidence and interpret it?

[22:47]

So it's building the capacity to engage in evaluative thinking so that we can make adjustments along the way as we implement something, moving from intention to integration. And so from the practical standpoint, it's the expertise of the teacher, and building the capacity to make those fine-tuned adjustments and gross adjustments, or course corrections, that need to happen for my unique setting.

[23:12] SPEAKER_00:

That's such a better vision than what I think we often do, which is kind of collect buzzwords. And I remember that feeling as a teacher, as a principal, like you want to say the right things. You want to be on the right bandwagons, but sometimes we oversimplify that and we say, well, I'm just doing, here's the list of things I'm doing. I read them on a list. I saw them in a book. I heard them at a workshop.

[23:33]

So I'm just going to stack up as many of those buzzwords as I can. And I really appreciate the way you articulated the nuance of implementing. And I wanted to ask about the format of the book in particular, because this book is somewhat unusual among books for professionals in that it has an illustrator. Talk to us about who that illustrator is and what she did and how that shows up in the book. She's amazing.

[23:54] SPEAKER_01:

And she takes thinking and translates it into visual images along with the copy or the narrative that goes with it to help us make sense of those complex ideas. I mean, imagery has its own effect size. It's a little bit above 0.5. And so we know that images help. Imagery has a high effect size.

[24:13]

It's much better to put a picture up, say a photograph from Dorothea Lange, and have students ask questions, comment, and discuss it: 0.5. Classroom discussions, 0.82. Asking questions, 0.59.

[24:25]

I mean, so there are all these different ways. So we thought, well, let's create an illustrated guide, because at the end of the day, the visible learning work can be incredibly dense and overwhelming. And so the illustrated guide had two purposes. Number one, distill it down into its core ideas. And number two, make it accessible as an entry point for anyone who wants one. Now, you can't stop there.

[24:49]

You just dig deeper where you have the greatest need or desire or interest. And so the format was to provide a visual representation that went alongside the concepts, the ideas, and I guess the theory behind the work: enough information to get going, but then also where to go next in terms of direction. It's why it's broken down into the big message, the four key concepts or four key messages of visible learning, and then what we would call our signature practices, things that you would see in schools and classrooms that were truly implementing the work. One of those, of course, being evaluative thinking, which is the adapting of the work based on the local context. And so the illustrated guide, I mean, that illustrator is phenomenal and has done several of our books, the playbooks,

[25:38] SPEAKER_00:

to bring those concepts to life. Yeah. And tell us a little bit about how you work together. So Taryl Hansen is the illustrator, and I know she does a lot of live note-taking in workshops, right? Like she'll take notes for a group. Is that right?

[25:53] SPEAKER_01:

She will. And so at the annual visible learning conference that we are just coming off of this summer in San Diego, she was there and would visually represent many of the sessions. When we're working on books, what we do is we send her the narrative, literally. And I wish I could share my screen with you all because I would let you see it. I mean, it is literally a stream of consciousness about concepts. She draws it and then we fill in the narrative and tighten it down with our copy editors and our editor.

[26:26]

And that's how that works. Now we go back and forth because there are times she'll represent something visually and we'll be like, well, that's really not... That visual is going to confuse it or that visual doesn't go far enough or she'll do something and our response will be, oh my gosh, that's amazing. We go change the narrative to better align with her drawings.

[26:46]

And so she helps our thinking just as much as she helps bring it to life for those outside of the visible learning work. I'm jealous. She's awesome. And I would be remiss if I didn't point out she's also an outstanding human being.

[27:03] SPEAKER_00:

One thing you mentioned that I wanted to ask more about is the signature practices that are embedded in the book. Tell us a little bit more about those.

[27:09] SPEAKER_01:

So when you go to the doctor, the doctor practices medicine; a lawyer practices law. They have this collection of things that they do, but what they pick and how they do it depends on the individual sitting in front of them or the particular case that's being handed to them. And so what we thought would be helpful is to take the four key messages from the visible learning research. Now, seeing learning through the eyes of the students and students as drivers of their own learning, that's the theme. The four key messages are things like students should drive their own learning; culture first, learning second; know thy impact; and collective responsibility. Those are the four big messages, the four main pillars of visible learning. But then there are all these practices that you would observe in some form in a school or classroom that truly was implementing the research.

[27:58]

So one of them is belonging and classroom culture. If we're really going to implement what works best, we are going to ensure the signature practice of promoting belonging and creating positive classroom culture is in action every day. Now, what it looks like specifically depends on the individual school, but that signature practice would be somewhere. Clarity is another one. Do students know the what, the why, and the how of the learning?

[28:24]

Feedback would be there. Moving learning through three different phases, surface, deep, and transfer, would be there. So signature practices are those practices that are going to be visible each and every day in schools and classrooms that are translating what works best into action. But again, what it looks like in your environment may be fundamentally different than what it looks like in mine; the underlying research is still the same. Those are the signature practices. There are 11 of them, and that sums up, in a nice little tight package, those upwards of 3,000 meta-analyses and 100,000 studies representing over 400 million students in the Visible Learning database.

[29:11] SPEAKER_00:

Love it. Certainly more research than we could hope to read in our lifetime. So it's valuable that we have the meta-analyses, and certainly the work that you and your team have been doing over many years to make this research more accessible, and now to make it quite visual as well. The book is The Illustrated Guide to Visible Learning: An Introduction to What Works Best in Schools. John Almarode, thank you so much for joining me on Principal Center Radio. Thank you for having me.

[29:36]

This was great. I appreciate your time.

[29:39] Announcer:

Thanks for listening to Principal Center Radio. For more great episodes, subscribe on our website at principalcenter.com slash radio.
