Giving Compass' Take:
· Writing for The Learning Accelerator, Kevin Hoffman discusses his work as an Agency Fellow with the Harvard Center for Education Policy Research's Strategic Data Project, using blended learning data to identify the programs that are most productive and worthwhile for students.
· How do districts choose which blended learning programs to adopt? How can schools implement them effectively?
· Read more about blended learning and its top trends.
“For every 20 minutes a student spent [on our software], their MAP score increased by 2.5 points.”
“Ensure that students see two years of reading growth [if they use our program].”
“Our study has found that use of this math program with fidelity has roughly the positive impact of a second-year teacher versus a first-year teacher.”
In the world of managing education technology, I’ve seen quotes like these time and again. (In these cases, respectively, in a white paper on a vendor’s site; in a solicitation email sent directly to a principal; and while sitting in a meeting of a research collaborative.) I can’t condemn them; in a crowded marketplace selling to an increasingly data-driven consumer, I tend to take these statements at face value. I do believe that the folks studying the impact of education technology – whether for for-profit vendors or at third-party research institutions – have done so with integrity and are using reasonable data to share these results. However, I’ve seen that marketing such binary results (it works, or it doesn’t) often leads practitioners to believe that education technology tools can be a panacea – plug-and-play solutions that can solve larger problems. As educators, we must push beyond the binary lens and instead rigorously examine how these tools, alongside and in service of positive teacher practices, can be most effective in improving student outcomes.
Let’s take the last quote as an example:
“Our study has found that use of this math program with fidelity has roughly the positive impact of a second-year teacher versus a first-year teacher.”
This quote, as mentioned, was shared at a research meeting a couple of years ago. The study covered over 200,000 students nationwide enrolled in public districts and charter organizations, and it represented a remarkably positive result in support of the math program in question. And yet, my Aspire teammates and I saw almost no correlation between our own students’ success in this program and their growth on our state assessment. While this data was incredibly compelling on a national scale, advocating for continued use on the strength of these national results didn’t feel right given what we knew about the practices and the results in our own schools.
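Hoffman's broader point is that districts can test vendor claims against their own data rather than relying on national results. As a purely illustrative sketch (the file name, column names, and metrics below are hypothetical and are not taken from the article or from Aspire's data), a district analyst might run a quick local check along these lines:

```python
# Minimal sketch (hypothetical data): checking whether success in a purchased
# program actually tracks growth on the state assessment for local students,
# instead of relying on nationally reported results.
import pandas as pd

# Per-student records exported from the program's dashboard and the district's
# assessment system, already joined on a student ID. The file and column names
# ("minutes_on_program", "lessons_mastered", "state_growth") are illustrative.
df = pd.read_csv("program_usage_and_growth.csv")

# Correlation between in-program success metrics and state-assessment growth.
for metric in ["minutes_on_program", "lessons_mastered"]:
    r = df[metric].corr(df["state_growth"])  # Pearson's r
    n = df[[metric, "state_growth"]].dropna().shape[0]
    print(f"{metric}: r = {r:.2f} (n = {n})")
```

A rough check like this is only a starting point, not a full evaluation, but it can surface the kind of local disconnect Hoffman describes before a district commits to renewing a program.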
Read the full article about blended learning by Kevin Hoffman at The Learning Accelerator.