DeArmond & Gross: It's Time to Help Teachers Generate and Use Their Own Evidence on Digital Tools

This essay, part seven in an , previously appeared at , the Center on Reinventing Public Education's blog at the University of Washington Bothell. Here are essays that have been previously published at 蜜桃影视:

Lake: Are We Personalizing Learning for the Students Who Need It Most?


"There are so many digital resources out there, I am lost as to which ones are good.
I usually try things that some of the more technology-knowledgeable people I teach with [use]."
From "Teachers Know Best," Bill & Melinda Gates Foundation, 2015, page 21

Teachers in the personalized learning (PL) schools we visit are using a wide range of digital tools, sometimes picking up and dropping them at a rapid clip, but their decisions about which tools to use generally aren't guided by systematic evidence. Instead, teachers tend to rely on their colleagues for advice. That's understandable, but it means that teachers have little assurance of a product's effectiveness, and students tell us they feel like guinea pigs as teachers cycle through different tools.

Knowing that teachers will turn to their professional networks for advice, an urgent question for the field is: Are there ways to enrich those networks with more systematic evidence on the quality and impact of digital tools?

Since teachers are far less persuaded by research studies on interventions than by their colleagues' own experiences with them, the answer probably doesn't lie in creating a new, massive clearinghouse of products or research studies.

First, some of these clearinghouses already exist. Consumer Reports–style websites like EdSurge and Common Sense Education, for example, cover thousands of technology products across a wide range of subjects and grades. Though some organizations, like EdSurge, provide a "concierge" service to help schools and districts find digital tools, many educators won't have access to such supports, nor will they have the time to pore over multiple websites to find research-based tools.

Second, the K-12 educational technology marketplace is massive and growing at a rapid-fire pace. Stacey Childress, CEO of NewSchools Venture Fund, recently wrote that investments in K-12 technology companies ballooned from roughly $91 million in 2009 to $643 million in 2014. The research community just can't keep up with this dramatic expansion of companies and digital offerings. Products without any research (much less rigorous research) will continue to be available to teachers and find their way into classrooms.

Third, even when rigorous research happens, treating a digital tool as a standalone treatment (akin to a pill) may overlook a critical factor: a technology's effects depend on the interaction between the technology, teachers, pedagogy, and the context in which they all come together. Even if a tidy randomized trial shows positive impacts for a digital tool, it is still important to consider how that tool fits into the entire instructional program of classrooms and schools.

Given all this, helping teachers, schools, and districts learn how to generate and use evidence themselves may be a promising path toward injecting more evidence into decisions about digital tools for the classroom. Several initiatives, methods, and tools already available seem a logical place to start. Examples include:

The Proving Ground initiative at Harvard University鈥檚 Center for Education Policy Research helps districts and charter school networks design and use a deliberate, analytical approach to gathering and using evidence to test digital tools they might adopt systemwide.

The Ed Tech Rapid Cycle Evaluation (RCE) Coach, created by Mathematica Policy Research, gives schools and districts a process to follow and a tool to evaluate educational technology. The tool walks practitioners through a five-step process covering everything from planning an evaluation to summarizing the results. (Program materials say the typical RCE lasts three months from start to finish.)
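At its core, a rapid-cycle evaluation like this boils down to comparing outcomes for classrooms that piloted a tool against comparable classrooms that did not. The sketch below illustrates the kind of headline comparison such an evaluation produces; the scores are hypothetical, and this code is an illustration only, not part of the RCE Coach itself:

```python
import math
import statistics

def rapid_cycle_summary(pilot_scores, comparison_scores):
    """Summarize a simple pilot-vs-comparison evaluation of a digital tool.

    Returns the mean difference in scores and a Welch's t statistic,
    the sort of headline numbers a rapid-cycle evaluation reports.
    """
    m1, m2 = statistics.mean(pilot_scores), statistics.mean(comparison_scores)
    v1, v2 = statistics.variance(pilot_scores), statistics.variance(comparison_scores)
    n1, n2 = len(pilot_scores), len(comparison_scores)
    se = math.sqrt(v1 / n1 + v2 / n2)           # standard error of the difference
    t = (m1 - m2) / se if se else float("nan")  # Welch's t statistic
    return {"mean_difference": m1 - m2, "t_statistic": t}

# Hypothetical end-of-unit scores: one classroom piloting a tool, one not.
pilot = [72, 85, 78, 90, 66, 81, 88, 74]
comparison = [70, 75, 68, 82, 64, 77, 73, 71]
print(rapid_cycle_summary(pilot, comparison))
```

A real evaluation, of course, also requires choosing a fair comparison group and a meaningful outcome measure up front, which is exactly the planning work a structured process like the RCE's five steps is meant to support.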

The Carnegie Foundation for the Advancement of Teaching has created a range of resources. It also hosts an annual summit to help educators and others use a problem- and user-centered approach to learning and improvement, one that leverages rapid testing and networked learning communities, to improve their classrooms and schools.

More homegrown examples are also popping up. In Colorado, districts grappling with PL have formed a network to problem-solve issues jointly. The regional support agency that coordinates the network walks teachers through a Plan-Do-Study-Act inquiry cycle focused on a particular PL problem, with the goal of building this analytic process into their daily work.

At this point, we can't say for sure that these initiatives, methods, and tools really work. But teachers' hunger for guidance and information on digital tools in a fast-changing tech landscape certainly suggests that these and other approaches toward practice-based evidence generation would be useful initiatives for districts and partners to explore in earnest.
