The 74 – America's Education News Source

Lessons from a Failed Texas Tutoring Program /article/lessons-from-a-failed-texas-tutoring-program/ Mon, 10 Nov 2025 11:30:00 +0000

By the fall of 2021, predictions of steep declines in students’ learning due to pandemic school closures had come true. Gaps between the highest- and lowest-performing students were widening.

That’s when a large suburban school district in Texas, flush with COVID relief funds, signed a contract with a virtual tutoring provider to deliver extra help to students in 28 schools who had fallen below grade level. Research showed that high-dosage tutoring could produce significant gains for students and was far more effective than on-demand models.

But the district’s program failed, according to a recent study from Stanford University’s National Student Support Accelerator, which studies and works to expand effective tutoring. Students even lost ground in reading and would have been better off with “business-as-usual” support, like small-group instruction or using a computer program for extra practice.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


Experts view the findings as a cautionary tale of how tutoring can go wrong.

The district had to wait on background checks for tutors, many students were still chronically absent, and the tutoring sessions often conflicted with other lessons or special events. As a result, students didn’t receive the 30 hours or more required under a state law mandating tutoring for those who failed the annual state test. Instead of five days a week as planned, 81% of the students attended tutoring three or fewer days, and most students worked with a different tutor every time they attended a session.

The findings reinforce the importance of protecting the time students are supposed to receive tutoring, said Elizabeth Huffaker, an assistant professor of education at the University of Florida and the lead author of the study.

High-dosage models — featuring individualized sessions held at least three times a week with the same, well-trained tutor — can still “drive really significant learning gains,” she said, “but in the field, things are always a little bit more complicated.”

For parents, the Stanford study can help explain why children might not make gains, even when their district offers extra help, said Maribel Gardea, executive director of MindShiftED, a nonprofit advocacy group and network of about 5,000 parents in the San Antonio area. Despite the billions states received in relief funds, many students still haven’t reached pre-pandemic levels of performance.

“We knew that high-dosage tutoring was one of those things that was proven,” Gardea said.  “There was research, but we never saw those results.”

She urges districts to include parent groups like hers in planning tutoring and choosing providers. But she added that too many parents are unaware their children are behind, much less equipped to judge whether a program is set up for success. 

“The trust has been lost for such a long time,” she said. “Parents just send their kids to school and they hope for the best.” 

‘It’s logistics’

The results add to a growing body of research at a time when tutoring has shifted from being viewed as an emergency stopgap to an ongoing teaching strategy, according to a report released last week by Whiteboard Advisors, a consulting organization.

The authors’ interviews with state and local education leaders, researchers and tutoring providers showed that while many schools lean toward in-person tutors, “effective virtual models persist” in many districts. Going forward, they expect more schools to use tutoring as a pipeline for recruiting and training new teachers.

Districts have learned a lot about tutoring since that first full year back after school closures, one in which they saw staff shortages, record levels of absenteeism and disruptive behavior from students. Several states have passed legislation to support tutoring or provide at least some short-term funding to keep programs running now that federal relief funds have expired. Some districts are designing contracts that reward tutoring providers with more money when students pass tests or make other significant gains.

Recent survey data show an increase since December 2022 in the share of schools offering high-dosage tutoring, from 37% to 42% — especially in the South. But the results of the Stanford study show that just giving tutoring a high-dosage label doesn’t mean students will receive the help they need.

“It’s logistics,” said T. Nakia Towns, chief operating officer at Accelerate, which funds research on tutoring and other recovery efforts. “You have to have the scheduling. You have to have the identification of the students.”

High mobility, absenteeism

To encourage the tutoring provider and the Texas district to participate in the study, the researchers didn’t identify them. But an official with the district, who spoke on background, told The 74 that one reason tutoring didn’t start until the middle of the school year was that leaders waited for winter test data to ensure they were selecting students who needed the most help.

The state required tutors to pass federal background checks, a process that added delays, and it took time to find bilingual tutors and those with special education experience. Students who were furthest behind academically “were also the same students who had high mobility or high absentee rates,” the official said. 

School assemblies interfered with the tutoring schedule, and some principals, the official said, were less supportive of virtual tutoring in general. Now, he said, the district offers in-person afterschool tutoring as one option, but also builds intervention time into the school day for all students.

Tutoring during school hours increases the chances that students will actually get the service, but the model creates some challenges, Huffaker said. Tutoring is now “competing with other instructional practices during the school day.” 

That includes lessons that teachers are presenting to the whole class and don’t want students to miss, the district official added.

Recent findings from another tutoring study, conducted by researchers from the University of Chicago and MDRC, provide further proof that the more tutoring students receive, the greater their gains. But the “bad news,” according to the researchers, was that students often didn’t receive as much tutoring as originally planned.

“Conversations with the operators suggest schools felt they simply had too many competing demands on limited instructional time,” the authors wrote.

Recent research from the University of Chicago and MDRC reinforced the finding that the more tutoring students receive, the greater the learning gains. (University of Chicago/MDRC)

Another takeaway from the Stanford study is the “critical role” of relationships between tutors and students, said Rahul Kalita, co-founder of Tutored by Teachers, a virtual provider with a network of over 6,800 certified teachers. In one of its largest client districts, students are approaching pre-pandemic levels in reading, and nearly 70% of third graders passed a reading test this year required for promotion to fourth grade.

Without “consistent, human-to-human connection,” Kalita said, results will be similar to on-demand “edtech tools” that researchers have found to be ineffective.

‘Start with the curriculum’

Not only did Texas students not receive enough tutoring, the research team also found a weak relationship between their sessions and the material they needed to know for tests. Tutors covered about a third of the math standards and only about half that share in reading.

But this is an area where some tutoring companies have shown improvement, said Towns, with Accelerate. More successful providers, she said, “really start with the curriculum,” and hire experts with “deep knowledge around literacy or math.” 

Studies now show that remote tutoring can be just as effective as in-person programs, Towns said. That’s why she encouraged districts not to give up on virtual models.

“Coming out of the pandemic,” she said, “everybody was just like, ‘Let’s try anything. Anything is better than nothing,’ and in fact that’s not true.” 

Experience Shows High-Dosage Tutoring Provides Lasting Impact for Student Success /article/experience-shows-high-dosage-tutoring-provides-lasting-impact-for-student-success/ Fri, 02 Aug 2024 10:30:00 +0000

This article was originally published in Maryland Matters.

When schools closed in 2020 due to the COVID-19 pandemic, the impact was deep and long-lasting. In Maryland schools, test scores fell to an all-time low, particularly in math.

In 2021, counties received funds to provide high-dosage (intensive) tutoring to students to close gaps caused by school closures. This funding ensured that students consistently engaged in targeted, supplemental instruction at least two to three times per week for 30-45 minutes per session.

In fall 2021, the Reach Together Tutoring Program (RTTP), a partnership program of the George and Betsy Sherman Center at the University of Maryland, Baltimore County, began collaborating with Baltimore City Public Schools to provide high-dosage tutoring that helps students access and master rigorous, grade-level mathematical concepts.




The partnership was not new. In fact, UMBC staff and students have long worked with educators to not only support professional development and community programming, but also to educate, develop, and place UMBC graduates in teaching positions in Baltimore through the Sherman Scholars Program. Our growing partnership with city schools, ESSER (Elementary and Secondary School Emergency Relief Fund) funding, and our access to college students, allowed us to scale our previous efforts.

The program supports students in second through eighth grade who are selected based on diagnostic assessment scores. RTTP participants scored in the bottom quartile, which equates to two or more grade levels below where they should be. Tutoring occurs during the school day utilizing the “personalized learning” block, in order to minimize disruption to the core curriculum.

What makes RTTP unique is the hiring of UMBC students as math coaches. Math coaches work with a small group of students two to three times a week during the academic year for approximately 24 weeks. Using an acceleration model, coaches focus on high-leverage foundational skills that align to grade-level content. They receive extensive preservice and ongoing training highlighting cultural competency, mathematical mindsets and student engagement.

Our mission is simple: “We will facilitate purposeful math experiences that enhance each student’s math identity and accelerate their learning trajectory.”

In 2021, we were in four Baltimore City Schools serving 355 students and had 85 UMBC math coaches. Fast forward to today and we just completed our third year of programming in nine Baltimore City schools (Arundel Elementary, Cherry Hill Elementary Middle School, Lakeland Elementary Middle School, Westport Academy, Park Heights Elementary, Dickey Hill Elementary Middle, Fallstaff Elementary Middle School, Bay Brook Elementary Middle and Curtis Bay Elementary) serving 644 students.

Since 2021, UMBC math coaches have completed 45,586 tutoring sessions. This spring we partnered with the city schools to increase capacity and serve more students, with a focus on grades six through eight. We are looking forward to expanding to 10 schools in school year 2024-25.

Is it working? We partnered with faculty from UMBC’s Public Policy and Education departments to complete a two-year program evaluation. Results indicate that RTTP participants made greater test-score and percentile gains from the beginning of the year to the end than nonparticipants. Student survey data indicate that 85% of students felt more confident in math after participating in RTTP, with one eighth grade student from Cherry Hill saying, “I could get help, and if I got it wrong, they didn’t put me down.”

But there’s more. RTTP has not only supported students in Baltimore City, but has created a lasting impact and shifted career trajectories for UMBC students. Math coaches are undergraduate, graduate, and doctoral students from all majors, races, genders, and ethnicities.

We increased from 85 math coaches in school year 2021-22 to over 165 in school year 2023-24, when more than 1,100 UMBC students applied to be a math coach. Candidates from the Sherman Scholars Program participate in RTTP as part of their academic learning experience, giving them a hands-on opportunity to engage with students prior to beginning their teacher internship year.

Over the last three years, we have had several math coaches decide that they wanted to become teachers. They earned a master of arts in teaching and are now teaching in schools where they tutored.

Rehema Mwaisela is one such scholar who, after her first year as a math coach in her junior year at UMBC, said, “Before I was math coach in Baltimore City, I thought I wanted to be a mathematician, or just keep with math in grad school, but now I know my place in math is empowering Baltimore City scholars as much as I can with mathematical knowledge.”

She now teaches at Westport Academy. RTTP has created an exciting space where community-engaged scholarship and partnership intersect, and the impact is complex and far-reaching.

Maryland Matters is part of States Newsroom, a nonprofit news network supported by grants and a coalition of donors as a 501(c)(3) public charity. Maryland Matters maintains editorial independence. Contact Editor Steve Crane with questions: editor@marylandmatters.org.

As Relief Funds Expire, Harvard’s Kane Says ‘Whole Generation’ Still Needs Help /article/as-relief-funds-expire-harvards-kane-says-whole-generation-still-needs-help/ Wed, 07 Feb 2024 21:46:53 +0000

Harvard University researcher Tom Kane stood before a captive audience at Washington’s Omni Shoreham hotel last Wednesday, just hours after dropping the report everyone was talking about.

Offering the clearest look yet at students’ recovery from pandemic learning loss, the report showed that students actually made impressive academic gains last school year. But achievement gaps grew wider during the pandemic, and students in some high-poverty districts performed worse than they did before COVID.

“There’s a whole generation of kids, especially in poor districts, that are half a grade level or more behind still and are going to need extra help,” he said.

The crowd, composed of some of the nation’s top tutoring providers and researchers, wondered what they should do next. 




His answer satisfied few. Despite the high stakes and the imminent end of federal relief funding, many schools still don’t know which interventions are working. As states and districts rushed to hire tutors and sign contracts, many failed to record which programs helped students the most. 

“It is amazing that the systems that we entrust with managing our own children’s learning are terrible at learning themselves,” he bluntly told attendees at the event, organized by Accelerate, an organization that works to scale high-dosage tutoring. “It is so frustrating to hear those questions being asked now when the federal dollars are about to run out.” 

Those dollars — $122 billion from the 2021 American Rescue Plan — expire at the end of September. At a time when the research shows many students are still far behind, the U.S. Department of Education is giving states a chance to spread out use of remaining funds until March 2026, especially if they use the money to reduce absenteeism, provide intensive tutoring and extend learning time. But Kane said states should also seize the opportunity to better track which recovery strategies are helping students the most.

“I don’t mean to complain about water under the bridge, but let’s try to think of this going forward,” he said. 

Education department officials say they’re trying. Under new reporting requirements, all districts will have to provide more details on how the funds were spent. Previously, districts had to show whether they provided summer learning, afterschool programs or tutoring to address learning loss. Now they’ll have to report how much they’ve spent on those areas as well.

Districts also have to report how many students participated in high-dosage tutoring and “evidence-based” summer and afterschool programs and whether they came from traditionally disadvantaged groups such as low-income students, English learners or students with disabilities. And if states want to apply for an extension, they’ll need to submit a letter explaining how they would use the funds to reach the neediest students. 

“We do want to know more from states and from districts about how they’re putting these dollars to use to support academic recovery,” Roberto Rodriquez, an assistant secretary at the U.S. Department of Education, told The 74. “Are we investing in some of these evidence-driven strategies?”

Roberto Rodriquez, the U.S. Department of Education’s assistant secretary for planning, evaluation and policy development, answered questions from Janice Jackson, chair of the board at Accelerate, at the organization’s conference on high-dosage tutoring. (Accelerate)

‘Students won’t have caught up’

Kane cited a previous lack of “federal leadership” on collecting such information and said states were hesitant to impose additional requirements not mandated by the 2021 relief fund law.

“States were in the back seat, watching districts make decisions on how to spend the money. They’ve been slow to get in the front seat,” he told The 74. He urged federal officials to “publicly challenge states” to continue recovery efforts. “As the recovery dollars are tapering down, it’s clear students won’t have caught up.”

States had about $53 billion remaining in American Rescue Plan funds as of last November. Rodriquez said the department has received a lot of interest from states in extensions, but no applications yet.

Even if they don’t get more time to spend the funds, districts still have this summer to focus on students who are furthest behind, Kane said. He recommended that states require districts to inform parents whether their children are below grade level in reading and math and then serve all who sign up for summer school.

Most parents are “fairly removed” from discussions about relief funds, said Bibb Hubbard, founder and president of Learning Heroes, a nonprofit that explains achievement data to parents. But she said they shouldn’t be misinformed about whether their children are far behind.

“They often think that’s someone else’s child, not their own,” she said. The new research, she said, reinforces how important it is that “parents know exactly where their children are academically at the end of the school year.”

The Harvard study was conducted in partnership with Stanford University sociologist Sean Reardon. The district-level results show that students made up  a third of the learning they lost in math and a quarter of the loss in reading. This was more than students typically gained in a year prior to the pandemic. Alabama, for example, saw the most improvement in math and was the only state to exceed pre-pandemic achievement levels. 

Three states rebounded past 2019 performance in reading: Illinois, Louisiana and Mississippi. Black students made more progress between 2022 and 2023 than white and Hispanic students, but the achievement gap between white and Black students was still larger last year than it was before the pandemic. 

Despite the growth, most students performed below 2019 achievement levels, especially in high-poverty districts. In six states, the gap between high- and low-poverty districts grew wider in reading between 2019 and 2023. 

Virginia was one. 

“We were struggling to catch up, much less get a step ahead,” state Superintendent Lisa Coons told The 74. She added that officials “expect persistent learning loss.”

To supplement declining relief funds, the state added money last fall for tutoring, improving literacy and reducing chronic absenteeism. While she said her state would likely ask for an extension, she wants districts to move away from a “buffet” of initiatives and choose programs that fit the effective models outlined in a new state resource, which provides details on how to choose students for tutoring and fit sessions into the school schedule.

“We need to continue to prune,” she said, “and work on the things that we know are showing results for our students.”

More Students Need Great Tutors — But Here’s Why Our Tutoring Moment Could Fail /article/analysis-why-this-tutoring-moment-could-die-if-we-dont-tighten-up-the-models/ Tue, 21 Nov 2023 13:30:00 +0000

In a new Aspen Economic Strategy Group paper, Jonathan Guryan and Jens Ludwig argue schools are bungling the rollout of high-dosage tutoring: “When schools are faced with the possibility of change, they tend to do fewer of the hard things that will help students and more of the easier things.”

Schools won’t change the schedule; they redeploy would-be tutors as aides making copies, and so on. It’s troubling. And a string of skeptical headlines isn’t helping, either.

So what happens next?  

In a March column in The 74, Kevin Huffman warned: “I worry that policymakers will pretend high-dosage tutoring is happening at scale and then, when student outcomes do not measurably improve, declare that it hasn’t worked.”




So what’s the answer for scaling up at quality? Proven good models need to become great, so that when they scale and inevitably dilute, they “merely” retreat back to good. We must make it easier to be a good or great tutor. And that requires unusual “within program” research and development. In a recently published essay, the Overdeck Foundation’s Pete Lavorini made that very case, noting there are “a number of exciting innovations underway to lessen the implementation burden without sacrificing effectiveness, by adjusting the high-impact tutoring ‘formula.’ ”

Let me describe what tutor innovation looks like in real life. First, you need decent scale. When I started Match Tutoring in 2004, we had 45 tutors (some living literally inside the school). My friend, economist Matt Kraft, wrote in The 74 how measuring that program’s impact launched his career studying tutoring. But 45 people is just not enough educators to easily A/B test “what works for individual tutors.”

Last year, I met a math educator, Manan Khurma, who founded a math tutoring company in India called Cuemath, with 3,300 tutors. I asked whether I could, with a few colleagues, carefully try new ideas to see what works for his thousands of students across the world. Manan said yes; he was interested in anything empirically valid that made tutoring better.

Scale, check.  

Second, you need a “problem of practice.” We zoomed in on a common problem, familiar to many educators: student talk!  Some kids, especially if confused, are reluctant to speak up, to share what they’re thinking. Common Core and the National Council of Teachers of Mathematics both emphasize the need for math discourse, but teacher training in this area hasn’t led to kids speaking up more.  

How to change this?  

My colleague Carol Yu wondered if a Fitbit type device — a “Talk Meter” — might help, or would it annoy kids, or teachers?

We started small, enlisting a few kids and tutors to try a prototype. An AI bot would patrol a tutorial and, roughly 20 minutes in, a little box would pop up on the screen. It told teacher and student what the talk ratio was, just like a Fitbit offers your step count when you glance at it. If either party was talking too much, they’d adjust.

The early signals were promising! So we ran a rigorous randomized control trial with 742 Cuemath teachers and enlisted some research help from Stanford’s Dora Demszky. This is often a third step: enlist a scholar to bolster your measurement efforts.

The results were strong. In a forthcoming journal article, Dr. Demszky will describe the full experiment, but the punchline is student reasoning increased by 24%, and the talk ratio converged on 50-50 between kid and tutor — exactly what we wanted. Tutors asked better questions, and “built” on what kids said.  Both students and tutors liked the Talk Meter (it led to lighthearted, warm interactions as well). Introverts particularly improved.    

Fourth, you can layer experiments on top of one another. One we’re trying now is whether one-on-one coaching would build on TalkMeter success.  

Should other programs build their own TalkMeters or tutor coaching efforts? That’s not our claim (though when I shared the TalkMeter result with friends leading other prominent tutoring organizations, several said “OMG — we should do this.”) There’s a key distinction that matters for scale. A technology intervention like TalkMeter is context specific. And a human intervention like coaching is talent specific.

I learned that lesson 14 years ago. We launched a teacher coaching program in New Orleans with a wonderful educator named Erica. I enlisted Matt Kraft to measure it. He found large gains for teachers. Then we added coaches. The impact was diluted — a finding he wrote about.

The point here is that high quality experiments, often in partnership with scholars, can help specific program models vault to greatness, as a way to counteract inevitable dilution at scale.   

While we co-sign on the Guryan/Ludwig desire to “push” schools to do hard things, we also should make hard things easier, to have “good” impact by combining “great programs” with “merely solid” execution. (Of course, nothing can overcome shoddy execution).  

That’s the only way this high-dosage tutoring movement will survive and expand. 

Harvard Economist Offers Gloomy Forecast on Reversing Pandemic Learning Loss /article/harvard-economist-offers-gloomy-forecast-on-reversing-pandemic-learning-loss/ Thu, 14 Jul 2022 11:15:00 +0000

Two years of debate had raged over the scope and severity of COVID-related learning loss when, this spring, Harvard economist Tom Kane contributed some of the most compelling evidence of the pandemic’s effects on K-12 schools.

Along with collaborators from Dartmouth, the CALDER Center at the American Institutes for Research and the nonprofit testing group NWEA, Kane released a report incorporating pre- and post-pandemic testing data from over 2 million students in 49 states. Its conclusion: Remote instruction was a “primary driver of widening achievement gaps” over the last several years, with schools serving poor and non-white students suffering some of the greatest setbacks.




Overall, Kane and his co-authors found, high-poverty schools were more likely than others in the same district to stay remote throughout the 2020-21 school year; among all schools that stayed remote for longer, students at high-poverty schools showed much worse declines in math scores. And they calculated that some school districts would have to spend every dollar of their federal COVID relief money on academic recovery efforts to have any hope of making up the lost ground.

As Kane observed in The Atlantic, local education authorities are required to use only 20 percent of those funds on pandemic-specific remediation. And there is sufficient reason to doubt that even the most promising educational interventions, such as personalized tutoring, can be delivered at the necessary scale to reverse the damage inflicted by COVID. Even the Biden administration’s recently announced campaign to recruit 250,000 new tutors and mentors is at least several months away from being fully realized.

Kane, the faculty director of Harvard’s Center for Education Policy Research, has spent decades carefully evaluating the effectiveness of school improvement efforts. A Council of Economic Advisors staffer during the Clinton presidency, he has studied school accountability systems, teacher recruitment policies, and the effects of affirmative action throughout long stints in both academia and think tanks like the Brookings Institution. His research on teacher evaluation inspired a half-billion-dollar initiative launched by the Bill & Melinda Gates Foundation to lift classroom performance in large school districts around the country.

Now he’s hoping to work with state and district leaders to combat an educational disaster whose effects, he says, are still not well understood. While policymakers may now have a loose idea of the challenges facing educators and families, the policies they’re currently reaching for will likely prove inadequate as a solution.

“Once that sinks in, I think people will realize that more aggressive action is necessary,” Kane said. “In the absence of that, it’s hard to blame local folks for not taking more aggressive action because they have no way to know that what they’re planning is nowhere near enough.”

This interview has been edited for length and clarity.

The 74: How do your findings and research design differ from earlier studies that have looked at pandemic-related learning loss? I’m thinking specifically of last year’s study conducted by, among others, Brown University’s Emily Oster, which also pointed to really steep losses associated with the switch to virtual learning.

Thomas Kane: There are at least two ways that this paper is different. The first is that we’re able to estimate magnitudes of losses in a way that’s comparable to the effect sizes of [educational] interventions. In that [Oster] study, they can focus on changes in proficiency rates on state tests. Each state has its own cut score, so the magnitude of the changes in proficiency rates depends on whether that cut score is near the middle or near the tail of the test score distribution. If my cut score is near the middle, even a small decline in achievement can mean a big swing in proficiency. But if my cut score is at the tail, even a large decline in test scores can show up as a small change in the percentage of people who are proficient. 

‘Right now, there’s no package of efforts that I’d be confident would be enough to close the gap. Absent that, it’s no wonder that politicians aren’t willing to invest political capital in it.’

So while that study could qualitatively describe what was happening — in areas that were remote for longer periods, proficiency rates declined — they really couldn’t characterize the magnitude of the decline in any way that was comparable to the effects sizes, which I think is critical. As we’ve argued, it’s not at all surprising that there were larger losses in places that were remote for longer periods. It’s the magnitude of the losses that’s startling.

This design also lets you make comparisons within districts, as well as between districts, right?

That’s another big difference between our paper and what’s out there now. The [Oster] paper was focused on district proficiency rates, and what they found was that districts with larger shares of minority students and high-poverty schools had larger losses. But it could have been, for instance, that the implementation of remote learning was just weaker in those districts — districts with a higher share of students in poverty may have seen bigger declines in achievement, but the losses could have been similar in the high- and low-poverty schools in those districts.

By being able to look within districts, we were able to test whether the number of weeks of remote instruction had disproportionate impacts on high-poverty schools and minority students in those districts. Our answer was pretty clearly yes, there were bigger losses. And it wasn’t just because the urban districts had a harder time implementing remote instruction; even within those districts, the higher-poverty schools lost more.

You used the word “startling” to describe the learning loss. Were you expecting to see effects of this size?

We went in without any clear expectations on the magnitude of the impacts we would see. The reason why I called it startling was because I know that there are very few educational interventions that have ever been shown to generate a .45 standard deviation impact [a standard deviation is a common measure of how far scores spread around the average; effect sizes expressed in these units can be loosely converted into others, such as learning time or dollars spent] on achievement. Yet that’s the size of the loss that high-poverty schools that were remote for more than half the year sustained. So it was startling because when we compare the impact estimates of remote learning to the potential impact of the available interventions, it’s clear that there is no one thing that we could say, “If all districts did this and implemented it with fidelity, it would eliminate the gap in one year.” 

For instance: In a review of the pre-pandemic research, tutoring has been found to generate a gain of about .38 standard deviations. Well, you could provide a tutor to every single student in a high-poverty school that was remote for half the year and still not close the gap. You could get close, but you wouldn’t close that gap. And we know that districts are never going to be able to hire enough tutors to provide one to every student in a high-poverty school, let alone deliver that tutoring at the level of quality of the programs evaluated in the research. That’s why it was startling — not just because it conflicted with our prior expectations, but because when we saw it, we realized that we couldn’t come up with a long list of interventions that yield effects of this size.
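The arithmetic behind this comparison can be sketched in a few lines. This is only an illustration using the effect sizes quoted in the interview; the function and its assumption that intervention effects add linearly and reach every student are simplifications of mine, not the study’s:

```python
# Effect sizes (in standard deviations) quoted in the interview.
LOSS_HIGH_POVERTY_REMOTE = 0.45   # loss for high-poverty schools remote > half the year
EFFECTS = {
    "tutoring (pre-pandemic research)": 0.38,
    "double-dose math, one year": 0.20,
    "summer school": 0.10,
}

def remaining_gap(loss_sd, interventions):
    """Naively subtract each intervention's effect from the loss.

    Assumes effects add linearly and reach every student -- both
    optimistic simplifications, as Kane notes.
    """
    return max(loss_sd - sum(interventions.values()), 0.0)

# Even universal, high-quality tutoring leaves part of the gap open.
gap_after_tutoring = remaining_gap(
    LOSS_HIGH_POVERTY_REMOTE,
    {"tutoring": EFFECTS["tutoring (pre-pandemic research)"]},
)
print(f"Gap remaining after universal tutoring: {gap_after_tutoring:.2f} SD")
```

Even under those generous assumptions, tutoring alone leaves roughly .07 standard deviations unclosed — which is the point Kane is making about needing a package of interventions rather than any single one.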

So what can schools and districts realistically be expected to do in this situation? 

We can’t be thinking of this as a one-year catch-up. If we really are committed to making students whole and eliminating these losses, it’s going to be multiple years. There are other interventions that have been shown to have effects, it’s just that no single intervention gets you all the way. 

One example is double-dose math. There are studies finding that an extra period of math instruction over a whole year generates about .2 standard deviations. 

“You could provide a tutor to every single student in a high-poverty school that was remote for half the year and still not close the gap. You could get close, but you wouldn’t close that gap. And we know that districts are never going to be able to hire enough tutors to provide one to every student in a high-poverty school, let alone deliver that tutoring at the level of quality as these programs evaluated in the research.” 

So more districts should probably be thinking about something like that, especially in high-poverty schools. But like tutoring, increased math instruction requires staff; you can’t double the number of math classes students take without increasing the number of math teachers. Again, districts should be considering doing some of that, but there will be constraints on the scale at which they can implement it. 

Another possibility, which a lot of districts are already planning for, is summer school. There are studies suggesting positive impacts of summer school. But [the effects are] small. The big challenge with summer school is getting kids to attend regularly, because it’s viewed as optional learning time. That’s not a reason not to scale up summer school; it’s just that we shouldn’t think that doubling or even tripling the percentage of kids going to summer school is going to close these gaps. It’s not. You get a learning gain of about .1 standard deviations — around five weeks of learning — based on the pre-pandemic research.

One option that really hasn’t gotten much serious consideration, largely because of political pushback from parents and teachers, is extending the school year. If we extended the school year by five weeks over the next two years, that would obviously cover 10 weeks of instruction. I recognize that teachers would have to be paid more for that time. In fact, they ought to be paid something like time-and-a-half. But that’s the kind of option that I hope will gain attention once people realize the inadequacy of the steps that they’re currently considering, like small increases in summer school or tutoring a small percentage of students. It’ll become apparent that that’s just not enough, though my fear is that it may not become apparent in time. Based on what I’m seeing, most districts are going to find that students are still lagging far behind when they take their state tests in May 2023. The danger is that if they only discover that then, and only start planning more ambitious recovery plans then, much of the federal money will have been spent already. That’s why we’re trying to get the message out about the scale of the declines, and the likely scope of the efforts required to close them, while there’s still time to act. 

Districts are only required to spend 20 percent of their COVID relief funds on mitigating learning loss. But you and your co-authors created a formula to determine the financial cost of reversing this academic harm, and in many cases, that figure would basically demand every dollar allocated by Washington.

We try to put the scale of the [learning] losses and the amount of aid that districts have received on the same scale. We report both as a share of an annual school district budget, which I think is a useful starting point for thinking about what it’s going to cost a district to recover. If a district has lost the equivalent of, say, 22 weeks of instruction as a result of being remote, and you’re asking what it’s going to cost to make up for that, the lower bound of the estimated cost would have to start with [the question], “What does it cost to provide 22 weeks of instruction in a typical school year?”

The answer would be whatever share of a district’s typical annual budget is spent over 22 weeks. In the paper, we use a 40-week year, under the assumption that salaries are paid over 40 calendar weeks instead of just 36 instructional weeks. And then we put the amount of federal aid that districts got on that same scale — that is, as the share of a typical year’s budget each district received. We think that’s a useful starting point for people, and what they’d see is that in the high-poverty districts that were remote for more than half of 2021, the amount of aid they received is basically equivalent to — maybe a little more, but not much more than — the magnitude of their losses in terms of instructional weeks. That just means that, rather than spending the 20 percent minimum that was required in the American Rescue Plan, some districts should be thinking that they’ll need all of that aid for academic catch-up.
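The back-of-the-envelope calculation Kane describes can be written out directly. The 22-week loss and 40-week salary year are the figures from his example; the $100 million annual budget below is a hypothetical placeholder, not a number from the paper:

```python
def recovery_cost_share(weeks_lost, paid_weeks_per_year=40):
    """Lower-bound recovery cost as a share of one annual budget.

    Follows the paper's logic: making up N weeks of instruction costs
    at least what N weeks of a normal (salary-paying) year costs.
    """
    return weeks_lost / paid_weeks_per_year

# Example: a district that lost 22 weeks of instruction.
share = recovery_cost_share(22)     # 22 / 40 = 0.55 of an annual budget
annual_budget = 100_000_000         # hypothetical $100M district budget
print(f"Lower-bound recovery cost: {share:.0%} of budget "
      f"(~${share * annual_budget:,.0f})")
```

On those figures, a district that lost 22 weeks would need to devote roughly 55 percent of one annual budget to catch-up — far more than the 20 percent floor in the American Rescue Plan.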

I have to say, this conversation is leaving me pretty pessimistic that some of this lost ground can ever be fully recovered. Without asking you to look into a crystal ball, does that concern you as well?

Yes, but here’s a more hopeful spin: A friend of mine sent me a political ad for one of the gubernatorial candidates in Rhode Island, Helena Foulkes. She says, “I’m running for governor, and my top priority is restoring students’ achievement, and if I fail to restore achievement, I’m not going to run for reelection. Hold me accountable for whether we catch kids up.” 

I would hope more politicians take that pledge, and that the way to judge mayors and school board members and governors over the next couple of years is on whether they succeed in restoring students to their pre-pandemic levels of achievement. It would be that kind of accountability that would wake people up to the need for more aggressive action now. It’s one thing to read these reports about achievement losses nationally, but it’s another thing to see that your own schools, locally, followed exactly the pattern of this report. 

Most districts have seen the statistics from [the Oster paper] and know that their proficiency rates have declined by 10 or 15 percentage points. But that kind of statistic, as we’ve discussed, doesn’t really convey the severity. We’d like to provide districts with the tools to gauge the losses in the kinds of units — like standard deviations, or dollars spent, or weeks of instruction — that they could compare to the effect sizes of educational policies. That could make it easier for people to translate their local losses into a package of interventions of equivalent size. In a separate piece, I tried to put both the learning loss and the intervention effects into instructional weeks rather than standard deviation units to make it easier.

I think that needs to happen. Local decision makers need to see the scale of their students’ losses in ways that are more readily comparable to the expected effect sizes of the interventions they have to choose from. Once that sinks in, I think people will realize that more aggressive action is necessary. In the absence of that, it’s hard to blame local folks for not taking more aggressive action because they have no way to know that what they’re planning is nowhere near enough. It certainly sounds impressive to say, “We’re going to double our summer school enrollment and provide a tutor to 5 percent of the students in our schools.” All of that would reflect more than the catch-up effort in a typical school year, but it’s only when you compare those to the effect sizes for those interventions, and the magnitude of their losses, that you realize that it’s nowhere near enough. So we’ve got to make that lack of proportionality clearer to local decision makers, and not just in these national reports.

Another recent study using MAP data found that U.S. students had sustained as much academic damage from school closures as kids in Louisiana suffered after Hurricane Katrina. But after the storm, the whole New Orleans school district was fundamentally restructured, such that it’s now mostly composed of charters. What do you think of more drastic attempts to change the organization of schools and districts? 

Here’s one reason why this challenge is greater — and it’s actually related to the situation in Boston. I think that if people were confident that a state takeover would produce the big improvements that are necessary in Boston, there would be political will. The problem is the uncertainty: “If we take this very difficult step, is it going to produce the results we’re hoping for?” 

If some district said, “We’re a high-poverty district, and we were remote for more than half of 2021. What should we do?” I could list a few things they should be trying, but I couldn’t point to a package that would definitely close the gap because it’s an unprecedented gap. There is one thing I think could provide the hope and ammunition that would generate political will: We could organize for the next few months around a set of interventions to be launched in the spring of 2023 and then find a few places that would be willing to try that package of things. If we could evaluate those and generate some results early in the summer of 2023, we could then say, “Here is a set of interventions that, if you implement them, it’ll get you a long way toward closing the gap.” And I think we’d have an easier time persuading people to invest the political capital that’s needed.

So to anyone reading this interview: If there are districts or states that are willing to implement some really creative catch-up strategies next spring and want to contribute to an evidence base that the rest of the country can use, I want to work with you! Right now, there’s no package of efforts that I’d be confident would be enough to close the gap. Absent that, it’s no wonder that politicians aren’t willing to invest political capital in it. But if we had that, we could all get behind advocating for them. It would help everybody if a small set of districts would step forward and try to provide a model for the rest of the country to copy. 

“The way to judge mayors and school board members and governors over the next couple of years is on whether they succeed in restoring students to their pre-pandemic levels of achievement. It would be that kind of accountability that would wake people up to the need for more aggressive action now.”

The clock is ticking, and I think we’d have to do it next spring. I’m sure we could design a study and get results out quickly to people about the type of effort that would generate enough [learning] gains. But there shouldn’t be just one model we’re trying — there should be multiple approaches that we systematically try next spring, and ideally, one or two of them will prove to deliver the effects we need. And then we could organize advocacy around those.

So what comes next for your research in this area?

We’ve been working with a group of 14 districts that are giving us data on which kids they provided tutors to, which kids got double-dose math, and various other things over this past school year. We’ve been working with the NWEA data and hope to have a report out in August laying out the effect sizes that districts got for the interventions they attempted in 2021-22.
