Opportunity Wasted: Second-Round ESSA Plans Get Largely Lackluster Reviews From Independent Experts
States largely squandered the opportunity to create strong, innovative education plans through the Every Student Succeeds Act, a bipartisan group of independent reviewers found.
“This is really challenging work. No state has seemed to figure out how to do it well, across the board, for every student, in a comprehensive manner and in a sustained way over time,” Erika McConduit, president and CEO of the Urban League of Louisiana and one of the reviewers, told The 74.
The reviewers, a bipartisan group of more than 45 advocates, joined together in an effort led by the Collaborative for Student Success and Bellwether Education Partners. They rated each plan in nine areas, on a scale of 1 to 5.
Among the 34 states submitting plans in the second round, nine got a 5 in at least one category, and only Indiana received a 5 in two categories. Compare that to the first round, in which six of the 17 plans got a top rating in at least one category, with three states getting 5s in more than one category.
“We have an opportunity to do two things: focus on equity and focus on excellence. If we really, at a very high level, saw ESSA as an opportunity to do those two things, then this is really a win for states across the country. That’s the main question that should be asked,” McConduit said.
The second-round plans were not any stronger, despite the additional time to get input and write them ahead of the later September submission deadline, the reviewers found.
“We were hoping to see more from the states,” Chad Aldeman, a principal at Bellwether and one of the leaders of the review process, said.
The plans overall were confusing and lacked detail both on how the performance of vulnerable student subgroups, such as students of color and students in special education, will be included and on how all of the specific elements of school ratings come together into final scores, he said.
“You’d be really hard pressed to figure out what a state is actually doing in a lot of instances,” Aldeman said. Four to five experts, plus the Bellwether staff, reviewed every plan, and there were some “that all of us could not make heads-or-tails sense of,” he said.
The reviewers criticized California’s dashboard accountability system, for instance, as “complicated and incomplete.” It relies on a “color-coded, 25-square performance grid for each indicator that will create a number of arbitrary cut points between performance levels,” reviewers wrote.
The plan also doesn’t fully articulate how those indicators will combine to identify schools in need of support or what it will do about low performance, the reviewers said, giving the state a 1, the lowest rating, for how it will identify schools in need of support.
And on incorporating individual subgroups, reviewers dinged Florida for failing to include an indicator of English language proficiency in its school identification process and for not mentioning any accommodations or supports for the significant portion of the student body who are English language learners.
The Education Department at first took a more hands-on approach to its reviews, warning Delaware that its long-term goals weren’t ambitious enough, before being scolded by Senate education committee Chairman Lamar Alexander and backing off. It has since approved 16 of the 17 first-round plans filed this spring, with only Colorado’s still outstanding.
The Department of Education under Secretary Betsy DeVos issued a new state plan template in March that asked fewer questions, and “I think states ran with that and chose not to submit much information,” Aldeman said. Republicans in Congress also threw out Obama-era accountability regulations that would have required states to address many of the issues the reviewers raised.
Some state leaders could point to education plans beyond what they submitted to the federal government, but there’s no way to hold them to those pledges, Aldeman said.
“I can understand why states wouldn’t want to submit lots of details to the feds, because then they’ll be held accountable for that. The flip side for that is, then, for efforts like ours, they look bad,” he added.
The state plans reflect what was required as part of their technical submissions, and many are going beyond what they submitted to the federal government, said Kirsten Carr, senior program director of student expectations at the Council of Chief State School Officers. CCSSO represents the top state education officials, who wrote the plans.
“It’s hard to say they squandered an opportunity based on one review,” Carr said.
Several groups have put out reviews of ESSA plans in the months since the first round was submitted to the Education Department, and others have been more positive. The Thomas B. Fordham Institute focused on states’ use of student growth and clear ratings; 20 of the 51 plans got “good” or “great” ratings from that review.
State leaders will take “all critical feedback” under advisement and continue to improve, Carr said.
“Ultimately, the opinions that really matter the most are those from the educators, students, and advocates within their states,” she added. State education leaders were required to solicit public input on their plans and seek governors’ approval before submitting to the Education Department.
The nine categories the reviewers used for both rounds of plans were: goals, standards and assessments, indicators, academic progress, all students, identifying schools, supporting schools, exiting improvement status (improving enough to no longer need state interventions), and continuous improvement (learning from implementation and modifying the plan going forward).
Under ESSA, the bottom 5 percent of schools and high schools where fewer than two-thirds of students graduate on time will be identified for “comprehensive” support, while those where subgroups of students aren’t performing well will be identified for “targeted” support.
On the whole, ratings were lower for the second-round plans.
Among the second-round states, 13 got a 1 in at least one category. California, Idaho, and Texas got the lowest scores in two categories; Nebraska and Virginia got 1s in three categories; and Kansas got 1s in six of the nine categories.
Reviewers said Kansas’s plan overall doesn’t provide sufficient information and evidence to show how it will improve education in the state. Its accountability system “could be very confusing” and state officials should work to simplify it, the reviewers added.
And, hitting on an issue that was common among states, reviewers said Kansas’s school improvement plan was a concern.
“Kansas’ strategies for school improvement, particularly for schools that do not improve after identification, are very limited and carry significant risk that these schools could continue [to do poorly] without making dramatic changes commensurate with their needs,” the reviewers wrote.
Kansas has not yet received any feedback from the Education Department on areas it may need to address, Denise Kahler, director of communications and recognition programs at the Kansas State Department of Education, said via email.
“As a state, we set a very powerful vision for K-12 education two years ago that calls for a redesign of our current system. We are very focused on that work,” Kahler wrote.
Finally, the reviewers raised concerns that 20 states don鈥檛 include consequences for schools where fewer than the federally required 95 percent of students take annual state tests.
Parents, frustrated with the growth of standardized testing and the rollout of the Common Core State Standards, blocked their children from taking exams; the opt-out movement involved as many as 1 in 10 students in Colorado and 1 in 5 in New York. Testing participation became an issue during the drafting of ESSA, but lawmakers ultimately kept the requirement.
“Peer reviewers are wary that schools may return to a time when some students were intentionally not tested and their performance swept under the rug,” they said in a press release.
Carr, with CCSSO, said she hasn’t reviewed the plans in enough detail to know whether the count of 20 is accurate, adding that states may be addressing testing participation in ways not reflected in their plans.
“States are absolutely meeting the letter of the law, and it just may be reflected in different ways in their plan,” she said.
Despite their disappointment overall, reviewers were pleased that states are expanding accountability systems beyond the reading and math tests that make up the bulk of ESSA’s testing requirements.
States are including science, student attendance, and measures of college and career readiness in their ratings, according to a release.
Reviewers were also pleased with states鈥 use of year-to-year student growth on tests, rather than a static measure of proficiency. Advocates have long said a hard focus on proficiency, as was required under No Child Left Behind, unfairly penalized some schools, particularly those with high numbers of low-income students who were learning but not yet meeting academic benchmarks.
And though school improvement overall was a problem, much as it was in the first round, three states in particular did a good job in their plans: New York, Indiana, and Rhode Island, the reviewers said.
Rhode Island “was one of the better school improvement plans that I’ve read, in the first or second round,” McConduit said.
Rhode Island should be held up as an example, particularly for its use of a resource hub to share best practices for school improvement; a community advisory board that has real input, through the state education department, to guide school turnaround; five evidence-backed approaches to turnaround; and competitive grants that are open to schools that aren鈥檛 required to undertake turnaround work but want to improve.
“Nearly every component of its plan could be a model for others,” the reviewers said.
Andy Rotherham co-founded Bellwether Education Partners. He sits on The 74’s board of directors and serves as one of the site’s senior editors.