Study: State Report Cards Need Big Improvements in Tracking COVID Learning Loss
Thu, 05 Sep 2024

Most people who know me would probably say I'm a data and accountability advocate. I've written extensively about the role of accountability in promoting educational improvement. But I've also been critical of accountability, especially so-called public accountability organized around the idea that parents and advocates will use data on key student outcomes to pressure schools to improve.

When I partnered with the Center on Reinventing Public Education on a report reviewing how transparent state report cards are in reflecting COVID-19 learning loss and recovery, I came in with an open mind. I expected they would contain most of the information we sought and would be mostly usable. I was wrong. Everyone on our team was deeply disappointed by many of the state report card websites and their inability to answer our primary questions about the effects of COVID on student outcomes.




Here are four questions about these sites, drawn from written interviews our five analysts completed after rating the report cards, that we think states should consider moving forward.

Where Is the Data?

The high-level takeaway from our report: It is extremely difficult on most state report card websites to track longitudinal performance data at the school level going back to before COVID. There are a few exceptions: seven states (Connecticut, Delaware, Hawaii, Michigan, Oklahoma, Pennsylvania and Tennessee) earned an A for having this data available.

But even in many of these better-performing states, there were problems. Many state report cards make it difficult to do things that should be easy. Parents should be able to use the report cards to compare schools they are considering for their children, but in too many places, that is impossible. Advocates should be able to understand, at minimum, the performance of federally mandated student groups, such as children with disabilities and English learners, but many states completely bury these data. Further, report cards often lack other kinds of data that parents might want about available services, like advanced coursework, counseling, even sports and the arts. Overall, the reviewers were disappointed and disheartened.   

Are There Really No Best Practices?

We were struck by the variation across the 50 states and the District of Columbia. One reviewer commented, "It was as if 51 different contractors designed these report cards without so much as a single best practice about how they're supposed to look or function." Some states leaned on graphs, others on tables. Some websites were easy to navigate, while others were befuddling. Some made subgroup data easy to find; others made it nearly impossible. Some report card websites couldn't even easily be found through a Google search.

Our analysts also noted the difficulty of simply figuring out the basics of each site. "I was surprised with how different each state report card was and the amount of time it took to familiarize myself with it enough to find the data I was looking for," one wrote. I felt this acutely as I examined all 51 report cards. It sometimes took two or three 10- to 15-minute visits to feel like I understood the layout of some of the sites.

Overall, we felt that there surely must be some best practices for reporting these kinds of data that states could draw on to improve their report cards. We all wanted easily navigable sites (i.e., ones that made it clear where to click to find what you wanted) where 1) measures were described in clear language and organized thematically, and 2) users could manipulate the data to answer their most important questions. No site met this bar, though some, such as Idaho, Illinois, Indiana, New Mexico and Oklahoma, were far better than others; Alaska, Louisiana, New York and Vermont were among 11 states that earned the lowest grade for usability. There could be real value in researchers working with organizations like the Council of Chief State School Officers to lay out explicit design principles.

Who Is the Intended User?

State report cards are intended principally for parents. Realtors certainly think parents care about school quality; otherwise, they wouldn't name local elementary schools in their listings. The popularity of school-ratings sites proves that at least some demand for school performance data exists. However, if parents are the main intended audience for these reports, it sure doesn't seem that way. "I could see [parents] spending considerably more time on this compared to our research team," said one of our researchers. Another described the situation for parents as "frustrating and disempowering," echoing what the Data Quality Campaign found last year.

We felt that the report cards were perhaps trying to serve too many audiences and, in the end, not serving any very well. States need to think clearly about whom they're serving and redesign their report cards from the ground up, working with those groups to ensure usability. In particular, the language of the report cards needs to be clear for people who may not be experts in accountability terminology and education-related acronyms. Even with our levels of expertise, we were sometimes unclear about what different data points meant.

Are State Reports Doomed to Be a Compliance Exercise?

A few reviewers thought some state report cards seem like a compliance exercise: States post them because the federal government requires them to, but, ultimately, they're not concerned about whether these websites are usable. This is a somewhat cynical take, but it's hard not to feel that way after reviewing some of these sites.

But even if report card sites did start as compliance exercises, they can still serve a positive function in the long run. We don't want to be Pollyannaish about their potential, but parents clearly care about the effectiveness of the schools they choose for their children, and states clearly can do better at communicating schools' effectiveness.

We hope this review is a wake-up call for states to consider better reporting of school performance data. While private companies, like GreatSchools, can provide alternatives, states are missing an opportunity to shape parents' thinking about what matters for school effectiveness, and why. The failure of states to provide high-quality, usable report cards raises a fifth question: Given the importance of effective public education and the apparent need and demand for the data, how can states justify doing such a lousy job at informing parents?

In Push to Renew School Accountability, Feds Urge States to Keep Eye on Pandemic
Tue, 04 Jan 2022

Following a two-year pause, states must resume the process of pinpointing their lowest-performing schools and those with persistent achievement gaps, according to a recent draft of guidance from the U.S. Department of Education.

But bowing to uncertainty sparked by the pandemic, officials will allow one-year changes to the criteria states use to identify those schools. That means the report cards states use to communicate student performance to the public could look quite different.




To help measure COVID-19's impact, states might also choose to rate schools on how much instructional time students lost or break out chronic absenteeism by whether students were attending school in person or remotely. The department will collect comments on the 31-page document until Jan. 17.

"This gives a clear signal to the field and to the states that we are restarting accountability," said Jennifer Bell-Ellwanger, president and CEO of the Data Quality Campaign. "This data needs to be reported to families and the community."

The nonprofit is among the organizations that have been calling for more statewide data on student performance during the pandemic, even though standardized tests were canceled in 2019-20 and several states saw low turnout for testing last school year. Others say the department, by allowing such a vast array of changes, could leave parents and the public more in the dark about how well schools have performed.

The department is recommending that states and districts update improvement plans to focus on the pandemic鈥檚 effects on the most vulnerable students. States can also give those schools more time to improve by not counting the 2019-20 and 2020-21 school years, and can change the achievement targets they need to hit to be removed from the state鈥檚 lowest-performing list.

"Uncle Sam is saying that not only is it okay to move the goalposts, states can install new goalposts if they want to, too," said Dale Chu, a senior visiting fellow with the conservative Thomas B. Fordham Institute who helped implement Indiana's accountability system.

In addition to the temporary changes, the document encourages states to consider long-term additions, such as adding new indicators of student success that could endure beyond 2022.

'Behind-the-scenes tinkering'

Under the federal Every Student Succeeds Act, states must test students in reading, math and science and publicly identify their lowest-performing Title I schools and those where groups of students, such as English learners or students with disabilities, consistently underperform. Those schools, which receive extra funds to help students make progress, have up to four years to show improvement or face additional state intervention.

Maria Cammack, deputy superintendent of assessment, accountability, data systems and research at the Oklahoma State Department of Education, said state officials aren't talking about adding new measures of school quality for one year, but want to be as transparent as possible about data elements that can supplement the state's high-stakes accountability system.

"Everybody wants to understand unfinished learning," she said, adding that it can take a while for districts to report and interpret new information. "Any changes enacted for a single year break our ability to monitor change in performance in a time where we need to understand it most deeply."

Bibb Hubbard, president of the nonprofit Learning Heroes, said she appreciated the department's expectation that states include families in making decisions about changes to accountability. Parents, she said, rely on state report cards to understand their children's progress in school and "want the truth, even if it isn't good news." States, she added, should research which measures parents find most meaningful.

Chu added that it could be hard for the public to keep up with "all of the behind-the-scenes tinkering."

"If states add, modify [or] remove a bunch of indicators from their state report cards," he said, "it will be extremely difficult, if not impossible, to get an honest accounting of how schools and students have fared."

A key question for state leaders has been how to calculate whether schools have improved over the past few years in the absence of consecutive years of assessment data. Most states consider test score trends over multiple years as part of their accountability systems.

The guidance suggests states could replace the growth measure with a different indicator, like achievement gaps, but experts say such a change could significantly alter which schools are identified for improvement.

Growth is currently "by far the best" measure for differentiating between schools, said Cory Koedel, an economics and public policy professor at the University of Missouri-Columbia and an expert on growth measures. "I can't even name what a plausible back-up plan would be," he said.

Chris Janzer, the assistant director of accountability at the Michigan Department of Education, added that there's no guarantee state testing will run smoothly this year.

"We don't know what test participation is going to look like this coming spring, especially with Omicron raging now," he said. "Will we have another wave in the spring that causes more school disruptions?"

For 2021 testing, the department waived the requirement that states assess 95 percent of their students. But if schools fall short of that percentage in 2022, it's possible they would be identified as low-performing based on participation rates alone, said Janzer, whose state saw 70 percent participation last year.
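The 95 percent rule described above is a simple ratio test. As a minimal sketch (the function name, parameters and default threshold are assumptions for illustration, not an official ESSA or state implementation):

```python
# Hypothetical sketch of an ESSA-style 95 percent test-participation check.
# Names and the threshold default are illustrative assumptions only.
def meets_participation_requirement(tested: int, enrolled: int,
                                    threshold: float = 0.95) -> bool:
    """Return True if the share of enrolled students who were tested
    meets or exceeds the threshold."""
    if enrolled <= 0:
        raise ValueError("enrolled must be positive")
    return tested / enrolled >= threshold
```

By this arithmetic, a participation rate like Michigan's roughly 70 percent in 2021 would fall well short of a 95 percent bar.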

Oklahoma had an overall participation rate of over 90 percent last spring, Cammack said, but in some districts, only about 30 percent of students took state tests, despite an assessment window that was three weeks longer than normal and included extended hours and Saturday sessions.

'Meet the moment'

Stanford University scholar Linda Darling-Hammond, who serves as president of the California State Board of Education, acknowledged that 2022 probably won't be a "neat and tidy year" in the realm of testing and accountability.

But she is among those who see the guidance as a way to "lay a path toward reauthorization" of ESSA. The department, she said, is sending the message that accountability is important, but that states should also "meet the moment" and consider changes that allow more room for other measures of student achievement and school performance.

Bell-Ellwanger, with the Data Quality Campaign, said the guidance presents an opportunity to add criteria that some say are lacking from many state report cards, such as more data on what students do after high school.

In December, her organization and Chiefs for Change, a network of district leaders, issued a report arguing that K-12 leaders could better prepare students for college and the workplace if data on college enrollment, jobs and other postsecondary trends were more accessible.

"As states signal that they are moving forward with recovery," she said, "understanding college and career pathways and the economic mobility of students is important."

The department's guidance notes that states could also consider adding "opportunity to learn" standards, such as the extent to which students have access to qualified educators and a high-quality curriculum. A recent report from FutureEd, a think tank at Georgetown University, highlighted growing efforts to rate schools on questions of equity, which could range from whether students have access to advanced courses to whether schools have Black and Hispanic mental health providers on staff.

But the report noted that some measures might not be statistically valid and reliable enough for an accountability system that determines consequences for schools.

"They need to be predictive," said Thomas Toch, the director of FutureEd and co-author of the report. "They need to confidently signal how students are likely to perform in school and beyond."

But he added that just reporting data on some of those goals is still useful for the public even if they aren't used to identify schools for accountability. The department's guidance takes this "cautious stance," he said. "Transparency has the power to focus educators' and others' efforts, even when they don't face direct consequences for the information that's collected."

Many 2019-20 State Report Cards Lacked Chronic Absence, Graduation Data
Wed, 26 May 2021

States didn't have student performance data to report from the 2019-20 school year because tests were canceled. But many also left the public in the dark about how many days of school pupils missed or which students were less likely to graduate, according to a review of state report cards.

Nineteen states either failed to break down graduation rates by race and ethnicity or didn't report rates for groups such as special needs, low-income or homeless students, the Data Quality Campaign's review shows.

"When we talk about the drop in students enrolling in community college, or any postsecondary option, all of those data points start with high school graduation," said Paige Kowalski, executive vice president of the nonprofit organization. It's important, she added, to have evidence of the pandemic's higher toll on low-income and minority students. "We just had so little data about what was happening in schools."

The federally mandated report cards are the primary way states make complicated school-level education data accessible to parents and other members of the public. Each year, the nonprofit releases a review of which states it thinks are doing a good job of making the information easy to find and understand. Leaders say despite the pandemic's disruption in testing, and waivers from the federal government dropping accountability requirements, states could have used their report cards to give families more insight into why some data is missing or to report on students' access to at-home internet. But most didn't.

"Why do we continue to rely on Congress to tell us what we need to know?" Kowalski asked. "States did not use this as an opportunity to provide more information. They chose not to use it to shine a light."

According to "Show Me the Data," just nine states posted the percentage of students missing at least 10 percent of school last year. Another 26 reported chronic absence data from previous years or didn't specify the year.
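The 10 percent threshold behind those counts is a straightforward calculation. A minimal sketch, assuming a simple (days_absent, days_enrolled) record shape (the names here are illustrative, not any state's actual data model):

```python
# Illustrative sketch of the chronic-absence definition cited above:
# a student is chronically absent if they miss at least 10 percent of
# their enrolled school days. Names are hypothetical, not a state's API.
def is_chronically_absent(days_absent: int, days_enrolled: int) -> bool:
    """Return True if absences are at least 10% of enrolled days."""
    if days_enrolled <= 0:
        raise ValueError("days_enrolled must be positive")
    return days_absent / days_enrolled >= 0.10

def chronic_absence_rate(students: list[tuple[int, int]]) -> float:
    """Share of (days_absent, days_enrolled) records flagged chronic."""
    flagged = sum(is_chronically_absent(a, e) for a, e in students)
    return flagged / len(students)
```

In a typical 180-day school year, 18 absences would cross the threshold, which is what the school-level percentages on these report cards aggregate.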

Hedy Chang, director of Attendance Works, a research and advocacy organization, said she suspects many state officials felt their 2019-20 data wouldn't be accurate once students shifted to remote learning or that it wouldn't be comparable to previous years.

"Districts were really confused about whether to take attendance or not and how," she said, but added that states could have included an explanation rather than withhold the data entirely.

Almost half of the states didn't include all of the required data on educators, such as those teaching outside their field or those with emergency credentials. And many states still aren't reporting results for at least one group of students, which Kowalski said will be important in the future for tracking which groups of students at each school were most affected by the pandemic and distance learning.

For example, even if they collect it, 13 states still don't break down data by gender on report cards. Kowalski said teens in general have gone to work or picked up more responsibility at home to help families facing economic hardship. But, she added, it's not "a stretch" to assume older girls took on more additional household duties than boys and cared for younger siblings so their mothers could stay in the workforce.

Some states provide links on their report card websites that lead to additional information, but Kowalski said those looking for data shouldn't have to hunt for it.

"To ask anyone, let alone a parent, to dig through and Google multiple websites and cobble together a full, robust picture of what happened in a school is absurd," she said.

'A financial footprint'

A few states tried to put the results in context instead of cutting back. Pennsylvania and Iowa used their annual report cards to be upfront about what was missing, and why, or to point readers to waivers from the federal government showing which data wasn't required.

And North Dakota posted the number and percentages of students in virtual, hybrid or in-person learning, long before the U.S. Department of Education started its own tracker a year after the pandemic began.

Ross Roemmich, the North Dakota education department's director of management information, said all districts in the state use the same technology platform, PowerSchool, making the data collection easier.

"When COVID hit, everything changed," said Roemmich. "We knew that someday, the federal government would probably ask us which schools were face-to-face, hybrid or remote."

Kowalski noted that just because states couldn't hold schools accountable for results didn't excuse them from reporting other data mandated by law, including per-student spending for each school. Several states made progress on that requirement, which went into effect last year. Thirty-six states reported the spending data for 2019-20, up from 19 the previous year.

Last year's report cards likely won't provide a lot of information on how states are directing federal relief funds. But Marguerite Roza, director of the Edunomics Lab at Georgetown University, said going forward, the per-pupil spending data will help the public track whether districts direct funds toward schools serving the neediest students. "There will be a financial footprint to this," she said.

Kowalski said she'd like to see all states add information on students' access to devices and Wi-Fi to their report cards.

"We already knew devices were important," she said, but added that during the pandemic, a computer and a reliable connection "became school."
