assessments – America's Education News Source

NYC Parents Want Career Aptitude Assessments for All High Schoolers (Wed, 18 Feb 2026)

This article was originally published in Chalkbeat.

As New York City schools ramp up their focus on job readiness programs, a parent board overseeing high schools is calling on the Education Department to implement career aptitude assessments for all ninth and 11th graders.

“It helps with the ever-popular question of ‘What do you want to be when you grow up?’” said Lawrence Lee, one of the resolution’s sponsors. “It’s a big world with lots of different options and choices. I think many people look around and think their choices are only what they can see around them.”

New York City schools, like other schools across the state and nation, are increasingly focusing on career education. There are more than 130 career and technical schools plus over 260 career and technical programs offering internships, apprenticeships, and job-focused courses across the five boroughs. But often, students are left to navigate a complicated application process without guidance on how various programs, electives, internships, career and technical tracks, and postsecondary paths might align with long-term goals, the high school council board members said. They believe career aptitude assessments can help students reflect on their choices, improving how they select courses and work toward real-world goals.


“By 11th grade, those decisions directly affect college applications, workforce credentials, and financial planning. Rather than leave those moments to chance, these assessments can give students the agency to better understand their own talents and to see multiple futures for themselves,” said Deborah Alexander, one of the resolution’s sponsors.

Education Department officials said they will review the resolution, but added they currently use platforms that offer interactive career exploration activities and generate tailored career options based on students’ interests.

“This career planning is also embedded in 1:1 advising, ensuring each high schooler receives personalized support in mapping out their next steps,” Education Department spokesperson Isla Gething said in a statement.

The high school council members want students to take “developmentally appropriate, research-based” assessments in the fall of freshman year and the spring of junior year, saying it will help provide more guidance, especially for students from historically underserved communities and those learning English as a new language.

“Some students grow up surrounded by professionals who talk openly about their work and pathways, but many do not,” Alexander said. “That difference can shape who sees themselves as an engineer, a nurse, a filmmaker, an entrepreneur, or who never considers those possibilities at all.”

The online career assessment industry has exploded in recent years: Districts across the country use off-the-shelf advising tools from more than 20 companies, and many others use custom tech tools.

Some research suggests that career aptitude tools can help students better understand strengths that might otherwise not have been on their radar. Some experts suggest the tech tools can also help erode bias when it comes to career advice.

But evidence of how effective these tools are remains scarce, which is why the education research organization MDRC has embarked on a long-term analysis of two of the tech tools, expecting to release results in the summer. The tools offer schools a way to advise students without having to hire more counselors — doing deep dives into what kinds of careers fit a student’s aptitudes and personality as well as what kind of degree to pursue and potential salary ranges — but they often need to be paired with adult guidance, said Rachel Rosen, a senior research associate at MDRC.

“They’re not perfect,” Rosen said of the tools. “They are better if there is a teacher or an adult who will take the information and really work closely with the students on understanding how it can help them think creatively about what the tools are saying.”

While MDRC researchers don’t yet have definitive answers on whether the tool helped reduce bias, they did find that by the time students take the assessments, they already have some of their own assumptions about who they are and what kinds of careers they might do, Rosen said.

“They felt like they knew themselves better than the tool,” she said, and while the tools still had potential, “they need some good adult guidance to go with them.”

Chalkbeat is a nonprofit news site covering educational change in public schools. This story was originally published by Chalkbeat. Sign up for their newsletters.

California Changed the Way it Teaches Science. But Test Scores Remain Low (Wed, 10 Sep 2025)

A decade ago, California schools introduced a new K-12 science curriculum that was hands-on, interactive and designed to prepare students for the challenges of the 21st century. 

But since the state started testing students on the new Next Generation Science Standards in 2019, the first time California had ever assessed students in science, test scores have barely budged, with stark gaps among some groups of students.

“In large part, science has not been viewed as a priority. It’s been moved to the back burner,” said Jessica Sawko, education director at the research and advocacy organization Children Now, and former head of the state’s association of science teachers. “But science needs to be a priority. How will we prepare our kids to make sense of the world around them?”

In 2019, three years after most schools began teaching the new science curriculum, only 30% of students met the standard on the state exam. Last year, the number had inched up to only 30.7%.

Wide gaps exist among student groups. Among students whose parents graduated from college, 42% met the standard, compared to 17% of those whose parents never went beyond high school. Fewer than 21% of low-income students met the standard. Only 15% of Black students met the standard, compared to 61% of Asian students.

Delays and obstacles

There are a few reasons for the stagnant scores, experts said. Pandemic school closures set achievement back significantly in all subjects, but especially in science, because so much of the new science curriculum centers on hands-on projects, which were nearly impossible to conduct over Zoom.

And after the pandemic, schools focused their recovery efforts on literacy, math and attendance, the most glaring challenges as students returned to in-person learning. Chronic absenteeism, for example, soared from 10% pre-pandemic to 30% in 2022. 

Another reason for the low science scores is accountability, Sawko and others said. For the first few years of the new science test, the scores were not posted on the state’s School Dashboard — the primary means of publicizing students’ academic performance. The rationale was that the test was new and the state was still working out the kinks.

Last year, the results were posted at the bottom of the Dashboard in an area marked “informational purposes.” Unlike the other features of the dashboard, such as math and English language arts scores, science was not color coded to indicate the performance level of individual schools or student groups. The science results were solid gray.

When the new scores are released this fall, science will be color-coded on the Dashboard, but science still falls short of full accountability, advocates said. Low-performing schools won’t be singled out by the state for extra assistance, although that might change next year.

Another obstacle has been teacher training. After California adopted the new standards, it didn’t invest any money in professional development until 2023. For many years, districts used their own funds or found private grants to pay for teacher training, but by fall 2020 at least 30%-40% of teachers had received no training in the new standards, according to a report by the California Association of Science Educators. Teachers at low-income and rural schools received the least training.

In 2023 the state allotted $85 million to improve math, science and computer science education, but only about $1.5 million went to train teachers in science. The rest went to train teachers in math and computer science – which also recently got new standards – and to host family STEM nights and other activities. The money went to county offices of education to distribute locally. 

The grant expires in 2027, and it’s crucial that the state continue that investment, said Shari Staub, co-leader of the California Math, Science and Computer Science Partnership.

“We are daily faced with public health challenges, climate challenges, equity challenges — all the things a scientifically literate population should be able to address, not just for California but for the world,” Staub said. “If we’re not investing in science, we are not preparing students for the world they are entering.”

Three-dimensional learning

The Next Generation Science Standards were created in 2011 by an education nonprofit called Achieve, with help from 26 states and dozens of science education experts. The idea was to make science more engaging and “three-dimensional,” as the authors put it, by combining concepts from multiple scientific disciplines so students could discover patterns and systems. Students would gain critical thinking skills and a solid understanding of scientific concepts, largely by doing hands-on projects rather than listening to lectures.

Many school districts in California have embraced the new standards and seen scores improve. In fact, California public schools — particularly those in tech hubs — have some of the top science programs in the country. California students routinely win the National Science Bowl, Science Olympiad and other national competitions. 

For the most part, those districts invested their own funds early in the rollout to train their teachers. And they have strong support from parents, financial and otherwise. That support includes PTA funds that teachers can use to pay for science field trips or extra help in the classroom, plenty of parent volunteers and an overall expectation that science education is a priority.

None of the top-performing schools were Title I low-income schools, but they weren’t all homogeneous affluent schools, either. Some had 25% or more low-income students, large percentages of English learners and diverse student populations. They might have PTA support, but they don’t receive much extra money from the state because they don’t have large numbers of high-needs students.

La Cañada Unified near Pasadena, for example, received only $13,700 per student last year from the state, about $5,000 less than the state average. But more than 77% of students met or exceeded the science standards last year, some of the highest scores in the state. 

Each elementary school in the district has a science lab and an aide to assist with science projects. A summer camp called “STEM-nauts” pairs older students with younger ones for science-themed games and experiments. The high school offers five Advanced Placement science classes and a host of science-related extracurricular activities, including an astronomy club, neuroscience club and chemistry club. Students can do internships at the NASA Jet Propulsion Laboratory, which is a quarter-mile from the high school.

“In our district, the science kids are the cool kids,” said James Cartnal, assistant superintendent. “Science is part of the culture here. We work intentionally and very hard to make it that way.”

‘Think like scientists’

At Lawson Middle School in Cupertino, science is nearly everyone’s favorite subject. The science classrooms are boisterous places with students conducting experiments and trying to figure out solutions. The shelves are well stocked with beakers, scales and microscopes. Colorful tapestries of the periodic table hang from the ceiling. Anime renditions of the elements — including xenon, helium, germanium, cadmium — adorn the walls.

One recent afternoon, students in Emily Adams’ eighth grade science class did a lesson on measurements. Adams started by asking them why accurate measurements are important. Their answers: so astronauts know how much fuel is left in their rocketship; so truck drivers know if their vehicle will fit under an overpass; and so doctors know how much medicine they’re giving a patient.

Then they worked in groups to measure various objects, using an infrared thermometer, an electronic scale and other tools.

“This class is fun. I like all the labs, figuring out how things work in the real world,” said student Neil Dhaman. “P.E. is my favorite class, but this is second.”

Adams said the class was typical, in that she spends about 10 minutes explaining a few main concepts and the students spend the rest of the class on projects related to the concepts. “I want them to focus on skills and critical thinking, not just regurgitate facts,” Adams said. “I want them to think like scientists.”

Cupertino is in the heart of Silicon Valley, home to the Apple computer headquarters and dozens of tech start-ups. Google and Facebook are a few miles away. Despite the lure of six-figure salaries in Silicon Valley, Cupertino Union School District has very little turnover among science teachers, a key reason the science scores are so high, said Marie Crawford, the district’s director of instructional leadership and intervention. 

“The teachers know each other, work together, help each other out,” Crawford said. “It makes a big difference.”

Like La Cañada, Cupertino Union School District does not receive a lot of money from the state. Last year, the state provided $16,400 per student, far below the state average.

In teacher Maryhien Pham’s class, eighth grader Aanya Dhar and her classmates demonstrated how to find the volume of a marble by dropping it into a graduated cylinder of water and reading the water level before and after. The answer: 3 milliliters.

“I might want to be a scientist when I grow up,” Dhar said. “I like learning about new things, experimenting, getting to know how things work.”

This article was republished under license.

Texas Students Make Gains in Reading but Struggle with Math, STAAR Scores Show (Sun, 22 Jun 2025)

This article was originally published in The Texas Tribune.

Texas’ students saw some wins in reading but continued to struggle to bounce back from pandemic-related learning losses in math, state testing results released Tuesday showed.

Elementary students who took the State of Texas Assessments of Academic Readiness exam this year made the biggest gains in reading across grade levels. Third graders saw a three percentage point increase in reading, a milestone because early literacy is a strong indicator of future academic success. Progress among middle school students in the subject, meanwhile, slowed.

“These results are encouraging and reflect the impact of the strategic supports we’ve implemented in recent years,” said Texas Education Agency Commissioner Mike Morath. “We are seeing meaningful signs of academic recovery and progress.”

This year’s third grade test takers have benefited from state investments in early literacy in recent years. Teachers in their classrooms have completed state-led training in early literacy instruction, known as reading academies. The state also expanded pre-K access and enrollment in 2019.

Morath did acknowledge students needed more help to make similar gains in math. Five years after pandemic-related school closures, students are still struggling to catch up in that subject, the results showed. About 43% of students met grade-level standards for math, a 2 percentage point increase from the previous year, but still shy of the 50% reached in 2019.

Low performance in math can effectively shut students out of high-paying, in-demand STEM careers. Economic leaders have been sounding the alarm about the implications that weak math skills can have on the state’s future workforce pipeline.

The STAAR exam tests all Texas public school students in third through eighth grade in math and reading. A science test is also administered to fifth and eighth graders, as well as a social studies test for eighth graders. Science performance improved among fifth and eighth graders by 3 and 4 percentage points, respectively, but students in those grades are still below where they were before the pandemic.

Students in special education also made small gains. English learners, meanwhile, saw drops in every subject but science: a 4% decrease in reading, a 2% decrease in math, and a 2% decrease in social studies.

The test scores give families a snapshot of how Texas students are learning. School accountability ratings — which the Texas Education Agency gives out to each district and campus on an A through F scale as a score for their performance — are also largely based on how students do on the standardized tests.

The test often casts a shadow over classrooms at the end of the year, with teachers across the state saying they lose weeks of valuable instructional time preparing children to take the test. Some parents have also raised concerns, saying the hours-long, end-of-year test puts enormous pressure on their kids.

A bill that would have scrapped the STAAR test died in the last days of the 2025 legislative session. Both Republican and Democratic legislators expressed a desire to overhaul STAAR, but in the end, the House and Senate could not align on what they wanted out of an alternative test.

Legislators this session did approve a sweeping school finance package that included academic intervention for students who are struggling before they first take their STAAR test in third grade. The package also requires that teachers get training in math instruction, mirroring existing literacy training mandates.

Parents can look up their students’ test results online.

Graphics by Edison Wu

This article originally appeared in The Texas Tribune, a member-supported, nonpartisan newsroom informing and engaging Texans on state politics and policy. Learn more at texastribune.org.

Opinion: Advice for Districts: Don’t Give More Tests — Give the Right Tests (Wed, 16 Apr 2025)

Educators are buried under a mountain of tests. While state-mandated exams often take the blame, the real culprit is the growing pile of district-mandated assessments layered on top of school-administered exams. School system leaders hear the same concern again and again: Teachers spend too much time administering assessments that, while often adopted with best intentions, don’t provide enough value.

Through our work with school districts such as Madison, Wisconsin, and Syracuse, New York, and states including Indiana and Louisiana, we have had a front-row seat to the challenges and opportunities in assessment strategy. We’ve seen what works, what doesn’t and what it takes to design a system that serves students and teachers. Too few districts actually know what they are trying to accomplish with all the tests they administer.

Districts should consider three issues in addressing assessment overload:

  • Test volume: Especially in grades K-8, teachers spend too much time preparing for and administering tests, while students lose precious classroom hours — as many as 100 per year — taking redundant exams instead of engaging in meaningful learning. Excessive testing exhausts students and frustrates teachers without always giving them what they need most: insights they can use to improve learning.
  • Usefulness of test reports: Most district-mandated assessments are off-the-shelf products that deliver results quickly but not necessarily usefully. Districts, teachers and families rely on these tests in good faith, only to receive data that compare students to one another (think percentiles) rather than to the grade-level standards they need to master.
  • Incoherence: To boost student outcomes, districts often add tests without retiring others. Leaders of various central office departments — special education, literacy, multilingual learning and the like — procure their own exams, without coordinating to consider “two for one” opportunities. The result is a tangled mess of assessments that overlap, confuse and overwhelm. In some districts, we’ve seen as many as 15 assessments in play, with each serving a different purpose. 

Although the problem is layered, the solution is straightforward: Districts need fewer, more instructionally useful assessments. A strategic approach can transform how schools measure progress, decrease costs and stress, and help students and teachers focus on what matters: learning. 

In our organizations’ work helping states and school systems use more effective assessments, we’ve seen district leaders make great decisions that resulted in more streamlined exams. (Together, we’ve published a guide to help other districts through a similar process.) We recommend that every district take these four actions:

Build a unified leadership team. Districts must bridge internal divisions among departments. A strong assessment redesign team should involve curriculum leaders, testing experts and specialists in multilingual and special education (at minimum) to establish the purpose and guiding principles for assessment planning, asking how exams contribute to and help measure progress toward achieving the district’s broader vision for learning.

Audit and streamline tests. Districts must scrutinize every exam: What is its purpose? Does it deliver insights that educators can use to plan their next moves with students? Which truly help teachers teach, and which are just filling up time? By focusing on fewer but higher-quality assessments, districts can reclaim valuable instructional time and ensure that every test adds value for teachers and students. ANet’s assessment audit across Louisiana revealed that seventh-graders were losing up to 22 instructional days per year due to a bloated assessment system. Post-audit, 15 Louisiana districts reclaimed an average of five days of school per year.

Engage educators in the redesign. Teachers bring a critical perspective to assessment selection and use. Districts should bring educators into the process early and often, seeking their insights on which exams work, which don’t and how testing can be improved. In Syracuse, the district’s leadership team convened a committee of teachers and principals who reviewed the nearly 70 local assessments for K-8. With this educator input, the district eliminated many duplicative assessments and clarified the purpose and use of data from others. 

Communicate the new approach. If educators understand why certain tests were removed and which remain, they’ll get on board, and when teachers are invested, students benefit. We recommend first cultivating the support of a team of influential educators and community leaders. In our work with districts across multiple states, there was a clear trend: Districts that engaged parents and teachers early — explaining the “why” behind changes — saw higher buy-in and smoother implementation​. After a well-communicated assessment redesign process in Madison, 97% of school leaders supported the district’s vision for the role of assessments, up from 44%. 

Exams don’t have to be a burden. By committing to fewer, more purposeful assessments, districts can lighten the load on educators and sharpen their focus on student outcomes systemwide. We’ve seen districts successfully transform their approach to assessment and witnessed the pain points in districts that have not yet done this critical work. 

The solution isn’t more tests, it’s the right tests. That’s how to give teachers the insights they need and students the learning they deserve. 

Opinion: Whatever Changes the Feds Make, They Must Keep Requiring Annual State Exams (Tue, 04 Mar 2025)

Recent national and international assessments demonstrate that American student achievement is in steep decline.

Results from the 2024 National Assessment of Educational Progress (NAEP) showed that only a third of students are reading at grade level. On the Trends in International Mathematics and Science Study (TIMSS), an international assessment of math skills in 64 countries, American math achievement dropped 18 points for fourth graders and 27 points for eighth graders between 2019 and 2024. In both grades, American students were outperformed by peers in China, Japan, Singapore, South Korea and many European nations.

Lawmakers need to take action to drastically improve student outcomes, and President Donald Trump’s promises to put parents in the driver’s seat and ensure states are in control of their education policymaking could be good steps in that direction. But a few federal K-12 education policies are mission-critical and should remain in place to fuel this effort. 


One is the federal requirement that all states administer annual tests that measure learning for every student in third to eighth grades and once in high school. This critical backstop protects states from powerful special-interest groups seeking to eliminate the transparent information about student achievement that state tests provide.       

Massachusetts’ November election results demonstrate the power of these groups. The Massachusetts Teachers’ Association reportedly contributed over $7 million to the campaign to eliminate the Massachusetts Comprehensive Assessment System (MCAS), which measures 10th graders’ knowledge of English, math and science as a graduation requirement. 

Although most students pass the MCAS on their first attempt, the union pointed to the achievement gap among groups of students surfaced by test results as a reason to eliminate it. In November, voters approved a ballot measure to eliminate the MCAS as a graduation requirement, effectively weakening high school diplomas for all students in the commonwealth.

Unlike report cards and observations, which are subjective, statewide assessments are the only source of objective and comparable information about student performance. These exams provide policymakers and the American public with important insights on America’s readiness as a nation to meet the challenges and opportunities of the 21st century. 

These assessments also supply parents with transparent information about how well their child is being served. State tests provide apples-to-apples comparisons about the performance of a school relative to others – information that is essential for enabling families to make informed decisions about what’s best for their children.

Arguments against testing often focus on the ways in which educators respond to assessments by narrowing the curriculum, but those issues point to a lack of instructional leadership, which is not resolved by eliminating a test. Others complain about the inability of annual state tests to provide timely data to help inform day-to-day instruction. While very important, this is not the purpose of yearly assessments. Rather, a continuum of tests, including benchmarking exams and daily knowledge checks, ought to be used to inform school- and classroom-level instruction.

Finally, there are those who simply don’t like the results of the assessments and seek to eliminate them rather than using them to ensure learning for all students. This is a little like blaming a thermometer for a fever. As a nation, America cannot afford to hide from the truth. The nation’s education system needs to improve, and assessments are the way to measure progress. 

Without statewide assessments, parents, educators and policymakers lose access to clear, comparable information about student performance. This will not prepare children better; it will hurt them. It will not empower parents to make informed choices about their children’s education, but rather obscure critical information. The federal requirement for states to administer annual assessments provides important cover against special interests’ efforts to eliminate transparency.

Now more than ever, all students should have access to an education that will prepare them for the 21st century. As the Trump administration works to connect the dots among education, the workforce and the economy, it can empower state leaders and parents by continuing the federal requirement for statewide annual assessments. This federal role is the best way to protect systems from special interest groups and ensure policymakers, parents and the American public have the clear, transparent, meaningful data they need about how well students are learning.

Opinion: Reading Tests Are Out of Step with Reality. There’s a Better Way. (Tue, 25 Feb 2025)

American teachers and students are captives of a broken assessment system.

Interim reading assessments frustrate teachers and students and devalue what students are learning, even though they’re intended to provide useful information about student progress and help teachers target instruction throughout the year. They have not moved the needle on reading proficiency or on reducing inequities, as new test results confirm.

Today, we’re issuing a clarion call to assessment stakeholders at all levels: Do better for teachers, so they can do better for students.


Get stories like this delivered straight to your inbox. Sign up for Ӱ Newsletter


Right now, periodic reading tests prompt students to “find the main idea” or identify a “point of view” — discrete standards and skills that don’t add up to reading comprehension. They are misaligned with the research on how kids learn to read well and ignore the foundational role of knowledge in reading comprehension. Reading is a meaning-making endeavor, and comprehension is an outcome that occurs when readers apply a dynamic set of reading processes and knowledge to a text.

But that’s not what we’re measuring. Consider this fourth-grade Reading Standard 3 for literature: 

Describe in depth a character, setting, or event in a story or drama, drawing on specific details in the text (e.g., a character’s thoughts, words, or actions).

Students could miss a test item tied to this standard because of weak decoding skills, insufficient vocabulary, difficulties parsing syntax or transitions, or insufficient background knowledge. Often, it’s a combination of these factors, not misunderstanding the standard itself, that contributes to a wrong answer. But the interim assessments we give students today can’t identify what went wrong.

Reporting test results by standards, strategies, genre or any single construct confuses cause and effect. Answering a question based on a standard is an effect of comprehension, not a cause. And a student’s response to any one question tied to a standard does not predict how well that student will do on a similar question using a different text.

It’s time to transform. Few schools — or teachers — will move to text-focused classrooms and abandon using standards as the organizing force for daily lessons if the assessments they’re provided use an outdated, ineffective approach. It’s a vicious and damaging cycle. There’s a better way.

Transforming Assessment Questions and Classroom Conversations 

We need new assessments that reflect the research base and diagnose the degree to which actual reading comprehension is occurring. 

Assessments should focus students on the most challenging sections of a text and pose questions that can determine whether students navigated the passage for meaning. Questions also should address what world knowledge can be learned from reading the text carefully. And questions should focus on challenging vocabulary or phrases to see if students understand the contributions that vocabulary makes to meaning. Only then should tests feature standards-based questions that fit the text to determine if students’ comprehension reflects the depth and complexity called for by the standards. (For an example of this approach, see the Case Study.)

Such assessments would provide more meaningful information and play a more powerful role in the classroom. Rather than issuing reports on mastery of this or that standard, assessment developers need to release their passages and items in full, along with guidance on how to discuss the results with students. Then teachers could use interim assessments to deconstruct student thinking in class, by revisiting reading assessment texts and asking students to share their thoughts, passage by passage, about each question they encountered and explain why they answered questions as they did. 

This is a low-tech, labor-intensive, and high-impact way to use interim data to inform instruction. We learn from our mistakes, and in the case of comprehension questions, the richest discoveries will come not from asking which items students missed, but by asking why. Students can go astray for a variety of reasons, and the best way to identify the path they followed, or where comprehension broke down, is to ask them what they were thinking. The challenges any text presents will vary, but the number and types of obstacles are not infinite. As obstacles are revealed, teachers — and eventually, students — can lead discussions that explore how best to overcome them. This collaborative approach enhances comprehension for all students, expanding their understanding by recognizing how ideas, language, and vocabulary interact with knowledge to make meaning. 

Deconstructing assessments with students connects instruction directly to the science of reading comprehension rather than treating reading as a disjointed series of atomized elements. Teachers might find that what they are already doing to support students’ reading comprehension is on the right track, but they need to do more of it, or some areas require less attention. Over time, teachers and students will recognize the nature of the various obstacles that complex text presents and how these can be addressed. In other words, assessments can do what is intended of them: inform instruction. 

Teachers face a learning curve, and these candid, text-driven conversations take time to do well. However, it is hard to imagine a more powerful way for teachers to support students in learning about texts, probing their thinking, tackling common challenges, deepening comprehension, and exploring the suite of constructs known as literacy. 

Contextualizing Assessments Is Key

An even more enduring and essential reform is to ensure tests actually measure what students are learning. Better interim reading assessments, then, would not only reflect the science of reading comprehension but also be grounded in curriculum and connected to the books and topics students study in class.

This vision rejects the false premise that reading comprehension is a content-neutral skill that can be taught and tested in the abstract. Rather than asking students to address items tied to random passages they may not know anything about, a contextualized approach to reading assessment would offer a multidimensional view of students’ reading comprehension. It would be more fair, authentic and equitable, and would more accurately mirror the literacy tasks students will encounter after graduation.

It’s time to invest genuine energy and resources into creating interim assessments that provide actionable insights and align with research and the real world. Current assessments are standards-specific and knowledge-agnostic — the inverse of what research and experience tell us teachers and students need. This approach is a closed loop that is steering teachers and students off-course. 

Rather than assessing frequently, studying the error patterns in data meetings, mapping those errors onto matching discrete skills or standards, isolating those standards, and instructing teachers to repurpose reading into relentless practice of those standards, interim assessments, whether created by assessment providers or curriculum publishers, simply must focus on the real and varied causes of breakdowns in comprehension.

Developers need to revamp their tests to tackle the challenges inherent in content-rich text. They need to abandon the practice of reporting by state standards, strategies or any other atomized element. They need to release items that allow teachers and students to thoroughly analyze and comprehend what students are learning. 

Designing the right tests will empower and incentivize the right teaching and make reading tests genuinely valuable to educators and students. The responsibility and power rests with interim assessment providers and publishers, as well as the state and local leaders who procure them. Test developers, hear our call: We need an interim assessment do-over.

Susan Pimentel is co-founder of StandardsWork, a nonprofit education consultancy that sponsors the Knowledge Matters Campaign. She was the lead author of the Common Core State Standards for English/language arts literacy and led development of the Knowledge Matters Review Tool. 

David Liben has worked with schools and districts nationwide to improve student learning for over 20 years.  He is the former principal of a high-performing school in Harlem and is the co-author of two highly acclaimed books on reading.

Are Students Gaining Ground in Math and Reading? Not Very Much …
Tue, 10 Dec 2024

How did U.S. students fare academically last year?

There are three different sources of information to answer that question. Two of them show that students made little or no gains last year, and the third, NAEP, will come out in early 2025 and provide the final word.

The first results came from interim benchmark assessments like NWEA’s MAP Growth and Curriculum Associates’ i-Ready. Combined, they test millions of students several times a year, so think of them as the canary in the coal mine. Although they found slightly different trends across subjects and grade levels, they showed that students made little progress in math and may have even declined in English Language Arts.


The interim assessments are voluntary, and they don’t break out the results by state, district or school. So the next piece of evidence comes from the tests that states administer each spring, and those results have been slowly trickling out. Now, one research team has organized that data, and as of the end of November, it had grade- and subject-level results for 39 states and the District of Columbia.

The states are painting a slightly more optimistic picture than what the interim assessments showed, but just barely. For example, the median state reported a one-point increase in the percentage of 8th graders who were proficient in math. States reported similarly small gains across grades and subjects, with the exception of 8th grade English Language Arts, which declined by 0.2 points.

To put it bluntly, these small gains are not enough to get kids back up to their achievement levels prior to the pandemic. And, with ESSER funds expiring earlier this year, there’s not a lot of fuel left to help students get back on track. 

The table below shows the state-level results in 8th grade math. Readers should take those with a grain of salt. For example, Oklahoma and at least one other state reported double-digit increases, but those are largely due to leaders in those states lowering standards.

You can also see some missing data in the table. Some states haven’t released their results by grade level, as they are required to by federal law. And as Dale Chu has noted, 10 states are out of compliance with federal law with respect to how scores are reported, and 13 are not reporting what percentage of students actually took the tests.

Some states have been putting up modest gains for the past few years. In 8th grade math, for example, 10 states—Alabama, Connecticut, Kansas, Massachusetts, Mississippi, Montana, New Hampshire, North Carolina, Rhode Island, and Virginia—have all increased proficiency rates by more than 1 point a year for multiple years in a row. Other states have shown little to no progress from their pre-pandemic lows, notably Arizona, Colorado, Kentucky, Louisiana, Maryland, Michigan, Minnesota, Oregon, South Carolina, Texas, Wyoming, and the District of Columbia. 

To know for certain which of these gains are real, and which ones are artificially inflated, we’ll have to see the third set of data, the NAEP results that are scheduled to come out early next year. Given that they use one common yardstick across the country, those should provide the final verdict on these early recovery years. Judging by what we’ve seen from the first two sources, we shouldn’t hope for much more than a very slight uptick nationally. 

Disclosure: Chad Aldeman works with NWEA and the Collaborative for Student Success. 

Texas Children Still Struggle in Math Post-Pandemic, Schools Try New Approaches
Wed, 14 Feb 2024

This article was originally published in The Texas Tribune.

DALLAS — In Eran McGowan’s math class, students try to teach each other.

If a student is brave enough to share how they solved a math problem, they stand up in front of the other third graders and say, “All eyes on me.” The classroom responds, “All eyes on you,” and the student explains how they did it.

This collaborative method of learning math is part of a new curriculum, Eureka Math, that was launched in the Dallas Independent School District this school year. It emphasizes helping students better grasp mathematical concepts rather than their performance on the state’s standardized test. The new curriculum is described as a step away from memorization.

The new curriculum “moves away from using tests as a way to measure success,” said McGowan, who teaches at the Eddie Bernice Johnson STEM Academy. “It’s more focused on the kids understanding the concept, and in turn, that will help a child pass assessments.”


While the teaching approach is different, the intent ultimately continues to be helping students do better on the math portion of the State of Texas Assessments of Academic Readiness. Last summer’s STAAR results showed that Texas students have still not caught up to the math scores they had in 2019, before the COVID-19 pandemic hit. Forty-five percent of students who took math in third through eighth grade or Algebra I last year passed the STAAR test. While their math scores represent a slight increase from the previous year, they are still 7 percentage points behind the state average in 2019.

What’s more, the number of students who went above and beyond and “mastered” the subject has not recovered since the pandemic. In 2023, 19% of all Texas students mastered math at their grade level, down from 26% in 2019. While Texas students’ overall math scores last year were four points higher than the national average, the percentage of students who master math in the state is significantly behind the national average of 38%, according to the Nation’s Report Card, which samples fourth- and eighth-grade students’ reading and math grades across the country.

Policymakers and educators worry that the low number of students who master math will mean not enough Texans will have the skills to meet the demands of the most lucrative, in-demand jobs in the next few decades. They fear Texas will not be able to produce its own workforce and will be forced to look for talent elsewhere. According to a Stanford University study, students who do not bring their math scores back up to pre-pandemic levels will earn 5.6% less over the course of their lives than students with better grades just before the pandemic hit.

“Is our inability to get kids back towards this increased level of mastery — for math — going to limit them in the long run for the types of jobs that you’re going to be able to access, or even feel like they can access, in the future?” said Gabe Grantham, a K-12 policy analyst at Texas 2036, a public policy think tank. “If we don’t do anything about this at the state level in 2025, we’re going to be behind the ball.”

Texas won’t know how well Eureka Math is working until later in the year, when the next STAAR results are released, but there is optimism. About 400 other Texas school districts, both private and public, are using the curriculum. Across the country, districts that have adopted the curriculum have seen scores improve. Dallas ISD piloted the program at Anson Jones Elementary before adopting it districtwide and found that students’ math scores and confidence in their handling of the subject went up.

The Texas Legislature has also taken steps to make it easier for students to advance in their math studies. Lawmakers last year passed a law that automatically promotes middle schoolers to a higher math class if they do well at a lower level.

The law’s author, a Republican state senator from Conroe, said having students perform at a high level in math will increase their lifetime earnings and contribute to a healthy Texas economy. Lawmakers, policy analysts and public education officials are looking for other ways to help students bring up their math scores ahead of the 2025 legislative session, he said.

Grantham said Texas is behind other states when it comes to math reform at the legislative level, but it’s better to design policies based on data and a careful review of what’s working and what’s not.

“We don’t want to throw things at the wall and see what sticks,” he said. “Everyone wants the same silver bullet, but we’re trying to parse out what that actually looks like.”

For now, Texas is betting on laws passed over the last couple of years to help struggling students, such as mandated tutoring and, more recently, a law that makes it easier for teachers and districts to have access to “high-quality” instructional materials. Texas education experts and school administrators believe both policies are promising, though they say staffing shortages have made it difficult to comply with mandatory tutoring.

Teaching challenges

When the pandemic forced Texas schools to close and shift to virtual learning, STAAR scores plummeted to lows not seen in a decade.

Schools and families weren’t ready for the change. Some children didn’t have internet access or computers at home; others were completely absent. Academic achievement in both reading and math took a hit.

Four years later, reading scores have surpassed pre-pandemic levels but students are still struggling with math.

“The pandemic was just such a large-scale interruption, one that our system didn’t really know how to engage with,” said Carlos Nicolas Gómez, an assistant professor of STEM Education at UT-Austin. “And due to that, even coming back, we’re still dealing with the interruption.”

Gómez and Grantham said students have recovered faster in reading because they can practice it at home much more easily than math.

“Reading, it’s a lot easier for parents to read to their kids at home,” Grantham said. “Math is going to take a lot more direct instruction. That was just lost when kids were out of school.”

When kids came back to the classroom, many didn’t have a grasp of mathematical concepts they should’ve learned in previous years, said Umoja Turner, principal of the Eddie Bernice Johnson STEM Academy.

It fell on teachers to come up with learning plans that incorporated the concepts students are supposed to learn at each grade level, plus fill in the gaps in learning caused by the pandemic.

But Michelle Rinehart, superintendent of the Alpine Independent School District, said the state’s teacher shortage crisis and the departure of experienced teachers from schools have made it difficult to help students catch up. Only two out of her seven math teachers in grades 3-8 have taught math before, she said.

Experienced teachers lead to increased student achievement, according to an education policy think tank. But during the last school year, 28% of new teachers hired in Texas did not have a certification or permit to teach, and 13% of all teachers left the profession. Both figures represented historic highs.

“That is a really high challenge right now,” Rinehart said.

The teaching shortage is especially hard for rural districts compared to their urban counterparts. For starters, Rinehart said, small districts like Alpine can’t pay teachers as much and usually have far fewer resources.

A new way to learn

Before Eureka Math was introduced in Dallas and Alpine ISDs, teachers could use a variety of different curricula, mostly geared toward passing the STAAR and memorizing how to solve equations.

This led to differences in how students across the state learned math. Turner said this sometimes causes students who move to a different campus to struggle when adapting to a new teaching method.

With Eureka Math now being widely adopted across Dallas ISD, students have a more consistent way of learning math, which hopefully will result in better test scores, he said.

McGowan said the curriculum he used in the past heavily emphasized passing the STAAR.

“With previous curriculums, it was just, ‘we have an equation, we solve it,’ but the kids cannot explain the process well,” he said.

Brittany duPont with Great Minds, the company that designed Eureka Math, has been helping Dallas teachers adopt the new curriculum. She said it’s been a huge shift in math teaching, and some veteran teachers have pushed back.

But duPont said the teaching tactics that Eureka Math proposes are needed to help kids catch up with their math studies after the pandemic. They’re also timely because the recently redesigned STAAR test now focuses more on how a child solves a math problem, she added.

Kids are more excited to learn and master concepts with Eureka Math, McGowan said. Another upside of the new curriculum is that it gives teachers room to test kids’ knowledge on a topic before each lesson, making it easier for teachers to collaborate on ways to help students catch up, he said.

The new curriculum also emphasizes collaboration. McGowan lets his students debate concepts with each other and figure out how they got to certain conclusions. The process allows them to gain a deeper understanding of mathematics.

Moving to a new curriculum always poses a bit of a risk and challenge, especially when it’s easier to stick to what you know, but McGowan said he’s seen kids enjoy learning math in a way he never has in his 18-year career.

“It’s about trusting the process. Trusting that the kids will learn,” he said. “But we have to be consistent.”

This article originally appeared in The Texas Tribune.

The Texas Tribune is a member-supported, nonpartisan newsroom informing and engaging Texans on state politics and policy. Learn more at texastribune.org.

Cardona Is Inviting States to Create Innovative Exams. 4 Ways They Can Start
Mon, 05 Feb 2024

When the head of the federal authority that compels states to administer standardized tests acknowledges that those exams have not always met the mark and invites states to create a system that is more useful to students, families and educators, state education leaders must seize the moment before it slips away.

In a November letter, Secretary of Education Miguel Cardona encouraged chief state school officers to rethink state assessment programs and offered guidance on how to do it using the Innovative Assessment Demonstration Authority. This program, tucked into the federal Elementary and Secondary Education Act, allows approved states to pilot new assessment approaches and to scale them statewide over time.

Although the authority has been around for nearly a decade, many chiefs have shrugged it off as irrelevant — having too many requirements, affording too little time and providing no additional funding for exploring new methods of assessment. Only a few states have bothered to apply; of those, some were not selected, and two that were approved later dropped out.


But in Cardona’s letter, we see a fresh federal desire to support states with assessment innovation, through the authority or otherwise. And that matters if states are to finally create better ways to measure learning and report progress that are relevant and meaningful — not just to federal policymakers, but also to educators, students and families.

Two things stood out to us. In addition to clarifying flexibilities in some of the technical requirements, the letter encourages chiefs to enter planning status with the department before submitting a formal application, and to pursue funding through other federal programs.

Combined with other indications, like the recent decision permitting Montana to pilot a program that administers smaller tests throughout the year instead of the current end-of-year assessment, the department is signaling that it wants to make assessment reform more feasible.

How might state leaders seize the moment? We suggest four steps:

First, take the pulse of impacted communities.

State leaders should begin by understanding how people at different levels of the education system see the benefits and drawbacks of current assessment and accountability methods. It’s important to know who favors change (and who doesn’t) and to address concerns early on. 

Chiefs could do this through familiar methods like listening tours and town hall meetings, or go a step further to create design teams of students, parents, community advocates, educators and technical experts. Smart partnerships with organizations that convene and build the skills of district leaders, school leaders and educators; research institutions; and leaders from the governor’s office, legislature and state Board of Education can also support assessment design and implementation.

Kentucky is one state that has successfully pursued this approach. After leading a Commissioner’s Listening Tour, Kentucky partnered with the Center for Innovation in Education to launch an initiative tasked with co-creating a new vision for education.

Engaging many parties to collaboratively design new assessment and accountability models helps build public and political will for change. People begin to support a new system when they see their needs and concerns represented in it — or, at least, when they feel listened to and understand the rationale for inevitable compromises.

Second, start a dialogue with the U.S. Department of Education.

Once state leaders have engaged design collaborators, they should reach out to the department to start a dialogue about their ideas. Then, they should make a formal request to enter planning status. Cardona’s letter clarifies that states can do this even if their vision for innovative assessments — and with it, their formal intention to apply to the authority — is still emerging. We see planning status as a low-stakes arrangement that states can request without having to complete a full proposal.

In this way, states can receive feedback on their nascent assessment designs. And, while non-binding, planning status can confer some formality to a state’s intentions, which can help garner support and funding back home.

Third, leverage other federal programs for funding.

Cardona’s letter suggests that states don’t have to fund assessment innovation entirely on their own; instead, it invites leaders to consider other federal funding sources, particularly the Competitive Grants for State Assessments program. Kentucky is one state that’s using program funds received in 2022 to design a new model for school and district accountability based on what it learns from districts that are piloting competency-based assessments of learning. The new state system that emerges may become codified in a future application. 

Other federal grant programs, such as school improvement funds in Title I, may be even more useful in supporting local engagement in assessment innovation, as this money could be used in pre-planning and preparing to apply to the grant program.

Fourth, seek federal flexibility.

It’s true that states can layer new tests on top of federally mandated assessments without needing federal approval, or just charge ahead and ask forgiveness later. But we believe there’s now a more viable path toward having conversations about innovation out in the open. That’s how states can create a single assessment system that generates information useful for state-level oversight while adding value to teaching and learning in the classroom.

State education leaders should move quickly, if they haven’t already started. They need to hit the ground running and start engaging communities across the state, gathering eager innovators, listening to myriad perspectives and learning from one another. Windows of opportunity can open and close as supporters move in and out of positions of influence, but a groundswell of local demand is hard to ignore. 

We have seen how bold, sustained leadership that is informed and supported by changemakers on the ground can convince federal authorities to give something new a try. New Hampshire proved that in 2015 with its Performance Assessment of Competency Education pilot, and we think the department is even more open-minded today.

One thing is certain: State education leaders can’t stand still. They must heed the department’s strong signals – and put them to the test.

America’s Cratering Math Scores Spark Call to Action from Education Experts
Thu, 01 Feb 2024

The numbers are beyond discouraging. According to the latest international PISA report, math scores among American students fell 13 points between 2018 and 2022, the equivalent of two-thirds of a year of learning.

Only 7% of U.S. students can do advanced math, and affluence is no guarantee of student performance.

These disappointing stats will be examined in the next online panel presented by the Progressive Policy Institute and Ӱ at 1 p.m. ET Thursday. Panelists will put the PISA outcomes into perspective and offer answers to the inevitable, “Now what?” moment of reckoning.

The speakers include Dr. Peggy G. Carr, commissioner of the U.S. Department of Education’s National Center for Education Statistics; Andreas Schleicher, Director of the Directorate of Education and Skills at the Organization for Economic Cooperation and Development; and Jonathan A. Supovitz, professor at the University of Pennsylvania’s Graduate School of Education.

Go Deeper: Explore more coverage surrounding America’s math crisis.

How AI Can Help Create Assessments That Enhance Opportunities for All Students
Tue, 23 Jan 2024

Like so many aspects of K-12 education, including classroom instruction, assessments of student learning are experiencing some titanic shifts. Two of the biggest factors driving these changes are the advancement of artificial intelligence tools and a growing commitment to the development of exams that improve opportunities for all students.

Developers are increasingly leveraging AI in assessment design, development, scoring and reporting. The implications include potential improvements that give real-time feedback and increase instructional efficiency. But there are also potential threats, such as algorithmic bias, so-called hallucinated responses and increased data collection that could weaken privacy protections.

Of course, advances in AI are not the only factor influencing the future of assessments. Disparities in educational opportunity are widespread, and professionals increasingly recognize that the use of tests for purposes ranging from college admissions to school accountability has largely failed to mitigate them. In response to this failure, exam developers, policymakers, community leaders and educators have called for tools, practices and policies designed with the goal of enhancing opportunities for all learners.


These two trends offer a framework for a new approach that capitalizes on the promise of AI in ways that could benefit all students. We propose that such a paradigm should incorporate five key features.

  • An emphasis on a whole-child, integrated view of learning and assessment. The research base, built over decades, points to the integrated nature of academic, social and emotional development. AI-enhanced tools could emphasize this in several ways, such as by supporting the measurement of collaborative problem-solving skills or building digital measures of engagement.
  • A broader perspective on personalization. Phrases like “personalized learning” and “adaptive testing” often emphasize adjusting instruction or exam content in response to student achievement and interests. As developers enact AI-driven personalization of assessment, they should explore opportunities to tailor assessment tasks not only to students’ prior achievement and interests, but also to their linguistic, social and cultural backgrounds.
  • Reconsideration of how schools define and prioritize outcomes. AI is capable of performing tasks that have traditionally been carried out by humans. What, then, does it mean to demonstrate proficiency in writing when nearly everyone has a chatbot in their pocket? What kinds of media literacy and critical thinking skills do people need to navigate this changing landscape? To succeed in the modern workforce and flourish as adults, students will need to build proficiency across a broad range of skills, and schools will need to figure out how to teach and assess them.
  • A revised concept of test security. Along similar lines, concerns about how tools like ChatGPT might enable students to cheat are widespread. A learner-centered approach to assessment should acknowledge ways in which technology is advancing and what it means to be proficient in affected areas, such as research and writing. This approach should also consider how to incorporate AI tools into assessment tasks, rather than treating them as threats to the accuracy of resulting test scores.
  • Prioritization of human relationships. Research documents the value of supportive relationships and a sense of belonging in schools, and reports on the role of AI in education have emphasized the need to keep people at the center. This advice applies equally to assessment: Despite the potential improvements to quality and efficiency stemming from automation of test development, scoring and reporting, human involvement in the process can provide valuable opportunities for connection and collaborative learning. Additionally, digital measures of engagement, collaboration and other aspects of student development provide only partial information and should be supplemented with educator and peer input.

The integration of AI into learner-centered educational assessments will bring both potential benefits and pitfalls. For instance, new tests that incorporate these capabilities could generate useful evidence to inform instruction, but they could also lead to inappropriate inferences about students’ capabilities or raise concerns among parents and others who object to the approach. Similarly, research on personalized learning makes it clear that state and local policies, along with supports for teachers such as professional development, will need to be aligned with the goal of personalization.


Achieving the vision of a learner-centered assessment system that leverages the best of modern technology will require a collaborative approach that involves research and development teams, policymakers, educators and, perhaps most importantly, the young people who have the greatest stake in how this work evolves. All these groups must keep their collective emphasis on the ultimate goal — measuring what truly contributes to the holistic development of each student while ensuring that the needs of educators and learners remain at the center.

]]>
Alaska Board of Education Lowers Test Score Standards Due to Nationally High Bar /article/alaska-board-of-education-lowers-test-score-standards-due-to-nationally-high-bar/ Mon, 22 Jan 2024 17:30:00 +0000 /?post_type=article&p=720727 This article was originally published in

The Alaska board of education approved lowering the test score standard for student proficiency, after school leaders cited the state’s nationally high bar.

Student success on standardized tests is categorized by what are known as cut scores: the thresholds that determine whether a score falls above or below proficiency for a grade level.

Alaska’s standards for proficiency have been among the highest in the nation, and some educators and officials have said that the state has set the bar, or the cut score, too high in some areas.


Get stories like this delivered straight to your inbox. Sign up for The 74 Newsletter


At its regularly scheduled meeting on Wednesday, the school board approved a series of adjustments, proposed by the Department of Education and Early Development, to the standards for the Alaska System of Academic Readiness tests, known as AK STAR. It also updated regulations for administering assessments to students with disabilities.

DEED Commissioner Deena Bishop said the new cut scores are a better reflection of the kind of growth that is possible for typical students to achieve in the months between assessments. She said the adjustments may lower the expectations for proficiency, but that does not mean Alaska’s standards are now low.

“We’re still in the top third of all states in the nation for expectations and high standards,” she said. “We’re just not at the top anymore.”

Some members of the public were critical of the changes, and said the state should be supporting teachers and students rather than lowering expectations.

Timothy Doran, a former educator and administrator who now serves on the Fairbanks North Star Borough School District board, said he wants to see the state review its assessment standards before it changes cut scores, but added that he appreciated DEED’s process.

“We’re setting a cut score based against a standard which is 10 years old and have not been reviewed for whether they’re appropriate,” he said. “We’re applying it to a test for which we have not looked to say, ‘What’s going on here? Are students understanding these questions? Have we set that bar so high that students can’t get over it?’”

Haines Borough School District Superintendent Roy Getchell praised the department for its efforts. He served on the policy review for the regulation change.

“Assessments in Alaska around the country have had too many setbacks, stops and starts that have really kind of eroded the confidence of our processes, which is why it was critical that we get it right out of the chute. And I’m much more confident that what’s being presented today is going to be right from the start,” he said.

Lisa Parady, who has a doctorate in education leadership and is executive director of the Alaska Council of School Administrators, said policy reviews like this one are a normal process, and that the state has seen a lot of assessment changes over the years.

“We’ve seen a lot of changes, and now we’re on a good path,” she said. “It’s incumbent upon every one of us to make sure that what we put out is accurate and right and aligned, so that our teachers can get what they need in terms of the results of this assessment.”

Alaska’s STAR test results were delayed this year because of the change to cut scores, the department said.

Alaska Beacon is part of States Newsroom, a network of news bureaus supported by grants and a coalition of donors as a 501(c)(3) public charity. Alaska Beacon maintains editorial independence. Contact Editor Andrew Kitchenman with questions: info@alaskabeacon.com.

]]>
Why 20 Missouri School Districts Are Seeking New ‘Innovation Waivers’ to Rethink the Way They Test Students /article/why-20-missouri-school-districts-are-seeking-new-innovation-waivers-to-rethink-the-way-they-test-students/ Mon, 14 Aug 2023 11:01:00 +0000 /?post_type=article&p=713166 Updated: The Missouri State Board of Education voted unanimously Aug. 15 to approve ‘innovation waivers’ for the 20-school Success-Ready Students Network.

A network of 20 Missouri school districts is asking the state to implement a more responsive assessment system in order to personalize student learning.

The state Board of Education is considering the districts’ proposal to change testing at its Aug. 15 meeting. If approved, it would mark the beginning of a shift in Missouri’s education system that district leaders say will “resurrect student engagement.”

The group of schools, part of the Success-Ready Students Network, wants to move away from the state’s annual standardized testing to assessments that would be administered multiple times a year. The coalition consists of public school districts and one St. Louis charter school, and includes a mix of rural and urban campuses with a wide range of student performance scores and poverty rates, according to state demographic and assessment data.



During a June state board meeting, district leaders argued that the state’s annual assessment doesn’t provide results in time to be used effectively in the classroom.

The schools want to instead take advantage of a new pilot waiver program created last year that offers exemptions for districts to bypass specific education laws for up to three years. These “innovation waivers” are intended to boost student performance and benefit educators by giving schools the room to implement unique strategies, said Lisa Sireno, assistant commissioner with the Missouri Department of Elementary and Secondary Education. 

“The state legislature enacted a statute that allowed the school innovation waivers in 2022 and so we’ve been working on what that process might look like,” Sireno told The 74. “The group with our very first innovation waiver request — the Success-Ready Students Network — kind of grew out of a (state) work group that was looking at competency-based education.”

While 20 school districts in the Success-Ready Students Network have agreed to launch new assessments if approved, other schools will join in the future, said Mike Fulton, one of the network’s facilitators. The plan is for a new cohort of districts to use the innovation waivers each school year until the entire state is involved.

If approved, districts will be able to administer multiple interim tests, but will still have to give the normal annual standardized test until a federal waiver is approved to get rid of it. Fulton said the Success-Ready Students Network will be working on a federal waiver later this year.

Fulton said the state’s innovation waivers are key to competency-based education, which allows students to move through education at their own pace as they demonstrate a full understanding of the material.

“The whole proposal is designed to support the participating districts in using personalized, competency-based approaches in their learning design,” Fulton told The 74. “The assessment system was designed to provide feedback to both students, teachers, parents and every stakeholder, on how individual students are progressing, how classrooms and schools are doing and how districts are doing as a whole.”

Jenny Ulrich, superintendent of the Lonedell School District, part of the Success-Ready Students Network, said her teachers are always asking for feedback on what they are doing in the classroom, but assessment results are returned too late to make an effective change for individual students.

“We are alone out there trying to figure out how we get real-world learning to our kids,” Ulrich told the state board in June. “This work supports educators. It gives them a platform, an opportunity and the data they need to make good instructional design and decisions for their kids.”

Besides lagging results, standardized tests have come under fire around the U.S. for sucking up too much time, being culturally biased and doing little to improve students’ academic outcomes.

Ulrich said instead of the one-time tests, schools will administer tests several times a year and keep results updated online on a district dashboard for teachers to use in real time. The dashboards, which will go live in November, will show a student’s progress in becoming “high school ready” or “college, career and workforce ready.”

“By the end of the 2025-26 school year, it is our aim — our lofty goal — that 100% of our graduates would have an individualized plan,” Ulrich said. “As we reach these goals, all students will be able to declare, ‘I am truly college, career and workplace ready.’”

Fulton said districts will be transitioning to competency-based learning even if the state innovation waivers aren’t approved. Students will progress on evidence of mastery of skills based on state standards, meaning they might move through the K-12 education system faster or slower than their peers.

“That scares people a bit and I understand that,” Fulton said. “That’s a big shift.”

Sireno, the assistant state education commissioner, said the desire to switch Missouri schools to competency-based learning emerged from the learning loss caused by the pandemic. Earlier this year, more than 100 Missouri districts experienced a drop in their student assessment scores to levels that would typically threaten their state accreditation.

“This will allow students to move at the appropriate pace. So, if some students finish mastery of the content a little bit quicker, if some students take a little bit longer, that’s OK,” Sireno said. “It’s a heavy lift, but it’s important work, and (districts) realize that it can have a real positive impact on student learning.”

Other schools around the nation have been tackling competency-based education as a way to help students recover ground in learning. Idaho, South Carolina, Kansas and Utah are among those that have successfully created competency-based learning systems, according to a report.

Some states haven’t done as well implementing competency-based education. In 2018, Maine’s Department of Education had to abandon its model several years after it went into effect. The system lacked specifics in areas like proficiency and grading, which also sparked parent backlash.

This is a common failure in putting the approach into practice, according to the Missouri education department’s report.

“Researchers attribute negative outcomes to schools that implemented (competency-based learning) without clear definitions and expectations, as well as uneven implementation,” the report says. 

When Missouri’s innovation waiver plan was unveiled in June, the entire State Board of Education voiced support for it.

“It is a gift to the students, the parents and families in Missouri, and I would say nationwide,” said Charles Shields, board president. “Others will learn from us nationwide.”

Vice President Carol Hallquist said she believed it will “change the face of education” in Missouri.

Fulton, of the Success-Ready Students Network, said he hasn’t heard from any stakeholders warning against the use of innovation waivers or the switch to competency-based learning, but there is some wariness from the state department about using a model that hasn’t been tested. 

“I think we’re all going at this cautiously. Research is going to sit at the core of this,” he said. “But you have to be willing to be entrepreneurial and innovative and that’s what I think these districts are being asked to do. We need more of that in public education.”

]]>
How Good Are the Tests Teachers Give Their Students? Districts Need to Know /article/how-good-are-the-tests-teachers-give-their-students-districts-need-to-know/ Mon, 12 Jun 2023 18:01:00 +0000 /?post_type=article&p=710303 At this critical juncture in K-12 education, it’s essential that schools invest in tools to better identify students’ learning needs so they can address pandemic recovery. But while most districts use commercial interim assessments to guide them, far too little is known about the effectiveness of these tests.

Interim assessments are big business. The term covers a wide range of designs and purposes, but broadly, these are exams administered at different points in the school year to gauge student progress. Usage is widespread, with the heaviest reliance in high-need districts — those that serve the most marginalized and vulnerable students. Many educators make instructional changes based on the results, decisions that can have profound and lasting effects on the trajectories of countless learners.

According to the RAND Corporation, most teachers surveyed reported that their students had taken an interim assessment in the 2021-22 school year, and demand in this market is growing. But while states’ end-of-year exams are thoroughly peer-reviewed, no such process exists for interim assessments. Further, publishers share very little evidence to show that their products are standards-aligned or can improve student learning. For educators, this means interim assessments are a black box, with no third-party reviews of publishers’ marketing claims.



This was the very problem our organizations — EdReports, a nonprofit providing free reviews of instructional materials, and the National Center for the Improvement of Educational Assessment (Center for Assessment), an organization focused on improving the quality of educational assessment and accountability systems — sought to solve when we announced our plan to review commercial interim assessment products last year.

Unlike EdReports’ reviews of K-12 instructional materials, for which products can be purchased independently, access to interim assessments requires publisher consent, because their test questions, reports and other tools are proprietary. Most publishers declined our invitation to participate in our new reviews. Two did agree, but then one pulled out. It simply wouldn’t have been meaningful to release a single review without context, so we had to bring the process to a halt.

Particularly in the current moment, with districts making high-stakes instructional and budgetary decisions to try to accelerate post-COVID student learning, publicly available, independent reviews of interim assessments could have been a powerful resource. The impossibility of moving our reviews forward should be cause for concern. But by sharing what we’ve learned, we hope to inspire educators to demand greater transparency from publishers. Even without independent reviews, there’s a lot that districts can do to become critical consumers before purchasing interim assessments.

First, determine their needs:

  • What are their overall goals for student learning in the relevant content area, and what should students therefore experience on a daily, weekly and monthly basis?
  • What will assessments look like over the course of the school year? How will they fit together with other instructional components to help educators understand and improve student learning?
  • Based on the above, what do districts need in a commercial assessment product? What specific gap should it fill? If the district already has high-quality instructional materials, to what extent do their embedded assessments meet those needs?

Districts that do need a commercial product should get clear on what they want before looking at options:

  • What is their main goal for the product? Do they want to evaluate school- or district-level trends or help educators understand student progress in a specific learning area? While a publisher may claim that a product can do both these things equally well, in practice, that’s very challenging to achieve.
  • What questions does the district expect the product to help answer, and what information is needed to answer those questions?
  • How will the product meet the needs of its primary user? If it’s for teachers, how will the district know if it provides accurate information that educators can use to help students? What professional learning will users need in order to use the product in conjunction with instructional materials to support student learning effectively?
  • How will the district know if the product is well-aligned to standards? What type of test questions should educators expect to see, and what evidence will confirm that the exams genuinely assess students’ understanding of the full depth of each standard? Districts should communicate their needs and ask for evidence. Equipped with a clear picture of their requirements, they can leverage their role as a current or potential customer to get the information and evidence they need.

Questions publishers should be able to answer include:

  • What are the intended uses of your product, and what research supports those uses?
  • How should assessment scores be interpreted, and what decisions can they inform? What evidence supports the idea that using the data in this way helps improve student outcomes?
  • How were the product’s test questions evaluated, and were educators involved?
  • Are all the test questions standards-aligned? If so, what evidence supports that claim?

In the absence of independent reviews, we encourage districts to take up the baton and exercise their purchasing power to press the assessment market for greater transparency. Students are counting on their teachers, administrators and educational leaders — they deserve evidence-based support to help them learn and grow.

]]>
Nation’s Declining Report Card Mirrors Drops in State Standardized Test Scores /article/state-standardized-tests-naep-scores-declines-comparisons-maps/ Tue, 25 Oct 2022 10:15:00 +0000 /?post_type=article&p=698676 This analysis originally

Updated Oct. 31

The recent release of scores from the National Assessment of Educational Progress (NAEP) provided a jarring reminder of the pandemic’s impact on academic achievement. The U.S. Department of Education’s portrait of student proficiency in math and English language arts in fourth and eighth grades found declines in every state between 2019 and 2022. In two-thirds of states, proficiency rates dropped in both subjects and in both tested grades.

And students in states that re-opened schools quickly during the pandemic often performed no better than those in states that stuck with remote learning longer. Hardest hit were eighth-grade math proficiency rates, which fell 8 percentage points as the raw test score saw its biggest drop in the history of the national testing program.

Though it’s difficult to make precise comparisons between NAEP and state-level standardized test results, the NAEP trends largely mirror the findings of a FutureEd analysis of the testing trends of the 42 states that have released results from spring 2022 and have scores that can be compared to previous years.



English Language Arts

Nationally, 33 percent of fourth graders scored at the proficient or advanced levels on the 2022 NAEP reading assessment, down 2 percentage points from 2019. The share of proficient students also fell 2 points at the eighth-grade level, from 33 to 31 percent. That comes as reading scores at both grade levels dropped 3 points, leaving them not significantly different from 1992. The 2022 assessment, administered between January and March, included nearly 450,000 fourth and eighth graders in more than 10,000 schools. It relies on a representative sample of students in all states and some large cities, while state testing is intended to capture results from all students.

On the NAEP, a proficient student demonstrates “solid academic performance and competency over challenging subject matter.” States typically set their own definitions of proficiency on their own tests and choose a score to determine which students meet the mark. All but six of the 42 states that have released testing results from spring 2022 saw declines in overall proficiency rates from 2019, FutureEd’s analysis shows. A state’s overall proficiency rate includes English language arts scores from grades 3 through 8, as well as a high school test in some states. Seventeen states were within 5 percentage points of their 2019 overall rates. Another 14 dropped five or more points. And in five states, 2022 rates were a striking 10 or more points below their pre-pandemic levels.

On average, states’ English language arts proficiency rates declined 4 percentage points, with North Carolina seeing the largest drop, at 16 points. States did not administer standardized tests in spring 2020, and not all of them tested their students in spring 2021. Of those that did, most made up some ground in English language arts between 2021 and 2022, with an average gain of 1 percentage point. Texas students posted a 9 percentage point proficiency gain, the largest increase among states that have released their 2022 results.

In fourth and eighth grades — which are tested by the NAEP — state standardized test results showed that only three of the 42 states in the FutureEd analysis recorded higher English language arts proficiency rates at the fourth-grade level in 2022 than in 2019: Texas, where the proficiency rate was up 11 percentage points; Alabama, up 6 points; and Tennessee, up 5 points. The steepest proficiency declines were in Massachusetts (14 points) and Delaware (13 points).

Eighth-grade English language arts scores had a few bright spots, with Alabama showing a 10 percentage point gain in proficiency, Iowa rising 6 points, Texas rising 3 points and five other states increasing proficiency by a point or two between 2019 and 2022. But the overwhelming majority of states lost ground, with North Carolina registering the largest decline, 15 points.

None of these states saw gains in proficiency on the fourth- and eighth-grade NAEP reading tests, and Tennessee’s rate actually fell by 5 percentage points in fourth grade. Delaware’s sharp decline continued in the NAEP with an 8-point drop in fourth-grade reading, as did North Carolina’s, with a 7-point drop in eighth grade.

The differences between the NAEP and state testing are not entirely surprising. NAEP sets a higher bar for proficiency than most state tests do. Some states actually lowered their cut scores for what qualifies as proficient in the past two years, and some states changed tests. Alabama, Arizona and Kentucky changed tests between 2019 and 2022 and offered cautions while providing comparisons across the years. Maine and New Mexico also changed tests, but the scores could not be compared. At the same time, state tests are often more closely aligned to state standards and what students learn in the classroom — meaning they may capture student achievement trends more accurately.

Math

The results in math were more discouraging, both on the NAEP and state tests. At the NAEP’s fourth-grade level, the rate of students scoring proficient or above fell from 41 percent in 2019 to 37 percent this year. Among eighth graders, the proficiency rate fell from 34 percent to about 26 percent. Both are significant drops that mirror unprecedented declines in raw test scores: a 5-point drop at the fourth-grade level and an 8-point decline in eighth grade.

Likewise, all state tests but one in the FutureEd study showed declines in overall math proficiency rates between 2019 and 2022. Eleven states were within five percentage points of their pre-pandemic performance, 22 dropped five or more points and in nine states proficiency rates were 10 or more points below their pre-pandemic levels.

The average drop in statewide proficiency in math was 8 percentage points. Alabama registered the greatest loss, with proficiency falling 19 percentage points below its 2019 level. Several other states, including South Dakota, Wyoming and Missouri, were only 3 percentage points behind their pre-pandemic levels by 2022. Mississippi managed to regain all the ground it had lost, the only state to do so in math across all tested grades.

Most states reversed some of their losses in math between 2021 and 2022, with an average increase of 3 percentage points. Virginia had one of the largest increases, at 12 percentage points, though it still lagged its 2019 proficiency level by 16 percentage points.

We found no states making gains in fourth-grade math and several states with steep declines on state tests: proficiency rates dropped 17 percentage points in Virginia, 16 points in Delaware and the District of Columbia, and 13 points in Alabama.

The only proficiency gains on state standardized tests at the eighth-grade level were in Georgia and Missouri, at 1 and 3 percentage points, respectively. Rates were flat in Mississippi. But they were down 24 percentage points in Alabama, the largest drop in the nation, 20 points in Virginia, 17 points in Texas, and 14 points in Ohio, Delaware and Washington state.

FutureEd research associates Benito Aranda-Comer and Nathalie Kirsch contributed to this analysis.

]]>
Closing the Data Gap for Indiana’s Littlest Learners, & a Model for Other States /article/closing-the-data-gap-for-indianas-littlest-learners-a-model-for-other-states/ Wed, 05 Oct 2022 14:00:00 +0000 /?post_type=article&p=697602 In the years since the pandemic forced K-12 schools to shift to remote education, student outcomes have suffered. Schools are investing in a number of interventions to help students catch up, such as summer learning programs and high-dosage tutoring. But while these reactive efforts are necessary, it is equally critical to take proactive steps to prevent future learning declines.

Building a system where all students have the tools to succeed means starting with a strong foundation in early education. Research suggests that enrollment in high-quality early learning programs can have a lasting, positive impact on academic and life outcomes.

But despite increasing interest in developing high-quality pre-K programs, the lack of reliable and objective measures of children’s readiness for kindergarten leaves policymakers and state education leaders in the lurch. Without this information, it is difficult to understand how many students are entering kindergarten without the necessary prerequisites and, therefore, how large the need is for high-quality pre-K seats and where that need is the highest. 



In 2014, for instance, Indiana lawmakers mandated a longitudinal study about the impact of the new, state-funded On My Way Pre-K program. But officials and early childhood leaders determined there was no standardized method of measuring education outcomes among pre-K children statewide. There also wasn’t a single assessment that could be used to measure all desired outcomes, both academic and social-emotional.  

Our organizations, one based in Indianapolis and the other in Evansville, Indiana, saw an opportunity to address the data gap. In 2017, we contracted with an outside partner to help identify and pilot a solution. Together, we supported the development of the Kindergarten Readiness Indicators, an assessment that provides a standard, objective set of measures for pre-K students about to enter kindergarten.

The indicators provide policymakers, early childhood center directors and teachers, advocacy groups, funders and others with the first-ever quantitative data on the proficiency of Indiana’s pre-K students in math, oral language and literacy – foundational skills that underpin academic success in elementary school and beyond. The assessment takes fewer than 20 minutes to administer and relies on direct measurement of skills, rather than teacher observation. The resulting data can be aggregated at the classroom, provider, local, regional and state levels, meaning it can provide valuable insights for early childhood center directors and classroom teachers as well as city, county and state officials.  

In 2018-19, our two foundations awarded funding to facilitate a pilot of the Kindergarten Readiness Indicators with a variety of early childhood providers in Indianapolis and Evansville. The pilot showed the results were predictive of how prepared children were for kindergarten but that too few were meeting national targets for kindergarten readiness.

In 2019, we gave the intellectual property for the assessments to the State of Indiana, which adopted the KRI as a required test for On My Way Pre-K programs. It was administered for the first time in April 2021, and the results were released in early 2022. Across 320 early learning programs in 55 counties, most students were found to be performing below target academic indicators across all categories measured. There were also gaps based on gender and race — for instance, girls scored higher than boys on all oral language and literacy skills, and white students scored higher than Black and Hispanic children in five out of six categories in oral language and literacy skills, as well as math and spatial thinking.

The data are likely indicative of the impact of COVID-19 on even young learners. But given similar results during the pilot, which predated the pandemic, it is evident that much more must be done to support young learners and early childhood educators alike. With the KRI, Indiana now has a starting point to gauge against and improve upon, and the data will help the state, providers and philanthropy better allocate public and private resources to strengthen early childhood education. 

Pre-K has a tremendous impact on life and learning outcomes, which is why building early learning measurement systems is critical. As states recover from the challenges of COVID-related learning losses, they may find it effective to adopt similar strategies to what proved effective in Indiana: collaborating across organizations, education providers and state leaders to develop holistic solutions for students and educators.

]]>
Opinion: Teacher’s View: Testing Ability & Achievement Gives Insight into the Whole Child /article/teachers-view-testing-ability-achievement-gives-insight-into-the-whole-child/ Wed, 21 Sep 2022 16:01:00 +0000 /?post_type=article&p=696868 Achievement tests are necessary for evaluating student mastery and growth. Yet relying on achievement data as the sole determinant of student success can unintentionally leave some children behind.

As a former elementary school teacher, I saw firsthand the impact of a focus on achievement assessments. From day one of the school year, everything revolved around helping students pass the end-of-year exams. Yet, something felt off. I spent more time teaching test-taking strategies than I did understanding my students’ strengths and interests. And when I did feel that certain children were not performing up to their potential, I had no data to back it up.

Just as disturbing is the fact that a strong focus on achievement assessments creates systemic inequities. These test results reflect how well prepared students are for a particular exam, thanks to strong teachers, a solid curriculum and a home environment that can provide supports like tutors. Children from less affluent backgrounds may not have these advantages and, therefore, will not perform as well. 


So how can educators get the insights they need to help students unlock their true potential? By testing children’s cognitive abilities, also called aptitude or potential for learning. An ability assessment measures students’ key reasoning skills and provides insight into how ready they are to solve problems. Ability data capture students’ capacity to make inferences, apply familiar concepts in new contexts, and detect similarities and differences.

Ability is influenced by all experiences, both in and out of the classroom. Measuring ability looks at problem solving and critical reasoning skills that, while important for success in school and highly correlated with achievement, are much less dependent on formal schooling. Teachers can measure this potential for learning with a cognitive abilities test. Administered online or on paper, it asks students to work through a variety of questions to measure three forms of reasoning: 

  • Verbal, which includes analogies (identifying the relationship between two pictures or words), sentence completion (fill in the blank) and verbal classification (select the word or picture that belongs in the same category as three others). Verbal reasoning directly links to a student’s readiness to learn vocabulary and comprehend complex texts.
  • Quantitative, which includes number analogies, puzzles (i.e., compare quantities using symbols, pictures and numbers), and series (i.e., discover a pattern in a string of numbers). Quantitative reasoning skills indicate potential for working with numbers and mathematical relationships.
  • Nonverbal, which includes questions about shape and figure manipulation (i.e., mentally matching images of folded and cut paper), analogies and pattern/series questions. Nonverbal reasoning, when used alongside quantitative reasoning, is a powerful predictor of science outcomes.

At the end of the test, students receive a comprehensive abilities profile that summarizes strengths and areas of growth, and can also highlight instructional practices to meet each child’s cognitive development needs in the classroom. For example, if a student has above-average reasoning abilities, the instructional recommendations might suggest using guided discovery instead of structured teaching methods. 

Teachers can leverage ability data alongside achievement test results to identify gaps between a student’s potential and performance. If a child performs well on an ability test but less well on an achievement exam, that disparity can suggest what type of support the student needs.

In the Plano, Texas, Independent School District, teachers plot students’ cognitive ability data alongside achievement data, with a specific focus on children whose performance levels on local and state assessments do not correspond to their ability levels. This information can help schools determine additional academic, social-emotional or behavioral support to address specific learning needs.

Teachers can also use ability data to create a strengths-based classroom environment for students. Lessons that focus only on achievement data frequently tie instruction to closing achievement gaps or catching students up to grade-level standards. Students can identify the skills and standards they have not mastered, but they may not know how they learn best or where they excel. 

Charleston County School District uses ability profile information to create this sort of learning model, which lets students understand their innate critical reasoning strengths. During the school day, children have time to engage with enrichment activities that reinforce those strengths. For example, students with natural quantitative reasoning skills spend time playing educational games that stretch their abilities in number logic and analytical problem-solving.

Building on natural strengths instead of addressing gaps accelerates learning by making the classroom a positive environment and builds students’ self-awareness, confidence and curiosity. 

Adding cognitive ability testing to achievement assessment allows educators to flip the script. With achievement tests alone, the entire conversation focuses on what skills a child is missing and what standards he or she has not mastered. This is a negative approach to learning. But adding cognitive abilities tests helps shine a light on students’ innate potential and their strengths. With this information, teachers and administrators can create learning environments that capitalize on those abilities and build a much more positive learning experience in the classroom.

Opinion: Test English Learners in the Languages They Speak at School and at Home
/article/test-english-learners-in-the-languages-they-speak-at-school-and-at-home/
Tue, 30 Aug 2022 19:01:00 +0000

According to the U.S. Department of Education, enrollment of English learners in K-12 has grown, representing 5.3 million students in U.S. public schools. Historically, this population has experienced inequities in educational outcomes due to a range of well-documented factors.

These disparities became more pronounced during the pandemic, despite legal protections that are supposed to ensure English learners have equal access to education. Reversing these trends and improving educational outcomes for children who speak a language other than English at home involves attending to their individual learning through both effective instruction and accurate assessment.

Effective instruction means employing approaches differentiated by the students’ languages, culture and social experiences to address these significant educational disparities. Accurate assessment means measuring students’ levels of ability in given academic areas to provide educators with predictive information about which are on track for proficiency and which are at risk.


But universal screening assessments are largely published and administered in English. Testing children in all the languages they speak would be ideal; however, hundreds of languages are spoken in the U.S., and creating each exam and providing the tools and staff to administer them would not be easy. It would mean hiring educators capable of testing students in all languages and investing in the creation of each assessment — which requires researching the literacy and science of reading in each language as well as validating the assessment through field testing and item calibration. While not impossible, this is a costly endeavor that would take years to accomplish.

Since Spanish is the most common non-English language in the nation, and the first language for 75% to 80% of English learners, providing high-quality assessments in that language should be a priority.

Because language development does not necessarily happen at the same rate or in the same pattern in both languages, English-only approaches underestimate a child’s ability, generally providing more information about what students can’t do rather than capturing their actual skill set in their home language. This deficit-based approach may provide too little useful information to guide instructional planning for these students. Assessing Spanish-speaking students in English only, without taking into consideration their English proficiency, can result in lower levels of performance, which may be misinterpreted. Therefore, accurate assessment is needed in both languages in order to truly understand a child’s language and literacy skills and to provide the appropriate level of instructional support.

Learning to read is central to long-term success. Poor literacy can lead to low educational attainment, depressed wages and generational poverty. Reading scores as early as third grade are highly predictive of life outcomes.

Assessing bilingual students in both their home language and in English provides the clearest picture of their overall ability. It is important for teachers to know what children are capable of in their home language, because they can leverage these skills during English and bilingual instruction to design effective educational experiences, including lessons, games and group work, that are tailored to the child’s ability levels in both languages. This is particularly true in the early elementary grades, when home language exposure has a much greater impact on proficiency than the language spoken in school.

Some experts in the field note that high-quality reading assessments for Spanish speakers should focus on the same critical early literacy skills as for English speakers. They should be efficient and easy to administer, and account for specific linguistic features of Spanish. For example, they should include letters that are particular to the Spanish alphabet, and passages should reflect that language’s syntactical, lexical and grammatical rules. 

As an alternative, teachers can support students by learning more about their home languages through parents or interpreters, and understanding which sounds English and the home language have in common. Sounds that are not present in both will be harder for a child to hear and say; knowing this will help a teacher better understand what will transfer between languages and what might present difficulty.

There is a critical need for educators to learn about and start using high-quality reading assessments with students in their home language, especially given the instructional loss that occurred during the pandemic, and particularly in literacy. It is a first necessary step toward identifying the right instruction to help English learners become successful readers.

Youngkin Says Report on ‘Honesty Gap’ Points to Decline in Virginia Schools
/article/youngkin-says-report-on-honesty-gap-points-to-decline-in-virginia-schools/
Sun, 03 Jul 2022 14:01:00 +0000

This article was originally published in the Virginia Mercury.

Pandemic learning loss and subpar standards have led to a significant decline in outcomes for Virginia’s K-12 students, Gov. Glenn Youngkin and his education appointees argued Thursday as they presented a new data analysis of school performance.

Pointing to what the report described as an “honesty gap” between what state learning assessments show and how Virginia students fare on a national assessment, Youngkin suggested decisions of prior administrations created an inaccurately rosy picture of the state of K-12 education.


At a news conference in Richmond, Youngkin called education “the singular most important issue for Virginia’s future” and said trends going in the wrong direction could jeopardize the state’s reputation for high-quality schools.

“The significant lowering of expectations, the lack of transparency with data, the weak accountability for these results, that all ends today,” Youngkin said.

Citing a new data analysis, Youngkin and his schools team said Virginia has an unusually wide gap between its state assessments and the National Assessment of Educational Progress, which tests samples of students from each state to produce what is known as “The Nation’s Report Card.”

For 2019, state Standards of Learning assessments showed 75 percent of Virginia fourth-graders proficient in reading, compared with 38 percent proficiency under the NAEP, according to the new report. The gap was wider at higher grade levels, with 76 percent of eighth-graders showing reading proficiency on SOLs, and 33 percent showing proficiency according to the NAEP.

To underscore some parents’ frustration with the state of public schools, the Youngkin administration’s report notes the number of homeschooled students jumped 56 percent in the 2020-2021 school year. That same year, the report says, 3,748 public-school students transferred to private schools in Virginia.

“We are not serving all of Virginia’s children,” Youngkin said. “And we must.”

The Youngkin administration’s analysis showed similar assessment gaps in math scores, and wider gaps in both math and reading for Black, Hispanic and low-income students. The governor’s office also presented data showing those outcomes worsened amid pandemic-era school closures, with SOL pass rates dropping substantially between 2017 and 2021 for Black, Hispanic and low-income third-graders, while white students showed a more modest decline.

The Virginia Education Association, an advocacy group representing Virginia teachers, blasted the Youngkin report as a political document that relied on “blatant manipulation of data” to “disrespect and belittle the amazing work Virginia educators have done, and continue to do, under incredibly difficult circumstances.”

“If Governor Youngkin is concerned about an ‘honesty gap,’ he need look no further than his own office to find it,” VEA President James J. Fedderman said in a news release.

When asked if the group had any specific critiques of Youngkin’s methodology, a VEA spokesman said the organization was still reviewing the report and expected to have “a more detailed rebuttal” next week.

Education Secretary Aimee Guidera said the data makes an “irrefutable case that this state has not been serving all students well,” a conclusion she said was obscured by past leaders shifting standards and expectations.

“And they often did this in the name of equity,” Guidera said. “President Bush used to refer to this as ‘the soft bigotry of low expectations.’ I call it plain rotten. We cannot afford to lose another generation of our children because of our inability to hold ourselves, our schools and our students to high expectations.”

Youngkin’s education agenda, which has focused largely on ending pandemic-related measures like online learning and mask mandates, giving parents more input into school operations, and expanding charter schools and other alternatives to traditional public schools, has seen mixed results so far in the General Assembly. Democrats have resisted charter schools, prompting Youngkin to pursue lab schools in partnership with colleges and universities. He was successful in winning some bipartisan support for legislation to end mandatory masking in schools and notify parents of sexually explicit reading assignments. The still-unfinished state budget is expected to include significant new funding for K-12 education and teacher pay raises.

After the failure of legislation meant to deliver on his campaign promise to rid Virginia schools of so-called critical race theory, a catchall term conservatives use to describe a variety of racial equity and diversity initiatives in K-12 schools, Youngkin has used his executive powers to try to purge the concept of equity from the state’s education bureaucracy.  He also drew strong criticism for setting up a confidential email tipline allowing parents to lodge complaints about allegedly divisive teaching or purported examples of critical race theory.

The tipline and Youngkin’s rhetoric about “restoring excellence” in Virginia schools drew a sharp rebuke earlier this year from the Virginia Association of School Superintendents, which accused Youngkin of presenting an “inaccurate assessment of Virginia’s public education system currently and historically.”

“Again, by most measures, Virginia ranks near the top and surpasses most states throughout the country,” the superintendents’ organization wrote in the March 10 letter. On Thursday, the VASS said it was in the process of reviewing Youngkin’s new report.

“As always, we remain committed to the highest standards for public education in Virginia and hope that we can work with the administration in ascertaining and facilitating the resources and support referenced in the report that will be needed for all children to succeed at those standards,” VASS Executive Director Ben Kiser said in an email.

Proponents say equity-driven initiatives allow for a fuller reckoning with systemic racism and realign resources to address lingering educational disparities in a former Jim Crow state famous for fighting to block racially integrated schools. 

Youngkin has said he supports teaching all Virginia’s history, but he contends equity initiatives encourage overbroad racial stereotyping and division. Among the seven priorities laid out in his new education report is “zero tolerance for discrimination,” described as barring “the ascribing of traits or behavior based on race, gender, political beliefs or religion.”

“We shouldn’t be teaching our children to be judgmental,” the governor said.

In a statement, Senate Democratic leaders ripped the Youngkin report’s assertions as “an outright lie,” “a joke,” “tomfoolery” and “dog-whistle talking points.”

“We all know Governor Youngkin’s end goal — to erase Black history and any mention of equity from Virginia’s curricula,” Sen. Louise Lucas, D-Portsmouth, who chairs the Senate’s Education & Health Committee, said in the news release. “This misguided effort based on fake news and debunked theories is an outright attack from the far right, riling up racist constituencies with lies and deceit. This report shows once again that Governor Youngkin wants to take us back to the days of Jim Crow.”

Joining Youngkin for Thursday’s announcement was former Gov. Doug Wilder, who was elected as a Democrat in his history-making campaign for governor in 1989, but more recently has made a habit of criticizing Democrats and supporting Youngkin.

Though Wilder didn’t speak from the podium during the event, he huddled with Youngkin afterward as reporters looked on, praising the governor’s call for administrators, teachers and parents to work together to put students first.

“I wouldn’t be here today if I didn’t believe in you. God bless you,” Wilder told Youngkin. “I hope you have continued success.”

The Virginia Mercury is part of States Newsroom, a network of news bureaus supported by grants and a coalition of donors as a 501(c)(3) public charity. Virginia Mercury maintains editorial independence. Contact Editor Robert Zullo for questions: info@virginiamercury.com. Follow Virginia Mercury on social media.

Opinion: Teacher's Perspective: Assessing Students Properly Requires More Than One Test
/article/educators-view-assessing-students-properly-requires-more-than-one-test-and-some-of-them-may-not-be-what-you-think/
Tue, 01 Mar 2022 12:10:00 +0000

Nearly two full years into the pandemic, schools and students across the country are continuing to grapple with disruptions, which have led to significant learning loss. As many states begin to rethink their approach to testing during this difficult time, it will be critical that they use a range of assessments to better understand students’ academic and social-emotional needs.

Despite some of the negative impacts of high-stakes state testing, assessments can be a critical and helpful tool for both teachers and students. This is especially true of predictive and informative tests that provide educators with real-time direction for what students need to learn in order to meet academic and non-academic standards and to hone essential skills. As a fifth grade teacher from 2016 to 2018, I used various assessments to develop daily lesson plans, craft student-specific learning interventions and share students’ progress with them and their families.


But these “assessments” may not be what you think.

First, to best support my students daily, I assessed each child’s social and emotional well-being.

My students’ success — measured by both improved state test scores and individual growth — was mainly due to the strength of their relationships and classroom culture. Without strong relationships with and between students, little will be achieved in the classroom. To assist in building relationships and classroom culture, I checked in with each student every morning as they entered the room. This informal daily routine was crucial.

One morning, for example, a student told me her father had been released from prison, but was arrested again the following morning. She had looked forward to him coming home, only to have him ripped away the very next day. While she was physically present in school, learning was not something she could focus on without appropriate support. Fortunately, she felt comfortable sharing with me. This, in turn, allowed me to support her by making time and space for her to speak with the school counselor, talk with her best friend, write in her journal and receive ongoing social and emotional supports. Despite the traumatic events that had unfolded just hours before she arrived at school, she received prompt assistance, largely due to the trusting relationship we had previously built. Her classroom environment was empathetic, and she was able to end the day with a firm grasp of the academic material presented.

A second assessment I relied on was the Dynamic Indicators of Basic Early Literacy Skills (DIBELS). This short, predictive test measures a student’s grasp of literacy skills, including reading fluency, comprehension and phonics, in real time. Each week, I administered DIBELS to all below-grade-level readers, which helped me identify support areas and develop student-specific reading interventions.

For example, another student of mine had entered fifth grade reading at a second-grade level, partly because of prior suspensions and expulsion from school when he was in fourth grade. I became a teacher so Black and brown students like him would have a Black educator they could see themselves in — a teacher who would not look at them as a problem, but as a whole child, accepted, loved and welcomed in our classroom.

Each week, this student took the DIBELS assessment, which helped me develop specific supports to help improve his literacy skills. But although DIBELS was a great tool for understanding precisely where he needed support, nothing was more valuable than how he responded to his own progress. It was clear how much his confidence had grown and how proud he was of himself. This, coupled with student-specific data and a culturally responsive classroom, helped him grow from second-grade to a fifth-grade reading level in just the first half of the school year.

The third type of assessment I relied on was short quizzes following each lesson. By checking for understanding, I was able to reteach specific skills or material and identify any individual or classwide misunderstandings.

For example, after reviewing a short formative assessment — an exit ticket — from a math lesson on the order of operations, which some may know as “PEMDAS,” or “Please Excuse My Dear Aunt Sally,” I realized that five students had the operations out of order. These students were adding before they multiplied, which led them to the wrong answer. Having access to real-time, student-specific data helped me identify not only which children needed extra support, but also where they needed it.
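To make the error concrete, here is a minimal sketch of the kind of mistake described above; the numbers are hypothetical, not from the actual exit ticket:

```python
# A hypothetical exit-ticket item: evaluate 2 + 3 * 4.
# PEMDAS says multiplication comes before addition.
correct = 2 + 3 * 4        # multiply first: 2 + 12 = 14

# The error the students made: adding before multiplying,
# i.e., working strictly left to right.
mistaken = (2 + 3) * 4     # add first: 5 * 4 = 20

print(correct, mistaken)   # same digits, two different answers
```

The same three numbers yield two different results depending on the order of operations, which is exactly the kind of gap a quick formative check can surface.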

High-quality, standards-aligned assessments, developed to reduce unintentional biases, are vital for educators to meet the daily needs of their students, especially when they are components of a more extensive teaching and learning system, one that includes high-quality instruction, materials and classroom environments where students feel accepted, heard and loved.

Jamil Modaffari is a research assistant for K-12 Education Policy at the Center for American Progress and a former fifth-grade teacher.

Florida Governor’s Plan to Nix End-of-Year Tests Lacks Details
/florida-governors-plan-to-nix-end-of-year-tests-might-be-popular-but-experts-wait-for-the-details/
Wed, 15 Sep 2021 15:30:00 +0000

Florida Gov. Ron DeSantis, considered a possible GOP candidate for president in 2024, scored some points with educators Tuesday when he announced the end of the state’s testing program. But some experts wonder whether teachers and administrators will like what the state puts in its place.

A new Florida Assessment of Student Thinking, which the state legislature still needs to approve, would involve three “progress monitoring” tests spread throughout the school year. DeSantis called the plan the “final step to eradicate Common Core from our assessments.”

Last year, the state dropped Common Core standards, which many Republicans associate with the Obama administration, and is phasing in replacement standards. To comply with the federal Every Student Succeeds Act and receive federal funds, however, the state would still have to test all students in reading and math, produce end-of-year results and share the data with parents.


“The announcement feels like somebody trying to make a point with teachers and parents, but the devil is in the details,” said Paige Kowalski, executive vice president of the Data Quality Campaign, a nonprofit that focuses on making education data clear to parents.

The governor’s announcement comes amid growing anti-testing sentiment and complaints from educators that testing takes too long, often offering unhelpful results after students have moved on to the next grade. With state tests cancelled in 2020 because of the pandemic, teachers have also been relying more on programs such as NWEA’s MAP assessments to gauge how the pandemic has impacted students’ progress. Federal law doesn’t require states to test in the spring, and under an existing federal pilot program, some states, such as Georgia, are already trying interim tests throughout the year to minimize emphasis on end-of-year exams.

But experts say there are downsides.

“If they take the current test and cut it into three pieces, spreading it out over the year, it’s perhaps not that big a deal,” said Dale Chu, a senior visiting fellow with the Thomas B. Fordham Institute, a conservative education think tank. “But if schools didn’t like the ‘high-stakes’ nature of annual testing, they’ll be in for a rude awakening when the pressure’s on three times a year.”

Kowalski added that districts might not want to give up “benchmark” tests, such as MAP, Renaissance Learning’s Star or Curriculum Associates’ i-Ready, because teachers find them useful. If the new Florida Assessment of Student Thinking — or FAST tests — are layered on top of those, schools could find themselves giving more tests throughout the year instead of fewer.

Another possibility is that districts might stop paying for MAP or a similar test, leaving teachers with fewer data points to know if their “kids are on track,” Kowalski said.

Testing all students once a year in two core subjects sounds like a simple charge, she added.

“But we haven’t been able to nail it,” she said. “How are we going to approach an innovative assessment system that you need a chart to explain?”

Patricia Levesque, executive director of Foundation for Florida’s Future — part of the Foundation for Excellence in Education launched by former Florida Gov. Jeb Bush — raised several questions about the plan. One is whether teachers would be required to teach on Tallahassee’s timetable in order to be prepared for the three statewide tests; another is whether the spring test would simply replace the end-of-year test, giving teachers “less time to cover the full year of content.”

The testing program DeSantis is ending was a centerpiece of Bush’s two terms as governor.

DeSantis’s proposal applies to standardized tests for English language arts and math, but doesn’t eliminate high school end-of-course tests in algebra, U.S. history and biology.

Teachers unions welcomed the announcement, and Miami-Dade County Public Schools Superintendent Alberto Carvalho — even though he’s been at odds with DeSantis over his ban on universal masking in schools — applauded the move.

Chu noted that even though the U.S. Department of Education required states to give tests this year, officials have allowed considerable flexibility with COVID-19 continuing to disrupt learning. Some states were allowed to delay spring assessments until this fall, the District of Columbia hasn’t conducted state tests for two years, and California allowed districts to choose which tests to administer.

“In today’s environment,” he said, “it’s hard to see the feds pushing back that hard.”

Tennessee’s Schools Chief Talks About How to Help Students Catch Up After COVID
/article/tn-education-commissioner-penny-schwinn-value-of-assessments-pandemic/
Thu, 06 May 2021 19:01:00 +0000

After one state official publicly pointed to data from statewide assessments as being “among our most valuable tools” in helping students recover from pandemic-related learning losses, the Collaborative for Student Success reached out to other state officials across the country who have been leading on data collection to guide efforts in accelerating learning.

Tennessee Education Commissioner Penny Schwinn agreed to share her department’s rationale on moving forward with state assessment this year — and how the state now plans to use the data to benefit schools, parents, and students.

This discussion was organized after Tennessee lawmakers passed legislation pausing much of the state’s test-based accountability system, and after Schwinn had publicly defended the administration of tests, saying “it’s important to know how our kids are doing, and it’s important for our families to know.”

Q: Why are you committed to the administration of statewide testing this year and the use of collected data to help guide educational recovery efforts in Tennessee schools?

Schwinn: We know student assessments help families and educators get an accurate picture of what our students know, where there are opportunities for growth, and how they can best support students. Moreover, we owe that level of honesty and transparency to our families to ensure they can better partner and support their child’s growth and progress.

Simply put, when we are able to measure student growth and learning through statewide assessments, we are able to best focus our efforts and student supports. We’re tremendously grateful to share that priority and understanding with both the Governor and Tennessee General Assembly, and as we support districts in their strategic ESSER [Elementary and Secondary School Emergency Relief Fund] spending, we want to ensure we have that critical roadmap for responsible investments.

How do you envision your staff and the Tennessee Department of Education using the data you collect this year to make decisions about what’s next for public schools in Tennessee? 

Tennessee’s accountability system is central to the academic progress made in the state over the last decade and provides a critical view of how well educators, schools, and districts are serving all students.

Our team relies on the data to support strategy, best practices, and implementation across all our districts. Only with that information can we strategically invest money, time, and energy that our students and communities deserve. Understanding where our students are at this year will continue to provide critical insights to our state, districts, and educators as we tackle those learning gaps head-on.

How were you, other education leaders, and advocacy organizations in Tennessee able to reach broad agreement on the importance of statewide testing this year? 

Tennessee is fortunate to have an active, committed group of education stakeholders who share a belief in the importance of statewide testing this year. We know those honest conversations about data and growth are essential to move the needle further, farther, and faster than has occurred before.

Some of your counterparts had sought a blanket testing waiver from the Education Department, which the Biden administration has said it won’t grant. Do you believe that would have been good policy? Would such waivers have led to increased pressure on you and your counterparts to suspend testing?

Regardless of USDoE action, Tennessee will hold the line on assessments because we owe that level of reporting and transparency to our taxpayers and communities. Districts and educators need a roadmap for strategic, responsive supports. Moreover, parents deserve clear, easy-to-understand information about how their child and their child’s school are performing.

That being said, we also know the COVID-19 pandemic has created the need for common-sense flexibilities regarding student grades, educator evaluations, and school and district accountability. Tennessee’s hold harmless legislation ensures we can maintain priorities for honest reporting while also recognizing this year has been anything but normal, and our education community has been impacted as a result.

Should the federal government be offering waivers for state accountability systems while requiring that the tests are still administered to every student? Do we need the data if not for accountability? 

In Tennessee, we appreciate the Governor and General Assembly’s commitment to hold harmless for the 2020-21 school year, removing any negative consequences associated with evaluations and accountability, when districts meaningfully participate in state assessments. On Jan. 21, the Tennessee General Assembly took action by passing the Accountability Hold Harmless Law (SB7001/HB7003) to hold schools, teachers, and students harmless from negative consequences resulting from the 2020-21 TCAP assessments. This law excludes student growth data generated from this year’s TCAP assessments from a teacher’s evaluation unless such inclusion results in a higher overall final evaluation score for the teacher. In turn, that means assessment scores would only be incorporated into evaluations or accountability measures if it benefited the teacher, school, or district. We want to reward the tremendous work of our educators, districts, and students who grew despite the challenges and disruptions of the year. Further, we want to have an accurate roadmap for the 2021-2022 school year.

Should districts simply be able to decide which test to administer this year, rather than the TCAP? Why is it important that the statewide test be administered on top of any local testing?

Statewide tests are essential for that continuity and understanding of growth over time. Though we value local decision-making, we can’t compare dissimilar tests and varied assessments district to district. TCAP gives us an accurate picture of where Tennessee students are and what supports are needed to offset any learning loss. Further, it gives the state a clearer sense of the areas of opportunity and growth we can foster and support across regions.

Related — A few of our other recent interviews with key education leaders about the pandemic: 

  • Social-Emotional Learning: Expert Elizabeth Englander on preserving SEL development during the pandemic, the key to managing screen time — and why families should eat dinner together (Read the full interview)
  • Equity: 4 Black mothers reflect on parent activism, self-determination and the fight for educational change post-pandemic (Read the full interview)
  • Family Engagement: National Parent Union’s Keri Rodrigues on public school disenrollment amid the COVID crisis (Read the full interview)
  • Go Deeper: See our complete archive of 74 Interviews
