Editor’s note: This is an installment of Fast Forward, a recurring column focused on long-term cultural and technological shifts impacting public education. This edition focuses on states’ efforts to find new approaches to accountability.

Assessments and accountability plans, put on pause by the U.S. Education Department for the 2019-20 school year due to COVID-19, are expected to resume this school year with addenda. 

“We are skeptical states can return to status-quo school accountability systems,” said Chris Domaleski, associate director for the Center for Assessment, during a webinar hosted by CCSSO in October. “There is no easy answer.”

Status quo includes high stakes, like school report cards, which many states are moving away from this school year.

“We need to know where kids are and understand where some of the gaps are, [so] there is great interest in the data,” said Chris Woolard, senior executive director for performance and impact for the Ohio Department of Education. But, there is “a little bit more hesitancy on the accountability side of things.”

Considering usual indicators like absenteeism, graduation and assessment data have drastically changed in recent months, states — some of which were already in the amendment process prior to the pandemic — are rethinking data collection, interpretation and implications for the 2020-21 year. Added to those variables are conditions of assessment, instruction and content delivery that have varied significantly from past years and make score comparability, which states use to determine growth and school designations, a long shot.

As a result, state officials say they are having to consider changes big and small as the February 2021 addenda deadline from the U.S. Department of Education looms. They describe scrambling to provide as much information to schools as possible while reimagining a system without much-needed information, like 2020-21 assessment data. Even so, the reimagining could be the first domino in a series of changes to school designations, growth timelines and trajectories.

Designation, improvement timelines depend on data

As many states grapple with at least a year’s worth of gaps in their data and calibrate their accountability systems, some are proposing smaller, technical changes like N-size (the minimum number of students within a subgroup or content area), while others are significantly tweaking indicators and exploring different growth models.

Though the purpose of these changes is to continue providing schools with targeted resources and support this year as schools focus on recovery, the changes bring into question long-term implications of performance outcomes, school designations and improvement timelines. 

“People are pretty eyes-wide-open about this as they’re walking into this year,” said Juan D’Brot, senior associate at the Center for Assessment, which helps states design and implement accountability systems. “They’re also planning for if we can’t implement accountability status quo like we did in 2018-19, then how do we use this skip-year growth methodology and the calculation of our remaining components for research purposes to influence policy?” 

Nebraska, for example, is making “a seismic shift” to its assessments, according to Lane Carr, and is among states planning to request a gap year in its addenda to the Education Department so previous years’ designations roll over to 2021-22. “We functionally probably cannot make an accountability system happen due to these big changes,” said Carr, the state education department’s director of accountability.

Other state officials Education Dive spoke with are also wondering if they will have the data available to designate improvement cohorts, which usually run on multiyear cycles. They are working to get stakeholders’ input before approaching the Education Department about changing their designations, timelines and evaluations of Comprehensive Support and Improvement, Targeted Support and Improvement, and Additional Targeted Support and Improvement schools. 

Addenda documents received by states from the U.S. Department of Education show it will consider allowing SEAs to shift timelines for school identification by one year, as well as timelines for measurements of interim progress and long-term goals.

“So the question for some states is probably: how do we know if they’ve improved if we don’t have the same data to look at?” said Maria Harris, deputy superintendent of assessment and accountability for the Oklahoma State Department of Education. 

Woolard, whose state is also grappling with how data will be used, said Ohio is considering seeking flexibility from the Education Department to postpone identification of new schools, while continuing supports for the ones already identified prior to the pandemic. “I think that, just generally speaking, people are concerned with being careful of what the consequences are of [designating additional schools],” Woolard added.

Are baseline changes on the horizon? 

It’s possible changes to performance outcomes could mean a big-picture reset on baselines, states’ goals and trajectories. 

“If that’s the case, then does that mean we need to reset the baseline to find a new trajectory moving forward?” D’Brot said of baselines first set with the Every Student Succeeds Act (ESSA). “Some states might want to say let’s look at our spring ‘21 data, and our ’18 and ’19 data, because we don’t want to fully account for those that are underperforming in [2020].” 

But, D’Brot added, states won’t be able to determine whether the baseline data needs to be reset until spring testing at least starts or data around other measures of progress are in. 

“The biggest thing that states need to be thinking about in real time is evaluating these different components to figure out, when the year is over: Where are we? How much data do we have, and how consistent is this data? How complete is it?” Harris said.

Adam Baker, press secretary for the Indiana Department of Education, said his state expects “to reevaluate the accountability system as a whole” once performance data from the 2020-21 school year are collected and finalized. 

“At that time, a review will be conducted to determine the overall impact of COVID on state performance, in all areas assessed by the accountability system, and identify whether any changes or adjustments need to be made to long-term goals for indicators (such as a reestablishment of baselines and timelines), or to the performance thresholds/cut scores used to determine designations on individual indicators and overall designations,” he told Education Dive in an email.

Still, Baker and Domaleski said it’s too early to tell whether the Education Department would grant requests to reset baselines.

New ways of thinking about data

For this school year, many state education departments are considering, at the very least, detaching the high stakes from accountability. “With COVID, we don’t want to send signals [around performance] when we know the data isn’t consistent or complete,” said Harris.

Assessment and accountability experts have noted that, many times, school accountability report cards can impact everything from real estate values to how personnel and resources are distributed between schools. 

Instead, Oklahoma is considering releasing the information through a secure portal where only district administrators have access to it.

Carr said he hopes this mindset shift could last beyond the pandemic, considering it’s been difficult for the state to “get away from blaming and shaming” and refocus accountability on support and improvement. 

“[With COVID-19], our real focus is about how we can use assessment and accountability to drive instructional decisions in the classroom,” Carr said. “That was always our vision pre-pandemic, and now it has to be [like that] this year — well why can’t it be that way all the time?” 

Harris said the pandemic is “changing the narrative” around data reporting and its purpose so schools and the public don’t only see a letter grade or a number of stars, but also ask questions about how to improve and provide necessary support. 

“This flexibility has allowed us to say look, we don’t have [one year’s data],” she said. “But that doesn’t mean we don’t have [older] data that we can look at. We can start thinking about how this information can impact our kids and what they need to be successful.”
