Assessment Q&A

Picking an Interim Assessment? Do This First, Say School Leaders

By Sarah Schwartz — July 31, 2023

One of the many decisions that fall to school district leaders is whether to use an interim assessment—one that’s given every couple of months—to measure student progress.

These tests can serve several different purposes, including predicting performance on state exams and identifying subsets of skills for which students might need support.

Picking the right tool is a high-stakes decision. Teachers may use the results of these tests to adjust their instruction or determine which students will receive interventions. But it can also be hard to identify exactly what test will best suit a district’s specific purposes.

Last year, the nonprofit curriculum reviewer EdReports announced that it would start releasing reviews of interim assessments as well, judging their technical quality and usability. These kinds of outside evaluations are hard to come by right now, because the tests themselves are often proprietary products created by private companies.

But earlier this year, the organization put the plan on hold indefinitely, because not enough assessment companies agreed to participate.

It was disappointing news for Christine Droba, the assistant superintendent of teaching and learning in North Palos School District 117 in Palos Hills, Ill.

“An external review would be huge,” she said, because it would remove some of the burden of assessing the validity and reliability of these tests from teachers’ and other educators’ shoulders.

Droba and North Palos superintendent Jeannie Stachowiak spoke with Education Week about how their district chooses interim assessments, and what they did after discovering that the test they were using wasn’t aligned with the year-end test in Illinois. This state test is used for federal accountability purposes.

They also shared their advice for other school leaders wondering about the alignment of interim assessments to their teaching.

This interview has been edited for length and clarity.

How does your district use interim assessments?

Droba: We use our interim assessments as a tool to predict how students are going to do on the state assessment. It is something that we use to identify which students need enrichment and which students need additional support in terms of: Are they going to be ready for the end-of-year benchmark?

We look at: Where’s grade-level proficiency? How close is it to our target? How many students are below that level? What do they need to do to get to the end-of-year benchmark? Which students are going to start intervention? That’s our fall assessment. By the winter, we track progress from fall to winter, and then revise any plans that we have—if we need to do more support in this area, or maybe less support in this area.

And then the spring testing session is really done to track growth from fall to spring, and we also use the spring assessment to continuously make sure that the interim assessment is aligned to the state assessment. We’re always looking at: Are these numbers showing us the same thing?

How did you figure out that your interim assessments weren’t aligned?

Stachowiak: [The assessment we were using] is not directly aligned to Illinois State Standards.

We had teachers, understandably, taking a look at some of the things on the [interim] assessment and beefing up their instruction in those areas. However, those were not target areas on the state assessment. So they were working very hard to make sure students met standards on an interim assessment that was really not aligned.

We started to look for other potential assessments that would be better aligned, which was when we made the switch [to a new interim test]. We have a data coordinator in the district who meets constantly with our leadership team. And we are looking to do a data dive to make sure that [the new interim assessment] is a better predictor for our students.

Droba: We worked with our data coordinator to take the assessment data from the spring and then the [state] assessment data for the same group of kids. And he ran a correlational study to figure out the correlation between the two data sets. I believe that number was around 0.7 or 0.8, which is very high. He was basically saying that these numbers are correlated.

That was similar to the research that [the interim assessment provider] already presented to us. Their correlations were a little bit higher than what he found with our data set. But it was still high enough that we were like, “Yeah, let’s move forward, this is still good. It’s in alignment.”
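For readers curious what a check like this looks like in practice, below is a minimal sketch of a correlational study in Python. The student scores and variable names are illustrative assumptions, not the district’s actual data or its data coordinator’s exact method; the idea is simply to compute a Pearson correlation coefficient between matched interim and state scores for the same students.

```python
# Minimal sketch of a correlational study: compare students' spring interim
# scores with their state-test scores for the same group of students.
# All score values below are hypothetical, for illustration only.
import numpy as np

interim_scores = np.array([212, 198, 245, 230, 221, 205, 238, 215, 250, 190])
state_scores = np.array([718, 702, 755, 741, 729, 710, 748, 722, 760, 695])

# Pearson correlation coefficient between the two matched score sets.
r = np.corrcoef(interim_scores, state_scores)[0, 1]
print(f"Correlation between interim and state scores: r = {r:.2f}")
```

A coefficient in the range the district found, around 0.7 to 0.8, indicates a strong positive relationship between the two score sets.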

What advice would you give to other districts that may be having similar questions?

Droba: I would say that you need to have clarity on your goals and priorities. First and foremost, we made it very, very clear that the state test is what’s measuring the state standards. It’s what our whole curriculum and system is built on.

[The interim assessment] is a tool to predict how kids are going to do on the state assessment. So we have very clear priorities. If you don’t have that, it can be very confusing. Which assessment? What am I looking for? What’s the purpose of the assessment? You really need to have clarity, first on the purpose of the assessment and how you want to use it.

Stachowiak: We value the state assessment, because we believe it measures what our teachers teach and what our students should learn. And then based on that value, we create goals for the school district. We share those goals with the Board of Education, obviously. We share those goals with the teachers, so that at all of our professional learning community meetings, everything that we’re doing with our staff, that’s the goal in mind—to make sure that the students are going to achieve those goals that we expect.

If you don’t have that goal in mind, and that alignment, it’s really difficult to make sure that everyone is sharing and doing the same thing, and valuing not only the state assessments but also whatever interim assessments you’re using to measure progress.

Is there information that you would want publicly available about interim assessments that you don’t have access to?

Droba: An external review would be huge.

A lot of what we get is from the company itself. They’re going to give us this report that says, “Yes, it’s aligned to the IAR [the Illinois Assessment of Readiness]. Yes, we do that.” They do their own research. Having an external reviewer would help just make sure that their methods were valid, that everything meets the high-quality standards that you would expect.

We read through the reports that they provided to us, and then we piloted the program to make sure that in practice, it was what we wanted it to be. But having the external review would just provide a set of eyes that was not the company.

We work with teachers on our review committee, and they know the usability of it. But having a research company explore the validity and reliability of an assessment allows [teachers] to review those findings instead of having to actually do the analysis themselves.
