WHEN USED WELL, DATA ARE A POWERFUL TOOL FOR CHANGE

05/18/2016
TECHNOLOGY
By Christine Lyons

One of the best things about delivering professional development and consulting in schools is working side by side with teachers and administrators to answer questions and solve problems: drilling down into data until we find root causes, so that targeted plans can be developed to address the real heart of the issue.

Across all this work, some common threads have emerged:

  • There is no shortage of data available.
  • The heavy lifting of data analysis is best received when it answers the question: How does this help my teaching or my school?
  • For the most part, teachers and administrators want to analyze data but are not always sure how best to go about it, which sometimes leads to time spent going down ultimately unproductive tracks.

These observations have coalesced into two overarching principles:

  1. We don’t know what we don’t know, and
  2. When used well, data are a powerful tool for change.

Alongside these principles, we follow a series of guidelines when working with schools and districts:

  • Data need context to make sense.
  • Always analyze first, and then draw conclusions.
  • Root causes are rarely obvious — even if they seem to be.
  • Thoroughly disaggregating data is essential.
  • Not all data are useful or necessary.

Until we have carefully examined a situation, we can’t possibly know everything about it. Drilling down is critical to making informed decisions that lead to productive actions. An enormous amount of time can be lost developing and implementing plans for situations that haven’t been fully uncovered.

On the other hand, when we use data well — carefully, thoughtfully and thoroughly — they can be illuminating and enormously time-saving. Having these principles and guidelines in mind when addressing a concern helps streamline the analysis and action-planning process.

To help illustrate this, here’s the story of a middle school that was working hard to improve its writing scores. Overall results from its state-mandated assessments had been low and flat for the past few years. To address this issue, the school decided the Language Arts teachers would collaboratively develop and/or select common writing prompts to use in all English Language Arts (ELA) classes. The genre chosen for each grade matched the genre tested on the state assessment. The teachers developing the writing prompts were experienced, capable and motivated. Their principal was highly supportive and provided time for them to organize the project, collaboratively score student work and discuss results. They used the statewide scoring rubric so they could compare scores from their own prompts directly with state prompts. Everyone was putting a great deal of thought, time and effort into the initiative. Consequently, they were very disheartened when scores on their own prompts did not improve over the course of several months. Their principal asked what they felt the next steps should be. They weren’t sure and admitted to feeling frustrated. He asked if they would like an objective outside perspective, and they said yes.

Their principal contacted an outside company to review the process the team had used, help analyze the data and provide insights. To start, the school sent four years of raw state-level writing data, along with the current year’s local data, for analysis. In addition to running whole-school and whole-grade results, the data were disaggregated by a variety of typical subgroups — regular ed, special ed, Title I, LEP — as well as by gender, years in district, feeder school, and type/genre of writing. The biggest difference in scores was by gender, and the pattern persisted when each of the program groupings was disaggregated by gender as well: girls outperformed boys across the board. Now there were data, but context was needed to explain them.
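To make the disaggregation step concrete, here is a minimal sketch of what such an analysis might look like in Python with pandas. The file name and column names (score, grade, program, genre, feeder_school, years_in_district, gender) are hypothetical stand-ins, not the school’s actual data set:

    import pandas as pd

    # Minimal sketch of the disaggregation step. The file and column
    # names are hypothetical stand-ins for the school's actual data.
    scores = pd.read_csv("writing_scores.csv")

    # Whole-school and whole-grade summaries first.
    print(scores["score"].describe())
    print(scores.groupby("grade")["score"].mean())

    # Then disaggregate by each subgroup, and by subgroup crossed
    # with gender, to see whether a gap persists in every grouping.
    for group in ["program", "genre", "feeder_school", "years_in_district"]:
        print(scores.groupby(group)["score"].mean())
        print(scores.groupby([group, "gender"])["score"].mean())

The point of crossing every grouping with gender is exactly what the analysis above turned up: a gap that survives every other cut of the data is a strong signal that the cause lies with gender itself rather than with any one program or genre.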

Next there were meetings with the ELA staff and the principal. After reviewing the results in general, the focus turned to the results disaggregated by gender, which highlighted that girls consistently outperformed boys. The next step was to figure out why. To do that, the writing prompts were reviewed for high-quality construction, and other data were examined to see whether the gender gap persisted across subject areas.

There was a quick mini-lesson on what constitutes a high-quality writing prompt. Then the writing prompts were posted around the room, and a carousel brainstorm was used to analyze the clarity of the directions and of the prompts themselves. Some suggestions for improvement were noted, but something much more powerful happened. As everyone paused to collect their thoughts, one of the team members commented aloud, “If I were a middle school boy, I don’t think I’d find any of these prompts very interesting.” Bingo! Everyone took another look at the topics of the prompts and concurred that the vast majority were more likely to interest girls. It wasn’t until there was an opportunity to see all the prompts in one place that the pattern became clear — and connected to the results in the data.

At that point, the teachers noted that they often heard, “This is boring!” from the boys but had interpreted that to mean writing in general was boring. It hadn’t occurred to them that the nature of the prompts might be affecting performance. They were so focused on developing a large number of prompts across a variety of genres that they lost sight of the forest for the trees.

This segued quite nicely into a conversation on test bias. There was a discussion about how bias can unintentionally creep into assessments, even ones designed with the best of intentions, and how to guard against it. A plan was created to revise both the prompts themselves and the process for administering them.

After some debate, it was agreed that students would be allowed to choose their own topics for the majority of assignments. There was discussion about whether or not that would be unfair or inconsistent for students, but it was ultimately determined that the instructional and assessment focus should be on the structure of the writing, not the topic. For example, procedural writing could be taught consistently and with the same expectations whether the ultimate piece of student writing dealt with how to build a birdhouse, make a batch of cookies, or do a lay-up in basketball. The prompts would be reworked with expectations clearly focused on genre, leaving the topic of the assignment to the students.

Teachers were very satisfied at the end of this process. They felt that a real problem had been solved and that they had a clear, well-organized plan in place for curriculum, instruction and assessment that would have a positive impact on their teaching. They had also learned something about assessment development and use that they hadn’t known before and would be able to apply going forward.

The process also demonstrated to them the power of thorough disaggregation and that data need context to make sense. Numbers alone, especially summary data of group performance, will rarely if ever give you a clear picture of what is really happening, let alone help make connections in complex situations. But when used well, data are a powerful tool for change. In this case, writing scores improved consistently over time for both boys and girls.

Christine Lyons is CEO of Dragonfly Educational Consulting Services. For more, visit www.dragonflyecs.com.