7 Steps to Becoming a Data-Driven School
Learn to access the right data: the why (why don't your students get it?) and the how (how can you reteach content so that it sticks?).
At New Mexico School for the Arts, we tried a variety of approaches for using data to improve student achievement, and we didn't get it right at first. This is what we learned:
What Didn't Work For Us
We gathered as a faculty to pore over data from state reading and math tests. We printed out student-by-student reports from a commercial short-cycle assessment and dissected them trying to find information that we could use to inform instruction.
Unfortunately, the most focused prescription we could usually cull from the data sounded like: "This student needs more practice with informational texts," or "This student needs more work on statistics." The data told us which students were strong and which were weak in a given subject, but we couldn't find the details we needed for meaningfully adjusting our instruction to move student achievement. This type of data gave us the what (in what general skill is this group of students not proficient?), but not the why (why did they not get it?) or how (how might we reteach the skill in a way that sticks?).
What Worked For Us
We found the approach to data that we needed in Paul Bambrick-Santoyo's book Driven by Data. By creating our own interim assessments, which allow for data analysis by student, standard, and question, we could finally answer the why and the how. More than ever, our teachers could pinpoint what students knew and didn't know. This was powerful -- suddenly teachers had actionable information with which to plan teaching and reteaching for every single student.
Our Results
Bambrick-Santoyo argues that this model doesn't demand prior buy-in because the results themselves create the buy-in. It worked at our school. Over the course of one year, NMSA's overall proficiency on PARCC math tests went from 29 to 40 percent. On PARCC English language arts, proficiency went from 80 to 87 percent.
Yet the benefits of data-driven instruction go deeper than test scores. Our teachers learned that, across content areas, our students struggled with depth of knowledge. Now, our teachers are working to increase students' abilities to infer, see causal patterns, and universalize themes, skills, and principles -- the kind of critical thinking skills that we want all students to have when we send them off to college and beyond.
The seven steps below outline how we became a data-driven school:
1. Roll Out Professional Development
Let your teachers own their professional development process, particularly in creating their own interim assessments (see steps two and three). Part of the professional development training was learning the data-driven instruction process itself. The other part was looking at case studies of schools that had achieved significant student growth through the use of data-driven instruction. We used the professional development materials that come with Driven by Data.
2. Determine Essential Standards
To successfully implement interim assessments, focus first on high-leverage courses. It's not feasible to include every course in the first year. We focused on courses in which there was an assessment tied to either graduation (end-of-course exams), college entrance (ACT/SAT), or both (PARCC): math, English, and targeted science and social studies courses.
To focus our assessments, we determined our essential standards, narrowing them from state and Common Core standards. We asked:
This helped us avoid the pitfall of covering too many standards without ever going deep enough for mastery.
3. Create High-Quality Interim Assessments
High-quality assessments are a prerequisite for getting high-quality data. We gave teachers a week during in-service work days at the end of the school year (after students had left for summer), and about 15 hours during hour-long faculty meetings, to develop their assessments. A team of administrators and teacher leaders provided initial trainings (two hours) on creating effective interim assessments. Once assessments were written, the team met with each teacher for 30 minutes and used the assessment review tool in Driven by Data to help ensure that assessments met several criteria:
Our teachers borrowed questions from high-quality assessments like the ACT -- the same practice tests they referenced to assess rigor. They also used resources like the Vanderbilt Center for Teaching -- along with trial and error and collaboration with colleagues -- to develop their own expertise in creating high-quality questions that pinpointed what students knew and didn't know.
4. Develop a System for Creating Data Reports
Without detailed data reports, data-driven instruction does not work. We put all of our tests on a learning management system (Blackboard) so that the data could be exported. One of our teacher leaders used the format modeled in Driven by Data and developed a method to turn the exported assessment results into an Excel spreadsheet that provided detailed information by student, question, and standard. This process is tricky and time consuming, so significant planning and resources must be devoted to getting it right.
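Our teacher leader built this in Excel, but the underlying pivot is straightforward to sketch in code. The example below is only an illustration of the idea, not our actual tool: it assumes a hypothetical export file (results.csv) with one row per student answer and columns named student, question, standard, and correct.

```python
# Minimal sketch of turning an LMS export into student/question/standard views.
# Assumes a hypothetical CSV with columns: student, question, standard, correct (0 or 1).
import pandas as pd

results = pd.read_csv("results.csv")

# One row per student, one column per question (1 = correct, 0 = incorrect).
by_question = results.pivot_table(index="student", columns="question",
                                  values="correct", aggfunc="max")

# Percent correct for each student on each standard.
by_standard = (results.groupby(["student", "standard"])["correct"]
                      .mean()
                      .unstack("standard") * 100)

# Whole-class mastery by standard, sorted so the weakest standards surface first.
class_by_standard = results.groupby("standard")["correct"].mean().mul(100).sort_values()

# Write the three views to one spreadsheet for teachers to review.
with pd.ExcelWriter("interim_report.xlsx") as writer:
    by_question.to_excel(writer, sheet_name="By question")
    by_standard.to_excel(writer, sheet_name="By standard")
    class_by_standard.to_frame("percent correct").to_excel(writer, sheet_name="Class summary")
```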
5. Create a Process for Data Analysis
Analyze the results by student, question, and standard. This process is outlined in Driven by Data. Time for this analysis, and for the meetings to discuss observations, is essential. For the week after each assessment (at the end of the first, second, and third quarters), we cancel faculty meetings to allow teachers time for preparing their analysis. We then pair everyone with a fellow teacher or administrator for a 30-minute data-analysis meeting, held during teacher prep time either during the day or after classes. This provides an opportunity for rich discussion of what the data shows -- and what needs to be done next. We embed professional development in the process, as colleagues brainstorm instructional strategies that might be used for reteaching standards that were not yet mastered.
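Continuing the illustration above (with the same hypothetical columns), a short script could surface the standards that fall below a mastery bar before a data meeting. The 70 percent threshold here is an arbitrary placeholder, not a figure from Driven by Data.

```python
# Sketch of a pre-meeting scan: which standards fall below a chosen mastery bar,
# and which students struggled on each. The 70% cutoff is an illustrative choice.
import pandas as pd

results = pd.read_csv("results.csv")  # same hypothetical columns as above
MASTERY = 0.70

standard_rates = results.groupby("standard")["correct"].mean()
reteach = standard_rates[standard_rates < MASTERY].sort_values()

print("Standards to prioritize for reteaching:")
for standard, rate in reteach.items():
    missed = results[(results["standard"] == standard) & (results["correct"] == 0)]
    students = sorted(missed["student"].unique())
    print(f"  {standard}: {rate:.0%} correct; follow up with {', '.join(students)}")
```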
6. Follow Up on Reteaching
Follow up with your teachers on their reteaching. During collaborative teacher meetings and meetings with administrators, ask each other:
7. Reflect and Develop Buy-In
We believed what Driven by Data said about buy-in: It was not necessary from the outset, because the results would build the buy-in. A year later, looking back at the growth in test scores reinforced our confidence in the model. Further, it is intrinsically motivating when both teachers and students see how their efforts are leading to success.
How do you use data at your school? Please share your experiences in the comments section below.