
Targeting Opportunities for Our School Community to Learn Together 

By Jacy Ippolito, Joshua Lawrence, Joanna Yau, Judy Liu and Rachel Strumpf with support from the Hellman Foundation

According to the Society to Improve Diagnosis in Medicine, diagnostic error causes an estimated 40,000 to 80,000 deaths annually, even though diagnosis research is relatively well funded in the medical sciences. Educational leaders often see the symptoms of troubling practices in their schools, but diagnosing the underlying causes and providing appropriate “treatment” is challenging. Most school-level decisions are extremely complex and lack clear indicators of success or failure. Good decisions produce modest improvements in some areas without compromising performance in others. Perhaps more importantly, how a decision is reached can strongly influence whether it succeeds. When we hope to identify leverage points for improving instructional practice, for example, the way staff contribute to the decision-making process shapes how successful professional development is likely to be. At the same time, arriving at the “right” decision matters: when we embark on a program of professional development, we want to be sure we are selecting the right focus, instructional program, or intervention.

The SERP Institute (Strategic Education Research Partnership), with support from the Carnegie Corporation, developed the CALS to assess the literacy dispositions and habits of students and teachers. After the survey had been used in several research contexts, SERP put it online so that teachers, principals, and school district representatives could register and administer it digitally. Reading Ways uses the survey with its partner schools and has made a freely accessible version available through Google Forms (it is available here).
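For teams using the Google Forms version, responses can be exported as a spreadsheet and summarized in a few lines of code before a data-discussion meeting. The sketch below is purely illustrative and is not part of the CALS or SERP’s tooling: the file name and question wording are hypothetical placeholders, and it simply uses Python with pandas to tally answer choices as percentages.

```python
# Illustrative sketch only: summarizing a hypothetical CSV export of
# CALS-style survey responses from Google Forms. The file name and
# column name below are placeholders, not actual CALS items.
import pandas as pd

# Google Forms exports one row per respondent and one column per question.
responses = pd.read_csv("cals_teacher_responses.csv")

# Hypothetical Likert-style item about vocabulary instruction.
question = "How often do you explicitly teach academic vocabulary?"

# Share of respondents choosing each answer, as rounded percentages:
# the kind of simple summary a leadership team might bring to a
# data-discussion protocol.
summary = (
    responses[question]
    .value_counts(normalize=True)
    .mul(100)
    .round(1)
)
print(summary.to_string())
```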

An Example of CALS in Action

To illustrate how school leadership teams are beginning to use dynamic, online diagnostic tools such as the CALS, we turn to a case example from Hudson, Massachusetts. Hudson is a semi-rural town of more than 17,000 people, a metropolitan suburb roughly 40 miles west of Boston. The town supports three elementary schools (preK-4th grade), one middle school (5th-7th grade), and one high school (8th-12th grade). District and school leadership began investigating literacy diagnostic tools for teachers and students in 2011, knowing that their teachers and students might need additional support in meeting the challenges of the Common Core State Standards (CCSS). As a first step toward increasing students’ literacy achievement across the board, a team of 17 educators (teachers and administrators) came together to form a Literacy Action Team (LAT), which began reviewing existing sources of student data. The team examined state standardized test scores as well as SAT, PSAT, and Advanced Placement scores for graduating students; however, this quantitative data was not helping the LAT better understand how secondary teachers in Hudson were supporting their students’ deeper reading, writing, and communication skills. The team quickly realized that they needed more information from both teachers and students about their collective understanding of literacy instruction within and across content areas. The LAT began designing a survey to administer to teachers and students, and in their search for existing templates they found the CALS. After contacting Dr. Joshua Lawrence and discussing the possibility of using the CALS with high school teachers and students (a small shift from the survey’s original target population of middle school teachers and students), the Hudson LAT agreed to administer it.

LAT members reported that using the CALS was an easy decision because “it was already on the computer,” and because “it actually didn’t take teachers and students as long as we thought it might, because we were really sensitive to that.” Todd Wallingford, Hudson’s Curriculum Director for English Language Arts and Social Studies in grades 6-12 (and one of the primary conveners of the LAT), added: “We didn’t want anything that was going to take more than 20 minutes… and it didn’t!”

The Hudson team administered the CALS during the winter of 2012 to 185 students and 88 teachers across grade levels and content areas. The LAT was excited to receive immediate feedback on questions it had been wrestling with, for example, hearing directly from students how much, and what kinds of, reading and writing they do both in and out of school. Similar data was collected from teachers about their literacy practices in the content areas, and, importantly, the LAT was able to gather data about previous professional development experiences focused on literacy. The results of the CALS were illuminating because the survey data helped confirm and quantify a number of suspicions the team had already formed from other data points: that a majority of the secondary teachers had not had access to literacy-specific professional development, that meeting the literacy needs of students with special needs and low literacy levels was challenging, and that students were finding it difficult to draw inferences when reading. Several new findings arose as well. Teachers reported spending a fair bit of time teaching vocabulary, yet they also reported that students were not demonstrating strong understanding of academic and subject-specific vocabulary. Students, meanwhile, reported not enjoying learning new vocabulary in the subject areas, even as they recognized its importance for understanding course content. Such findings were important because they created opportunities for faculty to talk and collaborate with one another as part of district-designed, targeted professional development.

Data Driven Dialogue Protocol used by Hudson High School team

Looking at the Data as a Team

Perhaps more important than any one piece of data or particular finding was the process the LAT underwent in analyzing and reporting the data. The team engaged in multiple data analysis sessions, meeting as a large group and then in small sub-groups to explore different aspects of the results. Notably, the team created a discussion-based protocol (see above; a pdf version is here), adapted from the Data Driven Dialogue Protocol from the School Reform Initiative. The protocol allowed the team to surface their own assumptions about teachers’ and students’ literacy practices, make nonjudgmental observations about the resulting CALS data, and then craft evidence-based inferences.

The power of working in such a collaborative and deliberate manner is clear years later. The school and district have used this data to spur several waves of district-designed professional development addressing areas such as inferencing, vocabulary instruction, and refinement of disciplinary literacy practices. Study groups and summer institutes were formed and led by LAT members and other expert content-area teachers and leaders. The study groups have been reading Doug Buehl’s excellent book, Developing Readers in the Academic Disciplines, and in response teachers have been collaboratively designing instruction within and across content areas to address achievement gaps.

Years after the administration of the CALS, Todd Wallingford reports that while the LAT does not often revisit the CALS data itself, the process of administering and analyzing the survey was pivotal in surfacing areas of challenge and confirming suspicions about areas of need. The practice of looking at data collaboratively built the faculty’s and leaders’ capacity to tap local expertise and provide targeted professional development efficiently, rather than blindly choosing new, expensive curricular packages.