
Being data driven, data curious, and data skeptical (first of a series)

My old boss would start every review meeting with a simple question: “So, what have you learned?” This blog series is an attempt to answer that simple and potent question for District Public, and in the process offer some useful insights for school leaders.

Som and I began with a notion that the challenges we faced and the problems we solved in the private sector – myself in marketing at IBM and Som as a financial analyst at Lehman Brothers – were analogous to those faced by educators working in New York City district public schools in high needs areas. With the encouragement and support of a number of dedicated school leaders in the Bronx, most notably Will Frackelton, Principal of Soundview Academy for Culture and Scholarship, we started work in March 2014.

Since then we have worked with 31 district public schools, primarily middle and high schools in the Bronx and upper Manhattan. We focus on helping these schools use their data more effectively, both to improve instruction and to strengthen ties to stakeholders, starting with the data they already have. We provide custom analysis of this data – the biggest piece of which is their annual state assessment results (see an example here) – but we also analyze unit test results, attendance, teacher ratings, school survey results, and more. We then provide professional development services to help teachers and school leaders better understand the data and identify actions they can take from it. We also help school leaders use data to tell their story to parents, teachers, administrators, and the public. We estimate we’ve spent over 400 hours meeting with school leaders, administrators, teachers, and educational partners.

In the process we’ve learned a tremendous amount from these dedicated education professionals – and not just about data. Our original notion – that the challenges these professionals face are not so different from those we faced in the private sector – has proven correct.

So what have we learned? What advice can we offer other district public school leaders working in high needs areas?

This post will cover what we’ve learned about how school leaders can best use data. Future posts will cover additional topics – how school leaders can distill their school vision, goals, and strategy into a simple and memorable “story”, leverage their investments by focusing on operational basics, and prioritize principal and peer coaching for teachers.

Be data-driven, data-curious, and data-skeptical

One principal expressed amazement when we presented her with our analysis of her school’s state assessment results – a data-packed report that breaks out aggregated and individual student results by standard, question type (i.e. multiple choice and extended response), and individual question. Her 6th graders scored highest on Ratios and Proportions and the Number System – two groups of standards that together represented 44% of the 2015 New York state assessment. The same students scored lower on Expressions and Equations, a group of standards that alone represented 35% of the assessment. Her conclusion: adjust the scope and sequence of her 2015-2016 6th grade math curriculum to allocate more time to Expressions and Equations.

Yet when we began to look at results on specific test questions, we found something surprising: students performed poorly on a relatively simple question testing their ability to divide fractions. The most common incorrect answer showed that students were mistakenly applying the rule for multiplying fractions. Until students could reliably divide fractions, an added emphasis on Expressions and Equations simply wouldn’t pay off. The new conclusion: ensure students can reliably execute basic operations involving fractions before moving on to Expressions and Equations.
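For readers who want to try this kind of drill-down themselves, here is a minimal sketch in Python with pandas, assuming a hypothetical item-level file, results.csv, with one row per student per question (the column names and the question number are placeholders of our own, not any vendor’s format):

    import pandas as pd

    # Hypothetical item-level export: one row per student per question.
    df = pd.read_csv("results.csv")  # columns: student_id, question, standard, answer, correct_answer
    df["is_correct"] = df["answer"] == df["correct_answer"]

    # View 1: mastery by standard, the level at which the first conclusion was drawn.
    print(df.groupby("standard")["is_correct"].mean().sort_values())

    # View 2: drill into one weak question and tally the wrong answers;
    # the most common distractor often points to the actual misconception.
    wrong = df[(df["question"] == 7) & (~df["is_correct"])]  # question 7 is a placeholder
    print(wrong["answer"].value_counts().head())

The first view reproduces the standard-level picture that pointed to Expressions and Equations; the second surfaces the most common wrong answer on a single weak question – the step that revealed the fraction-division misconception.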

Like many of the leaders we worked with in our private sector careers, nearly all the school leaders we work with are what we would call data driven and data curious – they seek to use data to inform decisions, and they are always asking what else they can learn from their data. Yet, as in our private sector experience, school leaders sometimes fail to be what we would call data skeptical. In other words, they fail to probe deeply enough into the limitations of the data they have on hand – what caveats need to be applied, and what further digging needs to be done to validate an initial conclusion.

We think school leaders can be data driven, data curious and data skeptical by following a few simple principles.

Recognize the limitations of existing data. Leaders of all stripes are just as likely to read too much into data as they are to pay too little attention to it. In my experience in marketing, for example, it’s common for managers to ask for key statistics like the number of visitors to a website without recognizing their limitations. A manager might conclude that a website aimed at selling technology services to large businesses that attracts 10,000 visitors per month is more valuable than one that attracts 5,000 visitors. But without knowing who those visitors are, such a conclusion is premature. If nearly all the visitors to the first site are students, non-technology professionals, or even the company’s own employees – none of whom are in the market for business technology – then no volume of visitors will help the business. If more of the 5,000 visitors to the second site are in the market to buy technology for businesses, then the second site is probably more valuable. Looking at the number of visitors alone would lead to the wrong conclusion.
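A toy calculation makes the point (every number here is invented for illustration):

    # Toy illustration: raw traffic vs. visitors actually in the market.
    site_a = {"visitors": 10_000, "in_market_share": 0.02}  # mostly students and employees
    site_b = {"visitors": 5_000, "in_market_share": 0.30}   # mostly business technology buyers

    for name, site in (("Site A", site_a), ("Site B", site_b)):
        qualified = int(site["visitors"] * site["in_market_share"])
        print(name, qualified, "qualified visitors")

The “smaller” site wins by a wide margin once you count only the visitors who matter.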

The same principle applies to school leaders. While the state assessment data is a rich and powerful source that can guide school leaders’ decision making, it must be treated with caution. A school leader seeking to group students for differentiated instruction, for example, might rely on their state assessment results to do so. It’s tempting – the data is all there, and it’s in a usable format. But there are a few catches. The data reflects where students were in April of the previous school year, not where they are now. Questions that are tagged with one standard may actually be testing a range of skills and standards, muddying the view of how well students understand a given standard. Teachers and school leaders need to analyze results on specific questions on the test to draw better conclusions about what to focus on. Or better yet, draw on results from more recent assessments.

Quantitative data alone are rarely sufficient to offer clear answers to the thorniest questions. While data can inform and guide, the big decisions are ultimately judgments based on experience and intuition. School leaders need to maintain perspective on what data can and can’t tell them.

Ask yourself what actions you would take differently if you had the right data. It’s easy to get bogged down in data analysis that is interesting but not actionable. There are thousands of questions you can answer with the data you have on hand – but what good is finding the answer if there’s no action you would take differently as a result of knowing it? With an in-depth analysis of your school’s attendance data, for example, you could answer any number of questions. What days of the week had the lowest attendance? How frequently were students with IEPs late compared to those without? But what actions would you take if you determined, for example, that students with IEPs are late more often than their peers? Would you call the parents of every student with an IEP? Probably not – unless there’s a reason to approach parents of students with IEPs differently, a school leader tackling lateness would probably want to call the parents of all chronically late students, not just those with IEPs.
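To give a sense of how little analysis this requires once the data is assembled, here is a minimal sketch in Python with pandas, assuming a hypothetical attendance log, attendance.csv, with one row per student per school day (the column names and the 10-day threshold are our own placeholders):

    import pandas as pd

    # Hypothetical daily attendance log: one row per student per school day.
    att = pd.read_csv("attendance.csv")  # columns: student_id, date, status, has_iep

    # Lateness rate for students with and without IEPs.
    print((att["status"] == "late").groupby(att["has_iep"]).mean())

    # The list a leader would actually act on: chronically late students,
    # regardless of IEP status (the threshold of 10 days is arbitrary).
    late_days = att[att["status"] == "late"].groupby("student_id").size()
    print(late_days[late_days >= 10].sort_values(ascending=False))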

One school leader we worked with identified the 15 dates with the highest absenteeism in the last school year. He spotted some clear patterns – the dates all fell just before or on holidays (the day before Thanksgiving and Halloween, for example) or at the very end of the school year.

Because the school already had a Positive Behavioral Interventions and Supports (PBIS) program that rewarded good attendance, the school leader was able to create an additional incentive for students to attend school on those days. The analysis gave him not just interesting information, but information he could act on to advance a key goal – improving attendance.
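For reference, that top-15 analysis is only a couple of lines against the same hypothetical attendance log sketched above:

    import pandas as pd

    att = pd.read_csv("attendance.csv")  # columns: student_id, date, status, has_iep

    # Absences per date; the 15 worst days make the pre-holiday and
    # end-of-year pattern visible at a glance.
    print(att[att["status"] == "absent"].groupby("date").size().nlargest(15))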

We think school leaders can weed out a lot of wasted effort by keeping their data inquiry focused on informing actions they can take. Often, the simplest analysis (like the attendance analysis described above) can have the highest impact if it’s focused on a school leader’s major levers.

Invest in a common system for administering, grading and analyzing unit tests. A critical set of decisions teachers and school leaders need to make in the course of a school year is when to review and re-teach specific lessons, standards and skills, and when to move on. To answer these questions school leaders and teachers need to be able to easily analyze the results of interim assessments such as unit tests.

In one school we work with, the entire Math team uses a common platform and calendar for administering, grading and analyzing unit tests. Every six weeks, the Math team comes together to review their students’ results on the most recent unit tests. Because the tests have been administered and graded using a common digital platform, analyzing the results is straightforward. Teachers are able to see which questions students got wrong, and which incorrect answers were the most common. Together, teachers pinpoint what students failed to understand and identify key lessons to re-teach. For example, in one analysis of an 8th grade unit test on power rules, teachers found that students made a common mistake when multiplying two numbers with the same base and different exponents. Rather than add the exponents, they multiplied them (i.e. students answered that 2^2 x 2^3 = 2^6, rather than the correct answer, 2^2 x 2^3 = 2^5). The teachers agreed to re-teach that lesson and administer a quiz to test the results before moving on to the next unit.

We think school leaders have an opportunity to use data to focus instruction more effectively by investing energy in adopting and enforcing the use of a common system for creating, administering, and grading unit tests. The key is that the platform lets schools easily export item-level results in an electronic format that lends itself to analysis. Specifically, schools need to be able to see each question, how each student answered it, the standard the question tests, and the correct answer.
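Concretely, a usable item-level export might look something like this (a made-up layout of our own – actual column names will vary by platform):

    student_id,question,standard,answer,correct_answer
    1001,1,8.EE.1,2^6,2^5
    1001,2,8.NS.2,4,4
    1002,1,8.EE.1,2^5,2^5

With one row per student per question, every analysis described above – per-standard mastery, per-question error rates, most common wrong answers – falls out of a few lines of grouping and counting.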

There are a number of platforms schools can use to accomplish this: SchoolNet and Datacation are two commonly used in New York City public schools. But just as important as finding a platform is making consistent use of it. In our experience, creating a common process for administering unit tests that works for teachers, and enforcing its consistent use (more on that in a future post), require a significant investment of time and energy.

We hope this has been helpful to you, and would love to hear from you! Please leave a comment and tell us what you think. What data practices do you think are important for school leaders?

In our next post we’ll share what we’ve learned from school leaders and our own experience in the corporate sector about setting and communicating vision and goals to school stakeholders.
