A few weeks back, I posted the summary of the fall IPI-T data along with some context on what it is and how it fits with our district strategic plan. The winter IPI-T data has now been collected and aggregated. Here are a few of the highlights that jump out at me when looking at the district as a whole:
- This data is based upon 1,388 distinct classroom observations, up from 1,173 observations in the fall. A larger sample size is generally more representative.
- We fell short of our goal of 40% of students engaged at Level 6 or Level 5, coming in at 24.9%. However, this is up 1.2 percentage points from the fall data.
- We showed 1.1% of students coded at “Level 1,” or disengaged. This is down from 1.8% in the fall. Reducing (ideally eliminating) Level 1 will have the most impact on overall student achievement.
- In the fall, 406 of the 1,173 observations (roughly 35%) indicated students were using digital tools; in the winter, 435 out of 1,388 (roughly 31%) did.
I hope that each of you has had the chance to look at and process your local building data. Analyzing and adapting practice based on the data is why we have this indicator in our district strategic plan. Like all data, IPI-T is inherently empty of meaning and value until teachers at the building/department/team level “make meaning” from it, i.e., ask questions and make changes in practice based upon the input. If you’ve not had a chance to discuss this in a PLC, I would strongly encourage you to do so at some point before the end of the year.