Coaching with Data: Band Class

I always love a good challenge, and the one that was posed to me was, “How can you use data to coach me in music class?” – Challenge accepted!  I had a great conversation with a music teacher to see what they might be curious about and it seemed that the amount of rehearsal time was intriguing.

We chose to follow an Impact Cycle approach at the start because there were so many unknowns: What is optimal?  Are there times of year when rehearsal should be longer or shorter?  What are those times?  How about middle school vs. high school vs. beginners?  We decided we would spend most of our time in the “Learn” phase for this project.

Together we did some research – which was oddly hard to do!  Was there an ideal already out there?  What does it look like in Middle School vs High School?  It actually wasn’t until halfway through our data collection that we found this model from the Music Educators Journal:

[Figure: “Rehearsal Time” model from the Music Educators Journal]

It wasn’t clear if this was for a High School or Middle School course, or at what level – but it gave a target to be around half or more of class dedicated to rehearsal time.  Now to collect data…



We created a simple Microsoft Form that asked “Which Class…” and the amount of play time.  We also had to determine what would count as play time (did it need to be full ensemble? Could it be tapping out rhythms?).  And finally, we made some predictions.

Then I wrote some simple formulas to track each class’s play time across the observations.
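A sketch of that same tracking in Python with pandas — the class names and minutes here are hypothetical stand-ins for the real form responses:

```python
import pandas as pd

# Hypothetical export of the Microsoft Form responses:
# one row per observed class period.
observations = pd.DataFrame({
    "class_name": ["Band 6", "Band 6", "Band 7", "Band 7", "Band 8", "Band 8"],
    "class_minutes": [45, 45, 45, 45, 45, 45],
    "play_minutes": [18, 24, 20, 25, 30, 33],
})

# Percent of each observed period spent playing.
observations["play_pct"] = (
    observations["play_minutes"] / observations["class_minutes"] * 100
)

# Average play time per class across observations, to compare
# against the roughly-50% target from the journal model.
summary = observations.groupby("class_name")["play_pct"].mean().round(1)
print(summary)
```

The same per-class averages are what a spreadsheet formula would produce; the groupby just does it in one pass.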

Working this way, we discovered that certain classes play less and that, over time, all classes play more – and it inspired the teacher to look for more opportunities to hit that 50 percent mark.

What’s great now is that, with baseline data in hand, we could move into a more data-driven approach.  The teacher could target an instructional strategy, we could measure again, and then we could reflect on the results.

I’m looking forward to it.


Disclaimer: The data and graphics used on this site, unless otherwise stated, are simulated re-creations intended to protect the privacy of the original data sources.

Reflecting On a Year of Teaching Through Data and Student Surveys

At the end of the year, it’s easy to think of all that still needs to be accomplished, and all that wasn’t accomplished.  It’s the nature of teachers to focus on the work.  But it’s this time of year that I like to compare my student survey results.

Our school has a panel of student leaders that advocated for giving teachers feedback twice a year – so that we can learn and grow from our students’ input.  As I wrote about previously, I like to share my results with the kids, but I’m careful how I display the data.

When I’m working with ordinal categorical data (data with ordered responses like disagree–neutral–agree), my preferred visualization is a shifted & stacked horizontal bar chart.  Here is mine for my second-semester student responses:

[Chart: second-semester student survey results]
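For the curious, the “shift” in this kind of chart comes from offsetting each bar so the neutral category straddles zero.  A minimal sketch of that offset math, with made-up response counts:

```python
# Compute the left edge of each segment in a shifted & stacked bar:
# disagrees extend left of zero, agrees extend right, and the
# neutral category straddles the zero line.
def diverging_offsets(counts):
    """counts: [strongly disagree, disagree, neutral, agree, strongly agree]"""
    sd, d, n, a, sa = counts
    left = -(sd + d + n / 2)  # start so half the neutrals sit left of zero
    offsets = []
    for c in counts:
        offsets.append(left)
        left += c
    return offsets

# Made-up response counts for one survey question.
offsets = diverging_offsets([1, 3, 6, 10, 4])
print(offsets)  # first segment starts at -(1 + 3 + 3) = -7.0
```

Feeding these offsets to a horizontal bar plot (one bar per question) produces the shifted look, and makes “bars moving right” a direct visual read of improvement.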

Comparing that to my first semester…


And a few takeaways are clear:

  • In general, my bars shifted to the right
  • My disagrees and strongly disagrees disappeared
  • My neutrals increased on the question about appropriate challenge levels
  • My strongly agrees decreased in a few areas

Again, I won’t share this bar graph with the students: I don’t want them to compare me to other teachers, and I don’t want the students who “disagree” to think their small numbers don’t matter.  But I will share my progress and tell them some goals I have for next year.

It’s always important to me that I model how to receive feedback and demonstrate a growth mindset to my students – and this is a great time of year to use this activity to close out with students.



The Data of Poetry

As a data guy, some of my favorite work is when I’m partnering with master educators to gain insights into their practice.  One such master educator that I had the pleasure to collaborate with is Scot Slaby (twitter: @scot_slaby / blog: Noticing Poetry).  Scot is a published poet, high school English teacher, and advocate for raising the bar of poetry pedagogy.

Scot has developed and promoted a pedagogical technique called “Noticing Poetry.”  His technique is in good company, sharing similar strategies to Project Zero’s visible thinking routines, and fortunately for me, Scot had the foresight to collect data from his students… I was salivating to get my hands on his spreadsheets!

What we found was this: a student’s perception of their ability to engage with and appreciate poetry grew with the “Noticing Poetry” technique to a statistically significant degree.

[Figure 5: exit survey results]

While this might not say that students are better poets or better at analyzing poetry, these questions give us insight into student self-efficacy around reading and analyzing poetry, and motivation to do more.  Self-efficacy and motivation are two “power-tools” for increasing learning.  Upon reviewing the results, Scot and I co-authored a paper for the Journal of Inquiry and Action in Education, which is available as a free download.
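The actual analysis is detailed in the paper; as a rough, hypothetical sketch of one way to test pre/post growth in paired self-ratings (the data here is made up, and a paired t-test is used purely for illustration):

```python
from scipy.stats import ttest_rel

# Hypothetical paired self-ratings (1-5) from an exit survey:
# each student rated their confidence before and after the unit.
pre  = [2, 3, 2, 3, 2, 2, 3, 2]
post = [3, 4, 4, 4, 3, 4, 4, 3]

# Paired t-test: are the post-unit ratings reliably higher
# than the same students' pre-unit ratings?
result = ttest_rel(post, pre)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.4f}")
```

A p-value below 0.05 is the conventional threshold for calling a change statistically significant.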

I wouldn’t have guessed that the marriage of data and poetry would work so well, but sometimes the best surprises – that bring back the joy of teaching – happen when you don’t know the endgame.




Measuring Mindset Mathematics

In education, the 2010s could be considered the “Mindset Revolution.”  Carol Dweck’s work has permeated our pedagogies, and her book and TED Talk are educational canon for any practitioner.  In my field of Math Education, Dr. Jo Boaler has been revolutionizing Mathematics Education with a “Mindset” approach.

I’ll admit it – I’ve been on board for a while.  Those of us who subscribe to this revolution believe that all children can learn Mathematics to the highest levels, that depth is more important than speed, that learning the ‘why’ and ‘how’ of mathematics is just as important (or more important) than learning the skills, and that maybe we need to rethink the importance of age-old teaching traditions.

So has it been effective?  It’s hard to tell – so I sought a way to measure my own effectiveness through my journey into Mindset Mathematics.  I was a traditional teacher – with an emphasis on rules and practice – until about 2010 or 2011, when I was introduced to Carol Dweck’s work and worked for a school that adopted the Common Core State Standards (and thus also the philosophies behind them).

With standardized testing as the only data to go on, I saw marked improvements in the two years following my pedagogical shifts:


Could this be a fluke?  One teacher at one school with a specific group of kids?  The only way to tell is by seeing if different teachers, at a different school, with different kids, saw the same change.

Fortunately – I became a data coach and Department Head at a different school in 2016 as their teachers made the transition to a Math Mindset pedagogy.  And guess what?  They saw the same change school-wide:


At this point you have to take notice. Variables such as students, teachers, schools, and year all changed – but when the same pedagogical practice was applied, the same increase in growth showed up (to make this more interesting, the country also changed, meaning even cultural influences didn’t change the pattern).

I’m excited to keep tracking this pattern.  Anecdotally, I’ve noticed a difference for years, but it’s nice to know the data backs up my experiences over the past decade.  I encourage all educators to head over to Dr. Boaler’s site and dive into the wealth of resources she provides for teachers and students.

Viva La Revolution!



Tracking Student Wellness

We are constantly measuring student learning data – but what about their well-being?  We should be just as agile (or, in my opinion, more so) at meeting the social-emotional needs of students as we are at meeting their academic needs.

Our team decided to use micro-surveys: a set of eight questions, in four areas, delivered once or twice a month, to measure students’ perception of their own well-being.  The four areas we looked at are:

  • School Enjoyment
  • Confidence
  • Climate
  • and Support

The fun part was designing the report.  First, I grouped the questions by domain and color coded them.  That way, similar questions would have similar colors.
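A minimal sketch of that grouping – the question wordings and color values below are hypothetical, but the structure (eight questions, four domains, one color per domain) matches the design:

```python
# Hypothetical survey questions mapped to their wellness domain;
# every question in a domain gets that domain's color.
domain_of = {
    "I enjoy coming to school": "School Enjoyment",
    "I look forward to my classes": "School Enjoyment",
    "I believe I can succeed": "Confidence",
    "I can handle difficult work": "Confidence",
    "Students treat each other well": "Climate",
    "I feel safe at school": "Climate",
    "An adult at school supports me": "Support",
    "I know who to ask for help": "Support",
}

# One color per domain (hex values are arbitrary choices).
domain_color = {
    "School Enjoyment": "#4C72B0",
    "Confidence": "#55A868",
    "Climate": "#C44E52",
    "Support": "#8172B2",
}

def question_color(question):
    """Look up a question's chart color via its domain."""
    return domain_color[domain_of[question]]
```

With this lookup in place, every chart in the report colors a question by domain automatically, so similar questions always look similar.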


Then, I created time series graphs for each homeroom teacher that showed the grade-level as a whole, their particular group of homeroom students, and how girls and boys answered.

[Chart: wellness aggregates for the grade level, each homeroom, and by gender]
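Assuming the raw responses live in a table of date, student, homeroom, gender, and score (all hypothetical values below), the aggregates behind those graphs might be computed like this:

```python
import pandas as pd

# Hypothetical micro-survey responses: one row per student per survey date.
responses = pd.DataFrame({
    "date": pd.to_datetime(["2019-09-15"] * 4 + ["2019-10-01"] * 4),
    "student": ["Ana", "Ben", "Cy", "Dee"] * 2,
    "homeroom": ["6A", "6A", "6B", "6B"] * 2,
    "gender": ["F", "M", "M", "F"] * 2,
    "score": [4, 3, 2, 5, 4, 4, 3, 5],
})

# Grade-level trend: mean score per survey date.
grade_trend = responses.groupby("date")["score"].mean()

# Homeroom trend: mean score per date within each homeroom.
homeroom_trend = responses.groupby(["date", "homeroom"])["score"].mean()

# Gender comparison: mean score per date by gender.
gender_trend = responses.groupby(["date", "gender"])["score"].mean()

print(grade_trend)
```

Each of these series becomes one line on a teacher’s time-series graph, which is what lets the differences and changes jump out.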

This allowed homeroom teachers to engage in conversations of “noticings” and “wonderings” looking for differences and changes in the data over time.

Lastly, the same report included individual responses, so that homeroom teachers could track how the students they see every day were self-reporting:

[Chart: individual student wellness self-reports]

Some questions arose: How do we know when a kid isn’t taking it seriously?  Or answering honestly?

It was important to track our data and look for a few things:

  • If a student reported low, we needed to get 2-3 more data points to see if maybe they were just a lower reporter
  • We also needed to check that students’ self-reports had variety, which would show that they were answering the questions with thought
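Both checks above can be sketched with a couple of summary statistics per student – the names and cutoffs below are hypothetical:

```python
from statistics import mean, pstdev

# Hypothetical response history: student -> scores across survey dates.
history = {
    "Ana": [4, 5, 4, 5],   # healthy, varied responses
    "Ben": [2, 2, 2, 3],   # consistently low: may just be a lower reporter
    "Cy":  [3, 3, 3, 3],   # zero variety: may not be answering thoughtfully
}

def flag_students(history, low_cutoff=2.5, min_spread=0.1):
    """Flag students whose self-reports warrant a closer look."""
    flags = {}
    for student, scores in history.items():
        notes = []
        if mean(scores) <= low_cutoff:
            notes.append("low reporter: gather more data points")
        if pstdev(scores) < min_spread:
            notes.append("no variety: check for thoughtless answering")
        if notes:
            flags[student] = notes
    return flags

print(flag_students(history))
```

A flag here isn’t a conclusion – it’s a prompt to collect 2-3 more data points or to check in with the student.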

This data has been eye-opening.  When we talk about students, we share our perspectives, academic results, and their perspectives.  It’s the first time that student data from their eyes has become a part of our conversation.

