The problem with patterns

David Morgan

Correlation and Causation 

When presented with numerical data, our brains, trained by our secondary education, will seek out patterns and then, if required, plot and graph the relationship. By the end of our school years we do this with relative ease and confidently share the conclusions the data appears to show.

However, just because a clear pattern emerges, have we satisfactorily established CAUSATION, or simply identified a CORRELATION? The uncomfortable truth is that many of us never really understood the difference.

In his 2006 documentary, An Inconvenient Truth, Al Gore stood next to a huge graph that tracked the temperature of the earth and the concentration of carbon dioxide. His conclusion was that rising CO2 had caused a rise in temperature and the resulting climate changes. Over the next few years the counter-argument insisted that there was an undeniable correlation between the two variables but no causal link; indeed, some observers argued the relationship was the inverse, with temperature causing the rise in carbon dioxide. While this particular argument has since been resolved by the IPCC, the problem of causation and correlation remains.

Data in public policy is even more complex to analyse, and the problem of rushing to judgement on the link between two data sets is commonplace. This is best exemplified by the predictions regarding inner-city crime in the US during the nineties. Politicians predicting a crime wave linked to poverty, unemployment, and drug use were in abundance. However, explaining the eventual drop in crime rates proved equally elusive. This is famously covered by Levitt and Dubner in Freakonomics, who establish a clear relationship between abortion rates and crime.

The question for educators is whether we condition students to assume that all relationships are causal, without ever really considering mere correlation.

Many years ago I would have students investigate factors that affect how long a candle burns when covered by a glass beaker. Most would vary the size of the beaker and find that the bigger the beaker, the larger the volume of air and the longer the candle would burn. Occasionally students would vary the size of the candle and would struggle to come to a conclusion. Every once in a while a student would suggest the colour of the candle; as a caring teacher I would discourage them from this path. Was this the right thing to do? I now question that decision. I have no idea whether the coloured compounds in the wax have any effect on the rate of combustion. I know that there will be data, and I know that there could even be a pattern in that data. Establishing whether that pattern reflects causation or mere correlation, however, would be a much greater test of a student’s scientific knowledge than the pre-determined and well-known causation that the rest of the class has found.

“Just because two variables have a statistical relationship with each other does not mean that one is responsible for the other.” – Nate Silver

No doubt we are teaching our students to collect and analyse data, but are we being too safe in our choice of experiments and investigations? By using the same old experiences we are conditioning them to expect and assume causation. Shouldn’t we be having them look at data that is less obvious and have them argue over causation and correlation? There won’t always be a clear-cut answer, but isn’t that more like real life anyway?
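The point is easy to demonstrate numerically. The following sketch (the scenario and numbers are invented for illustration) generates two series that each trend upward over time but are otherwise independent; a standard correlation coefficient still comes out very strong, even though neither variable causes the other:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Two made-up series that both happen to trend upward over 20 years,
# but are generated independently: neither causes the other.
years = np.arange(20)
ice_cream_sales = 100 + 5 * years + rng.normal(0, 4, size=20)
swimming_accidents = 10 + 0.8 * years + rng.normal(0, 1, size=20)

# Pearson correlation coefficient between the two series
r = np.corrcoef(ice_cream_sales, swimming_accidents)[0, 1]
print(f"r = {r:.2f}")  # a strong positive correlation, with no causal link
```

A shared trend (here, time itself) is enough to produce a near-perfect correlation, which is exactly the trap students fall into when they assume every pattern is causal.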

David Morgan is an international teacher and administrator currently based in Switzerland  

Gore, Al, Davis Guggenheim, Laurie David, Lawrence Bender, Scott Z. Burns, Jeff Skoll, Lesley Chilcott, Bob Richman, Jay Cassidy, Dan Swietlik, and Michael Brook. An Inconvenient Truth. Hollywood, Calif.: Paramount, 2006.

Levitt, Steven D., and Stephen J. Dubner. Freakonomics. Harper Trophy, 2006.

Know Thy Students, Know Thy Needs: Advanced Survey Techniques Made Really Easy

We know that many students around the world have suffered physically, academically, and mentally as a result of prolonged periods of COVID-related lockdowns. But to what extent is this true? And what factors might be associated with it?

I have used Qualtrics to help answer questions like these at my school. 

Qualtrics provides advanced yet easy-to-use survey-based data services for educators who want to make smarter decisions and glean novel insights into their school community.

Qualtrics is great for many reasons, but let me share just two. The beauty of one feature, called ‘Crosstabs’, is that it allows you to combine the results of two different questions. For example, I recently surveyed my students on a variety of topics related to remote learning. With two particular questions, I wanted to know the extent to which students believed:

  1. the volume of time spent in lessons and working on assignments during remote learning had increased, decreased, or stayed the same; and,
  2. their mental health during remote learning had improved, worsened, or stayed the same.

The results received for each question were pretty straightforward: respondents generally reported increases on both measures. Just as nice is the ability to see how each Grade answered such questions, as shown below:

Clearly, most students in each Grade reported a general increase in the amount of time spent in lessons and on school work during remote learning as compared to on-campus learning.  However, I remained interested in knowing the percentage of respondents above who ALSO reported an associated change in mental health challenges.

Using Qualtrics’ Crosstabs feature, I was able to explore a relationship between volume of work and mental health.  The results were pretty interesting, to say the least.  

This means that, of those who said that the amount of time spent in lessons and on school work since remote learning began ‘has increased a lot’, 61.3 percent also said their mental well-being since remote learning began ‘is much worse’.  It is also possible to categorise these results by respondents’ Grade. 
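For readers curious about what a crosstab like this computes under the hood, here is a minimal sketch in pandas on made-up responses. The column names, category labels, and data are all assumptions for illustration, not Qualtrics’ actual export format:

```python
import pandas as pd

# Hypothetical survey export: one row per respondent.
# Column names and category labels are invented for this sketch.
df = pd.DataFrame({
    "workload_change": ["Increased a lot", "Increased a lot", "Increased a little",
                        "Stayed the same", "Increased a lot", "Decreased"],
    "wellbeing_change": ["Much worse", "Much worse", "Somewhat worse",
                         "Stayed the same", "Stayed the same", "Better"],
})

# Cross-tabulate the two questions. normalize="index" converts each row
# to percentages, answering: "of those whose workload 'Increased a lot',
# what share reported each well-being category?"
table = pd.crosstab(df["workload_change"], df["wellbeing_change"],
                    normalize="index") * 100
print(table.round(1))
```

In this toy data, two of the three “Increased a lot” respondents also said “Much worse”, so that cell of the table reads about 66.7 percent, which is the same kind of figure the Crosstabs feature reports.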

The realisation that about 40 of my students needed immediate attention kickstarted a flurry of meetings and discussions that have included students, parents, faculty, and board members.  Of course, worsening mental health does not mean poor mental health.  It does mean that these students are doing ‘less well’ than they should be.

In the short term, these findings prompted the formation of a Mental Health Task Force consisting of students, parents, and faculty, including counsellors. Our counsellors can now make more informed decisions when prioritising their time. In the medium term, we can compare changes in responses year-on-year. In the longer term, findings can help with decisions related to resource allocation, staffing, and budgeting.

However, how would I have known to take these actions and considerations without using Qualtrics’ Crosstabs feature? I cannot overstate its value. Still, Qualtrics has many more features. Let me give you one more example.

I was also interested in learning how much physical exercise my students were getting during a recent, extended period of remote learning. I first looked at the following simple results:

This indicates that students were not getting a reasonable amount of daily exercise at the time. However, I also wanted to know how these responses related to students’ self-declared sense of happiness. I suspected that students getting the most exercise would also report relatively higher levels of happiness. I used another Qualtrics feature, called ‘Stats iQ’, to test my hypothesis. All I had to do was drag the results of each question into two available analysis fields. Doing so gave me the results below.

Not sure what this means? No problem: Qualtrics makes it easy. Hovering over the shaded circles with the letter ‘i’ in the middle brings up easy-to-understand explanations of both the P-Value and the Effect Size. The P-Value explanation tells me that I can be quite confident the reported Effect Size is not down to chance. The latter tells me that the effect of daily exercise on student happiness is a ‘medium’ one: not large, but noticeable. In other words, student exercise at the height of my school’s lockdown did indeed have a noticeable association with happiness, so it would be fair to say that my hypothesis has some truth to it. Based on this analysis, I ended up letting students, parents, and teachers know in various contexts that students getting daily exercise at the height of our most recent lockdown were adding benefits not only to their health but also to their happiness. Has this enabled parents, and students themselves, to make better decisions? You betcha!
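The same kind of P-Value and Effect Size output can be approximated with standard tools. The sketch below uses scipy on synthetic data; the variable names, scales, and the strength of the built-in relationship are all assumptions, chosen so the result lands in the ‘medium’ range by Cohen’s rough benchmarks (|r| near 0.1 small, 0.3 medium, 0.5 large):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=7)

# Synthetic responses from 200 students: daily minutes of exercise and a
# happiness rating, generated so that more exercise is associated with
# somewhat higher happiness, plus individual noise.
exercise_minutes = rng.uniform(0, 90, size=200)
happiness = 5 + 0.02 * exercise_minutes + rng.normal(0, 1.2, size=200)

# Pearson correlation: r doubles as an effect size, and the p-value
# indicates how confident we can be that the association is not chance.
r, p_value = stats.pearsonr(exercise_minutes, happiness)
print(f"r = {r:.2f}, p = {p_value:.4g}")
```

With a sample this size, a medium-sized r comes with a very small p-value, which mirrors the Stats iQ readout: confident that there is an effect, and a separate judgement about how big it is.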

We can make data analysis as complicated as we want. But it can be quite simple, too. Qualtrics just tells you what you need to know. It has many other features, which I may share in the future, but I hope this article has piqued your interest. Full disclosure: I receive no financial benefit for touting the benefits of using Qualtrics. But maybe I should! Nonetheless, I would love to learn more about how other educators are using Qualtrics to improve decision making and experiences for their students, parents, and teachers. Get in touch with me!

By Timothy Veale (Secondary Principal at International School of Hamburg)
