When this all began, we wrote about Surveying Students to gain input into their experiences as distance learners. We simply wanted to know what was working and what was not, so we could be agile and change. But what if we missed something? How can we dig deeper and uncover some of the sentiment or emotional value offered in their responses? I am convinced that we as educators must look back at this time and uncover the emotions in our students’ responses.

Surveys are extremely valuable in gathering Perception Data, helping us understand students’ perceptions of and preferences about their experiences. Traditional ways that educators might analyze these free-response survey results include:

  • Read through each result and mentally take note of what resonates
  • Sort responses into “Big Ideas” or into Action Items
  • Perform a Thematic Analysis, if the educator has experience in Qualitative Data Analysis (“QDA”)

This is a great start. If you can, reading all the responses provides tremendous insight. Unfortunately, I am human, and as such am prone to having “blinders.” I might over-emphasize something that pulls at my heartstrings, is easier to process, or confirms my personal perceptions. Many of you recognize this human error by the term bias. I can perform a potentially deeper dive, with less bias, by using a skill called “text mining.” I’ll go through a couple of examples and share what I’ve found:

Viewing Word Counts

The first analysis we can do is to look at word counts: how frequently each word appears in the responses. I had recently changed my course layout to incorporate more choice-based exercises and wanted student feedback. Here are the results when students were asked “Do you prefer the new format?” and “Why or why not?”
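
If you want to try this yourself, here is a minimal sketch of how such counts can be tallied. I’m assuming plain Python here; “responses” is a hypothetical list of free-response answers, and the stopword list is a small illustrative excerpt, not a complete one:

```python
import re
from collections import Counter

# Hypothetical free-response answers; substitute your own survey export.
responses = [
    "I love having the choice between online and offline work.",
    "The new format gives me choice, but I miss having all the information in one place.",
]

# Small illustrative stopword list; a real analysis would use a fuller one.
stopwords = {"i", "the", "a", "an", "and", "but", "of", "in", "to", "me", "my", "all", "one"}

words = []
for response in responses:
    # Lowercase the response and pull out individual word tokens.
    tokens = re.findall(r"[a-z']+", response.lower())
    words.extend(t for t in tokens if t not in stopwords)

# The ten most frequent words across all responses.
print(Counter(words).most_common(10))
```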

I can see that words like “choice” and “online/offline” (which match the type of work students could choose to do) echo what I was aiming for in the new format in the responses of students who saw the change as favorable. For students who preferred the old format, words like “information” might hint that too much choice introduced confusion or uncertainty.

Analyzing the Emotional Valence of Responses

A second thing we can measure is sentiment, or emotional valence. This is done by taking all those individual words from the responses and matching them against a lexicon that assigns each word a value we can measure. Here are just two examples:

First, I looked at numerical emotional valence using the AFINN lexicon, which rates words on a scale of -5 to +5. Here is a plot of how positively or negatively my students responded to the new choice-based format (that we looked at previously) and specifically to a new type of lesson called “Extension, Reinforcement, Catch-Up” (“ERC”).
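
As a rough sketch of how that scoring works: each word in a response is looked up in the lexicon, and its -5 to +5 ratings are summed. The snippet below assumes the third-party afinn Python package (pip install afinn); the example responses are hypothetical:

```python
from afinn import Afinn  # third-party package: pip install afinn

afinn = Afinn()

# Hypothetical responses about the new format and the ERC lessons.
responses = [
    "I love the new ERC lessons, they help me catch up.",
    "The new format is confusing and I feel lost sometimes.",
]

for response in responses:
    # Afinn.score() sums the -5..+5 ratings of every lexicon word it finds.
    print(f"{afinn.score(response):+.1f}  {response}")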

What I gather from this is that the new ERC lessons evoked more positive reactions, while changing the overall format elicited a more mixed response. We can take it a step further with the NRC lexicon to parse out the specific emotions evoked in student responses:
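
The idea behind the NRC step is similar, except each word maps to one or more emotion categories that we then tally. Here is a sketch using a tiny hand-typed excerpt of NRC-style word-to-emotion mappings; the entries below are illustrative, not the real lexicon, and an actual analysis would load the full NRC Emotion Lexicon:

```python
from collections import Counter

# Tiny illustrative excerpt of NRC-style mappings; the real lexicon tags
# thousands of words with eight emotions plus positive/negative sentiment.
nrc = {
    "trust":   ["trust", "positive"],
    "change":  ["fear", "anticipation"],
    "love":    ["joy", "positive", "trust"],
    "worried": ["fear", "negative", "sadness"],
}

# Hypothetical responses, as in the earlier examples.
responses = [
    "I trust this change even though I am worried.",
    "I love the new lessons.",
]

emotion_counts = Counter()
for response in responses:
    for word in response.lower().split():
        # Strip simple punctuation before the lexicon lookup.
        word = word.strip(".,!?")
        emotion_counts.update(nrc.get(word, []))

# Emotions evoked across all responses, most frequent first.
print(emotion_counts.most_common())
```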

My teacher-heart is warmed that “positive” and “trust” are the prevailing emotions during this time of uncertainty. Looking at “anticipation” and “fear,” it might be that the bigger change in course format evoked more words carrying those emotions, or it may be a reflection of the moment we find ourselves in.

We Need To Look Deeper

Looking at these responses, I have so many questions: Would trust be high in other surveys? Would anticipation and fear change over time? How can we use this data in social-emotional learning (SEL) and wellness activities to make sure we are attending to what students are feeling? What can we learn about how our students react to such sudden changes, and how can we use that to inform future preparation?

The time we are in is profound. I hope to analyze more, to understand the experiences of our community more deeply, and to be as responsive as possible to student emotions in the future.

