When a school decides to implement Standards Based Grading, it also has to decide on a grading and reporting algorithm.  Will you choose decaying averages?  Will you take only the most recent grade?  There is no right answer, and schools are all over the map on this one.

My school decided to create a unique algorithm: tests assess standards on a scale of 0-4; the gradebook takes only the most recent three assessments for each standard and averages them, rounding to the nearest whole number; then each reporting strand is the average of its nested standards, rounded again to the nearest whole number.
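To make the two stages concrete, here is a minimal sketch of the algorithm in R. The data and function names are hypothetical, and I'm assuming "nearest whole" means conventional half-up rounding, since base R's round() rounds halves to even.

```r
# Hypothetical data: one vector of assessment scores (0-4) per standard,
# grouped by reporting strand.
scores <- list(
  strand_A = list(std_1 = c(2, 3, 3, 4), std_2 = c(3, 2, 3)),
  strand_B = list(std_3 = c(4, 3, 2))
)

# "Nearest whole" is assumed to mean conventional half-up rounding;
# base R's round() rounds halves to even, which may not match the gradebook.
round_half_up <- function(x) floor(x + 0.5)

# Stage 1: each standard is the rounded mean of its most recent 3 assessments.
standard_grade <- function(assessments) round_half_up(mean(tail(assessments, 3)))

# Stage 2: each reporting strand is the rounded mean of its nested standards.
strand_grade <- function(standards) round_half_up(mean(sapply(standards, standard_grade)))

sapply(scores, strand_grade)
```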

A constant fear with this choice is that the two stages of rounding have led to grade inflation.  To explore this, I used the gradebook of all 6th grade math students and created variables comparing the assessment averages to the reporting strand averages.
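The actual gradebook export isn't shown here, but the comparison variables could be built roughly like this. The data frame and its values are hypothetical; the column names follow the summary below, and I'm assuming each difference is taken as the later stage minus the earlier one, so positive values indicate a bump.

```r
# Hypothetical per-student summaries (the real gradebook export isn't shown):
#   AMean   - mean of all raw assessment scores (no rounding)
#   StdMean - mean of the rounded standard grades (after stage 1)
#   StrMean - mean of the rounded reporting strand grades (after stage 2)
grades <- data.frame(
  AMean   = c(2.41, 3.17, 1.88, 3.62),
  StdMean = c(2.50, 3.25, 1.75, 3.75),
  StrMean = c(3.00, 3.00, 2.00, 4.00)
)

# Difference variables referenced below; positive values indicate a bump.
grades$A.Str.Diff   <- grades$StrMean - grades$AMean    # combined effect of both roundings
grades$Std.Str.Diff <- grades$StrMean - grades$StdMean  # effect of the second rounding only

summary(grades)
```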

Here’s what I found:

[Table: summary statistics of the grade variables]

AMean = Assessed Mean; StdMean = Standard Mean; StrMean = Reporting Strand Mean

Looking at the variable $A.Str.Diff, it’s nice to see that the mean is low.  This variable is the difference between the Assessed Mean and the Reporting Strand Mean.  However, it’s important to note that at least one student is getting a two-thirds-of-a-point bump and at least one is losing half a GPA point.  In our high school, which has aggregate course grades, that would be an entire letter grade.

We can also see that the majority of this rounding error happens in the second stage of our algorithm, which is illustrated by the $Std.Str.Diff variable.  Note, though, that the lowest quartile (the students who are actually hurt by the rounding error) is hurt less during this phase.  This implies that the second rounding treats everyone more favorably than the first step of the algorithm.

[Table: grade summary correlation values]

Looking at the correlation values, it doesn’t seem that students with worse test scores benefit from greater rounding.  Instead, students who do better at the standard level also tend to benefit from more rounding.
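Continuing with the hypothetical frame sketched above, the correlation table can be produced with a single call (a sketch, not the original code):

```r
# Correlations among the grade summaries and the rounding-error variables.
cor(grades[, c("AMean", "StdMean", "StrMean", "A.Str.Diff", "Std.Str.Diff")])
```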

R code to produce the plot below:
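The original code was posted as an image; a plausible reconstruction using ggplot2’s qplot, with the variable names assumed above, would be:

```r
library(ggplot2)

# Reporting strand mean against total rounding error, per student.
qplot(StrMean, A.Str.Diff, data = grades,
      xlab = "Reporting Strand Mean",
      ylab = "Rounding Error (StrMean - AMean)")
```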

[Plot: reporting strand mean vs. rounding error]

Here’s just a quick qplot showing reporting strand mean against rounding error.  What we see is that if your grade is higher, it’s more likely you benefited from positive rounding.  However, if your grade is lower, it’s more likely you were hit with negative rounding.

So does inflation exist?

Likely, but also with some deflation for the students who perform worse.  I’d like to aggregate the data into quintiles or other categories to see whether certain demographics are more affected, as sketched below.  It’s important to note that if there were no inflation, or if the inflation were neutral, we would expect the means and medians of the difference variables to be zero; that’s not the case.
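As a sketch of that quintile breakdown (still using the hypothetical frame from above, and in practice applied to the full gradebook):

```r
# Bin students into quintiles of assessed mean and compare the average
# rounding error within each bin.
grades$quintile <- cut(grades$AMean,
                       breaks = quantile(grades$AMean, probs = seq(0, 1, 0.2)),
                       include.lowest = TRUE)
aggregate(A.Str.Diff ~ quintile, data = grades, FUN = mean)
```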

Even though 0.1 seems insignificant, when a 2.4 student gets a 0.1 bump, they hit 2.5, which rounds up, and they become a 3.0 student at our school.  And the reverse is true as well.  I believe our goal with Standards Based Grading is to give a clearer picture of our students, and our algorithm may need a bit of tweaking.


Disclaimer: The data and graphics used on this site are simulated re-creations intended to protect the privacy of the original data sources.
