Age Relativity

Have you really gotten that much “older” in the last ten years?  I sure feel that way.  At 27 I’m utterly embarrassed by what I remember of my 17-year-old self.  What’s more, right now I have an intrinsic sense that I’ve reached my full mental maturity.  Details may change, but the person I am is set in stone; that’s my feeling.  But I know that I’ve had that feeling before.  I’ve probably had that feeling my whole life, and at the same time I know that I’ve had many life-changing experiences, both sudden and gradual, along the way.  In an interview I heard recently, someone said that at 32 they were young and reckless, implying that at their current age (I don’t remember who it was, but I want to say they were in their early forties) they had finally grown up.  This comment sparked a flurry of thought in me.  First I was ecstatic, then frustrated, then skeptical.  Ecstatic because it meant that maybe I wasn’t done “growing up,” that I could still re-work my vices and virtues to become a wholly better person (with at least 5 more years to be as young and reckless as I want).  Frustrated because I didn’t want to wait until my forties to finally be a capital-A Adult.  Skeptical because a bigger question arises: why do we treat our past and future selves as something “else,” obscured by a fog that surrounds our present?

I remember thinking about the “older/younger” relationship quite a bit back in my K-12 years.  It was confusing that to a 7th grader the 8th graders seemed so much older, but when I became an 8th grader I didn’t feel any older.  And college freshmen, who seemed like they might as well have been my parents’ age when I was in high school, seem like kids to me now.  So this “age relativity” that permeates our thinking is apparently full of contradictions, and it might be a mentality worth changing.  This 7-minute talk by Dan Gilbert, recently featured on the TED Radio Hour podcast, shed a lot of light on the issue for me:

In my immense self-improvement to-do list, “take a broader view of time” has been hovering near the top for a while, and this talk gives some concreteness to the task.  I think it’s incredibly valuable to try to push back that fog surrounding the present, so that I can relive memories more vividly and more fully grasp where my present is taking me.  Some ideas on how to do that: keep a journal, take the time to remember the details of my favorite meal of the last week, make detailed plans for how I’m going to meet my next research deadlines.  That last one depends on me finishing this and getting back to work, so at this point I’ll ask all zero of my readers to chime in.

What is the value of a grade?

For the past few years I’ve been both getting grades as a graduate student and giving grades as an instructor.  As I toil away for hours producing my homework and grading my students’ work, I often wonder what exactly all of these grades accomplish.  Clearly they must be important, or so many people wouldn’t be putting so much effort into sorting the A minuses from the B pluses.  Yet with the well-documented if not well-understood phenomenon of grade inflation (fig. 1), the “information” contained in a grade seems to be waning.
 
Figure 1 – “A” has been the most common grade since 1996.  Note the first inflation period from about ’66-’72, so this is not a new phenomenon.  Also note that “B” and “F” percentages have stayed effectively constant.  Source: http://www.gradeinflation.com, from their 2011 report analyzing data from ~200 colleges.
 
I want to make a few points about the concept of “grade information” – what information there is in a grade and who uses that information.  My plan is to have the discussion continue in the comments, which is weird because nobody actually reads these yet, but I’m thinking that if I build it, they will come.
 
So what do I mean by “grade information”?  In the context of the trend in fig. 1, up until the mid-’60s an A average meant that a student was generally in the top 20% of their classes, but now it means that the student is in the top 40%.  Information, then, has been lost in the sense that there’s no longer a way to sort out the top 20% from the top 40% using only GPA.  On the other hand, a B student has gone from being “not top 20% but better than 50% of their class” to being “not top 40% but better than 30% of their class.”  I would argue that the “amount” of information in a B grade has stayed constant, though the content has changed drastically.  One last comment: F has meant “bottom 5%” consistently over time, and D got a slight reworking from “between 15% and 5%” to “between 10% and 5%.”
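To put a rough number on that idea, here’s a minimal sketch in Python.  This is my own framing in terms of Shannon self-information, not anything from the gradeinflation.com report, and the grade shares are just the illustrative percentiles from the paragraph above:

```python
import math

def self_information(p):
    """Shannon self-information in bits: the rarer the grade, the more it tells you."""
    return -math.log2(p)

# Rough, illustrative grade shares implied by the percentiles discussed above.
# F is the bottom 5% in both eras.
eras = {
    "mid-1960s": {"A": 0.20, "B": 0.30, "C": 0.35, "D": 0.10, "F": 0.05},
    "today":     {"A": 0.40, "B": 0.30, "C": 0.20, "D": 0.05, "F": 0.05},
}

for era, shares in eras.items():
    print(era)
    for grade, p in shares.items():
        print(f"  {grade}: share {p:.0%}, information {self_information(p):.2f} bits")
```

Under those assumed shares, an A drops from about 2.3 bits to about 1.3 bits, while a B sits at roughly 1.7 bits in both eras – which is the sense in which the A has lost information and the B hasn’t.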
 
At the risk of oversimplifying things, I see a trend toward “binary grading,” where assigning grades is basically a matter of sorting out the very lowest percentages of students while letting the rest of the students “through” with A’s and B’s.  That certainly is in line with my own experience.  As a student I shoot for A’s, but I generally don’t fret if my grade is at least a B.  As an instructor I virtually always curve the final grades, and I pay particular attention to how I assign the B/C cutoff, taking into account my interpretation of whether the students above the B line have shown overall comprehension of the material in their homework and exams.
 
Furthermore, I think that “binary grading” was likely the case even in the “wear a suit to college” era of the ’40s-’60s.  The trend, then, is from a C grade being the cutoff to a B grade being the cutoff.  Many of us inexplicably carry around the idea that C is the average grade, even though that hasn’t been true for decades.  I think the trend toward a B cutoff has been driven, at least in part, by the near-ubiquitous requirement of a minimum 3.0 GPA to remain in good academic standing (both in undergrad and grad programs) and to be eligible for most grant and graduate-program applications.  If you assign a B grade to a student, you’re saying they should move forward.  I suppose that’s what a C used to mean.  If students these days do noticeably better than “just enough to move forward,” then give them an A, and why not!
 
I went into this article thinking that I would lament the grade inflation trend, but now I applaud it.  Mostly because I have plenty of firsthand experience with students who genuinely understood the course material but disproportionately under-performed on coursework, particularly on exams.  It could be that the C+ student comprehended the material at least as well as the B+ student but, for whatever reason, couldn’t convert comprehension into a good grade.  We can all rattle off a dozen reasons why that could happen: maybe the student was sick and missed a week of class, or maybe the student’s brain doesn’t shine during exams.  We can’t assume that there is a direct correlation between grades and comprehension, though I would like to see the percentage of B grades go back up and the A grade return to meaning “top 20%,” regaining that potentially valuable grade information.
 
So who is it that values grade information, and who else should care?  I see four players with a stake in the grading process:
  • The student
  • The instructor
  • The school
  • The transcript checker: a catch-all for prospective employers, graduate schools, etc.

Each player has their own incentives, but they generally want the same thing: for students to get high grades, and/or to recruit the students who get them.  I might have more to say on this in later posts, but for now I’ll pass it to the crickets in the comment section.

Meta Post

So the Fall 2014 semester is starting up.  When you’re a career student like me (I’m a 21st grader), the start of school is the real new year, and this time around I have some bona fide resolutions.  In addition to the usual goals of reading more papers and wasting less time on the internet, I plan to start posting every Tuesday.  I have no plans beyond that, so I’ll just write what I feel and after a few months I’ll see where I’ve ended up.