Wednesday, March 6, 2013

Correlation Does Equal Procrastination

The panic spike is an observable phenomenon in the analytics of websites hosting academic or other study-related material for high school students, particularly second-semester seniors. The term "panic spike" is a bit of a misnomer; it is actually a cluster of visitor spikes that occur just prior to and on the dates of major exams.

Taken on their own, with no other supporting data, these events can be difficult to interpret. They could simply indicate last-minute study: a brush-up or skimming of the notes before the exam.

By itself, the data may suggest but not prove a lack of preparedness in students. Correlation does not equal procrastination. However, recent traffic spikes like the one pictured above are more pronounced than the spikes observed last semester. That in itself is suggestive, but any diagnosis of senioritis would require an examination of traffic data combined with exam performance data. That additional information, however, is above the classification of this blog and will not be reported here. It is our hope that the information reported above will be combined independently by individuals with access to their own grades, and thereby offer appropriate encouragement.
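As a side note, flagging this kind of spike doesn't require anything fancy. A minimal sketch, using made-up visit counts (the function name, window size, and threshold below are illustrative assumptions, not how any particular analytics platform works): mark a day as a spike when its visit count sits more than a couple of standard deviations above the trailing week's average.

```python
from statistics import mean, stdev

def find_spikes(daily_visits, window=7, z_threshold=2.0):
    """Flag days whose visit count sits well above the trailing average.

    daily_visits: list of (label, visit_count) pairs (hypothetical data).
    Returns labels of days whose count exceeds the trailing window's
    mean by more than z_threshold standard deviations.
    """
    spikes = []
    for i in range(window, len(daily_visits)):
        trailing = [count for _, count in daily_visits[i - window:i]]
        mu, sigma = mean(trailing), stdev(trailing)
        label, count = daily_visits[i]
        if sigma > 0 and (count - mu) / sigma > z_threshold:
            spikes.append(label)
    return spikes

# Hypothetical baseline traffic, then a pre-exam surge.
visits = [("day%02d" % d, 50 + (d % 3)) for d in range(1, 11)]
visits += [("exam-eve", 180), ("exam-day", 210)]
print(find_spikes(visits))  # the two exam-adjacent days stand out
```

Real analytics dashboards make the spikes obvious to the eye anyway; the point is only that "a cluster of visitor spikes" is a well-defined, detectable pattern, not a subjective impression.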


A more old-school performance metric.
I posted the above text on my economics notes blog, the place where I host all notes for lectures, videos, and anything else I do in class. My reasons for using that platform have already been explained on this blog, so I won't get into them now. But this semester has given me second thoughts.

It's not that I am afraid they are using the site to cheat, but the knowledge that nearly all of the material covered in class is available online may have led some of my students into a false sense of security. My initial thought was to pull down the website just prior to exam time, but that would only encourage them to make copies ahead of time and thereby procrastinate by a nearly identical method. I want to keep the site up; I think it does genuinely help. But perhaps stronger use guidelines in the classroom are appropriate.

In any case, it's fun to see this kind of data in the classroom. Imagine if we could get detailed data on textbook use and engagement. Without, of course, giving reading quizzes to make sure students do the assigned reading, which has the side effects of padding their averages and wasting class time.

I wonder if, once the transition to e-textbooks eventually takes place, Google-style analytics on their use will be available to teachers?

Just a thought...