Recently I found myself in several meetings discussing 'learning analytics'. Basically, we want to identify potential data sources that will help inform our decisions around retention, student success, advising, placement and a plethora of other student-centered topics. The Chronicle just released a piece
on learning analytics, citing examples from Harvard to Rio Salado College, a community college in Arizona.
Regardless of what lens I view learning analytics through, I see incredible opportunity to better guide and support our students. From an institutional research perspective, I think we can use these analytics to enhance things like retention and advising. From a faculty perspective, I can see using analytics to increase engagement in my classroom. As part of a small committee evaluating potential new CMS platforms for Penn State, I'm thrilled to report that every candidate platform offers a wide variety of learning analytics modules.
While I feel this is an extremely positive movement, Gardner Campbell, director of professional development and innovative initiatives at Virginia Tech, has a different take (from the Chronicle article):
"Counting clicks within a learning-management system runs the risk of bringing that kind of deadly standardization into higher education."
The article summarizes Gardner's concerns, pointing out that these CMS environments are not necessarily the best platforms for measuring real student engagement and creativity. I wholeheartedly agree with Gardner! This could become a slippery slope for some universities. But I would argue that counting clicks is an important piece of guiding decision making around retention and student success.
Take Rio Salado, for example. I attended a webinar by their project lead, and he reported that a large share of the variance in student success (earning a "C" or better) can be predicted using just two variables from the CMS:
- Date of first login
- Whether or not the student has clicked on (and, presumably, viewed) the course syllabus.
If these two simple, easy-to-track variables play such a large role in predicting whether a student will succeed or fail in a course, why not track them? This allows the instructor, or student adviser, to intervene very early in the semester, which in turn greatly increases that student's chance of success.
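To make the early-intervention idea concrete, here is a minimal sketch of what such a check might look like. The `flag_at_risk` helper and the seven-day login cutoff are my own illustrative assumptions, not Rio Salado's actual model:

```python
# Hypothetical early-warning check based on the two CMS signals above.
# The login cutoff is an assumed threshold for illustration only.

def flag_at_risk(first_login_day, viewed_syllabus, login_cutoff_day=7):
    """Return True if a student should be flagged for early outreach.

    first_login_day: days after the course opened that the student first
        logged in (None if they never have).
    viewed_syllabus: whether the student has clicked the course syllabus.
    login_cutoff_day: assumed cutoff for a "timely" first login.
    """
    never_logged_in = first_login_day is None
    late_login = (not never_logged_in) and first_login_day > login_cutoff_day
    return never_logged_in or late_login or not viewed_syllabus

# An adviser could run this over the roster each week:
roster = [
    ("Student A", 1, True),     # prompt login, viewed syllabus -> not flagged
    ("Student B", 10, True),    # late first login -> flagged
    ("Student C", None, False), # never logged in -> flagged
]
flagged = [name for name, day, saw in roster if flag_at_risk(day, saw)]
print(flagged)  # -> ['Student B', 'Student C']
```

The point is not the specific thresholds but the workflow: a report like this, generated weekly from the CMS, gives instructors and advisers a concrete list of students to reach out to while intervention can still matter.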
I look forward to the rollout of Penn State's new CMS, and to what data-driven initiatives we can spin up to enhance student success and retention.