“Learning analytics” describes an emerging professional practice that systematically applies statistics and research methods to large-n data sets. Analysts look for patterns in the results that can inform more accountable decision-making. It is not aimed at doing “pure” research, at least not the kind of research that most of us in the academy think of. Instead, it focuses on mining existing sets of student and faculty data, applying statistical methods we already use to records that have been “FERPA-ized.” The results can show education stakeholders – student services staff and academic advisors, faculty, and students themselves – what is working in their learning and teaching practices and what is not. Learning analytics applies descriptive, inferential, and predictive statistics to the data sets collected from, well, just about everything associated with online learning. eLearning. Distance learning. Whatever you call this thing that we all do.
Just think about it – vast numbers of today’s online learning transactions are regularly captured in Content Management Systems, Learning Management Systems, Student Information Systems, and Enterprise Resource Planning Systems. We have a ton of data describing A LOT about what happens when students – and faculty – are online.
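To make the idea concrete, here is a minimal sketch of the kind of pattern-finding described above, using invented data. The record schema, field names, and at-risk threshold are all assumptions made for illustration – real LMS exports and real early-warning models are far richer than this.

```python
# Hypothetical sketch only: no real system, schema, or threshold is
# implied by the post. Toy LMS activity records for a small cohort.
from statistics import mean, stdev

lms_records = [
    {"student": "s01", "logins": 14, "submissions": 5},
    {"student": "s02", "logins": 3,  "submissions": 1},
    {"student": "s03", "logins": 9,  "submissions": 4},
    {"student": "s04", "logins": 0,  "submissions": 0},
]

# Descriptive statistics: summarize engagement across the cohort.
logins = [r["logins"] for r in lms_records]
avg_logins = mean(logins)

# A crude "predictive" step: flag students whose activity falls more
# than one standard deviation below the cohort average -- a stand-in
# for the early-warning patterns a real analytics system would model.
threshold = avg_logins - stdev(logins)
at_risk = [r["student"] for r in lms_records if r["logins"] < threshold]

print(avg_logins)  # cohort average logins
print(at_risk)     # students flagged for follow-up
```

The point is not the arithmetic – it is that these records already sit in our systems, and even simple summaries can surface actionable patterns.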
Given what we experience in the consumer part of our lives – the rich array of web analytics available in Google, the personalized recommendation engines in Netflix and Amazon, the ability to find a very special person using eHarmony and Match.com, the music playlists created with Apple Genius, the power of hashtags in Twitter and the “thumbs up” in Pandora and Facebook – why WOULDN’T we want to mine the troves of data already maintained in our systems and institutions to personalize our learning experiences in the same way?
Well, there are a few reasons why we might not want to do this. Some criticisms come from methodological circles, with dismissive sniffs at the whole notion of data mining as too ad hoc. Other criticisms come from privacy experts concerned about the Orwellian overtones of this kind of thinking – people seeking to stretch FERPA beyond its actual intent come to mind.
But the benefits and opportunities hold enough promise that we think it’s time to give what Gartner Research (2010) calls “pattern strategy” a shot at helping us improve the services and experiences we offer our education stakeholders in the era of Web 2.0. And THAT is where learning analytics steps in.
I continue to believe our initial success with learning analytics will come from mining the data sets that already exist in our institutional technology platforms and putting the data we have in hand to better use. Once we can take the proverbial “training wheels” off, we will want to consider collecting new kinds of data that may actually be more relevant than what we have been collecting the old-fashioned way. One such example is “gamification” (I’ll have a lot more to say about this growing phenomenon in a future Frontiers post). For now, just let me remind you that there are millions and millions of people who willingly leave behind millions and millions of transactional data points while engaging in online games. Lest you think this is all about Farmville or World of Warcraft, consider America’s Army, a massive online multi-user game originally developed as a recruiting tool that is also being used to test aptitudes and to simulate situations where performance can be observed. Very effectively, I might add. These are not bad things. In the case of America’s Army, these simulated exercises may actually help save lives by offering opportunities to practice before finding oneself in harm’s way.
Of course, not everyone is excited at the prospect of post-secondary education descending to the depths of computer games. We still have a lot to figure out about when and where pattern strategy (in general) and learning analytics (in particular) may be relevant in the world of post-secondary education. Lots of cultural issues are going to rear their heads as we move toward greater expectations for transparency. We will need to establish inter-institutional collaboration frameworks and guidelines under which people can agree to share their data sets. And once we have the numbers in hand about what is working and what is not, we will need to be okay with doing something to fix – or extend – what we uncover.
This is the hardest part, the “admitting we may have a problem” part. Once you take a look at the evidence, you need to be prepared to respond. Even more than that, you will need to get a lot more comfortable with the fact that there is no such thing as “kinda transparent.” As with any disruptive innovation, the benefits of learning analytics will be balanced against some significant cultural resistance to adoption that will require direct and focused attention.
I just have to say…We’re looking forward to applying a lot more of our attention to figuring out how to make this work for education stakeholders. Ought to be an interesting ride. Hope we’ll be meeting up with you over at the WCET corral.