While in Wagga Wagga, I had the opportunity to meet many of the team working on learning initiatives at Charles Sturt University (which must be one of the only universities in the world to have its own vineyard). Cassandra Colvin was my host, and we had some fascinating discussions, but I was also able to meet with many of the learning technology staff and the senior leadership team.
I also gave a presentation on learning analytics, which was more specific than my public talk the previous day and focused on the evidence for learning analytics and how to avoid failure.
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little of it was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures seen in other fields. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and suggest ways in which various stakeholders can achieve this.