This event was held at the University of Central Lancashire (UCLan) in Preston, and was organised by the VITAL project (Visualisation tools and analytics to monitor language learning and teaching).
My talk was on ‘Learning analytics: planning for the future’.
What does the future hold for learning analytics? In terms of Europe’s current priorities for education and training, learning analytics will need to support relevant and high-quality knowledge, skills and competences developed throughout lifelong learning. More specifically, they should help improve the quality and efficiency of education and training, enhance creativity and innovation, and focus on learning outcomes in areas such as linguistic abilities, cultural awareness and active citizenship. This is a challenging agenda that requires us to look beyond our immediate priorities and institutional goals. In order to address this agenda, we need to consider how our work fits into the larger picture. Drawing on the outcomes of two recent European studies, Rebecca will discuss how we can develop an action plan that will drive the development of analytics that enhance both learning and teaching.
Together with Liz FitzGerald and Eileen Scanlon, I chaired the 38th annual conference of the Computers and Learning Research Group (CALRG), which took place at The Open University on 16–18 June 2017. We enjoyed keynote presentations from Siân Bayne, Jenny Preece and Ben Shneiderman.
Full details of the conference, together with links to all the abstracts and to many of the presentations, are available on Cloudworks.
The third day of the conference was the FutureLearn Academic Network (FLAN) day. This annual conference event prioritises the work of doctoral students within FLAN. This year, it brought together presenters from Bath, Lancaster, Purdue (USA), Sheffield, Southampton, The Open University, and Warwick.
Our discussant was Professor Rupert Wegerif, University of Cambridge.
Members of FLAN can access the video of the event.
A sunny week in May away in the Peak District with most of the Leverhulme-funded PhD students in open world learning and many of their supervisors. Lots of writing was done, but there was also a lot of community building.
Scattered between my research presentations at LAK17 was my work as a member of the executive for the Society for Learning Analytics Research (SoLAR). The executive met daily during the conference – it is the only chance we have each year for face-to-face meetings. The LAK conferences also provide a venue for the AGM of the society and, despite the size of the room where the AGM was held, it was standing room only for most of the meeting.
The executive also have a role to play in decisions about the conference itself, as well as acting as reviewers on the programme committee and chairs for the different sessions. Next year, at LAK18 in Vancouver, I shall be taking on a bigger role, as one of the programme chairs for the conference.
The picture shows me with half the SoLAR Executive at the post-LAK17 review meeting.
The European FP7-funded Learning Analytics Community Exchange (LACE) project came to an end last June. Since then, we have become a special interest group (SIG) of the Society for Learning Analytics Research (SoLAR) and we are now the Learning Analytics Community Europe (LACE).
Although the loss of large-scale funding has meant scaling down our activities, we have still been active and our Twitter account reflects some of that work – including presentations on European learning analytics work in China, Japan and South Korea.
The LAK17 conference provided a chance for eight of the international team to get together and plan our next event, a workshop in our ethics and privacy in learning analytics series (EP4LA) that we are submitting to this year’s EC-TEL conference.
Our LAK Failathon workshop at the start of LAK17 generated the basic ideas for a poster on how the field of learning analytics can increase its evidence base and avoid failure.
We took the poster to the LAK17 Firehose session, where Doug Clow provided a lightning description of it, and we then used the poster to engage people in discussion about the future of the field.
Despite the poster’s low production values (two sheets of flip-chart paper, some Post-it notes and a series of stickers to mark agreement), its interactive nature obviously appealed to participants and we won the best poster award. :-)
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond Failure: The 2nd LAK Failathon Poster. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 540–541.
Our main paper at the LAK conference looked at the state of evidence in the field. Drawing on the work collated in the LACE project Evidence Hub, it seems that there is, as yet, very little clear evidence that learning analytics improve learning or teaching. The paper concludes with a series of suggestions about how we can work as a community to improve the evidence base of the field.
The room was full to overflowing for our talk and for the other two talks in the session on the ethics of learning analytics. If you weren’t able to get in and you want to understand the links between jelly beans, a dead salmon, Bob Dylan, Buffy the Vampire Slayer and learning analytics, I shall share the link to the recorded session as soon as I have it.
Ferguson, Rebecca and Clow, Doug (2017). Where is the evidence? A call to action for learning analytics. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 56–65.
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little of it was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.