Archive for category Presentations
Our main paper at the LAK conference looked at the state of evidence in the field. Drawing on the work collated in the LACE project Evidence Hub, it seems that there is, as yet, very little clear evidence that learning analytics improve learning or teaching. The paper concludes with a series of suggestions about how we can work as a community to improve the evidence base of the field.
The room was full to overflowing for our talk and for the other two talks in the session on the ethics of learning analytics. If you weren’t able to get in and you want to understand the links between jelly beans, a dead salmon, Bob Dylan, Buffy the Vampire Slayer and learning analytics, I shall share the link to the recorded session as soon as I have it.
Ferguson, Rebecca and Clow, Doug (2017). Where is the evidence? A call to action for learning analytics. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 56–65.
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.
Monday 13 March was the day of the second LAK Failathon, this time held at the LAK17 conference at Simon Fraser University in Vancouver. This year, we took the theme ‘Beyond Failure’ and the workshop led into a paper later in the conference and then to a crowd-sourced paper on how we can work to avoid failure both on individual projects and across the learning analytics community as a whole.
We also took a consciously international approach, and so the workshop leaders included Doug Clow and me from Europe, Mike Sharkey from North America, Cecilia Aguerrebere from South America, Kirsty Kitto from Australia and Yong-Sang Cho from Asia.
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond failure: the 2nd LAK Failathon. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 504–505.
If you can’t access the workshop outline behind the paywall, contact me for a copy.
The 2nd LAK Failathon will build on the successful event in 2016 and extend the workshop beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Thus providing an alternative forum for practitioners and researchers to learn from each other’s failures can be very productive. The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, to help participants learn from each other’s mistakes. It was very successful, and there was strong support for running it as an annual event. This workshop will build on that success, with twin objectives to provide an environment for individuals to learn from each other’s failures, and also to co-develop plans for how we as a field can better build and deploy our evidence base.
On 27 January, I travelled to Pompeu Fabra university in Barcelona for a meeting of the FutureLearn Academic Network (FLAN) on The Educator Experience. This was the first FLAN meeting to take place outside the UK and it was held at UPF’s Poblenou Campus. The event was organised by CLIK (Center for Learning, Innovation and Knowledge) and the members of the Educational Technologies section within the Interactive Technologies Research Group of UPF.
During the meeting, FutureLearn partners reflected on the impact and research possibilities of MOOCs in the field of education. Sir Timothy O’Shea, Principal and Vice-Chancellor of the University of Edinburgh, gave the keynote speech, describing Edinburgh’s developing MOOC strategy, including the production of 64 online master’s courses.
I talked about our recent report, MOOCs: What the Research of FutureLearn’s UK Partners Tells Us.
If you have access to the FutureLearn Partners’ blog, a video of the meeting and summary notes of the sessions are available.
On 25 January, I presented at the BETT trade show on An action plan for learning analytics. If you would like to introduce learning analytics at your institution, where should you start? Drawing on recent studies that consulted experts worldwide, I outlined an action plan for analytics and identified the key points to keep in mind.
My talk formed part of the HE Leaders Summit, a section of the event that was designed to address some of the most significant challenges currently facing senior leaders across Higher Education.
On 23 January, I presented at a joint symposium involving The Open University and the University of Gothenburg. Eleven participants from Gothenburg met with ten Open University researchers. Eight presentations, four from Gothenburg and four from The Open University, allowed discussion of areas of mutual interest.
My presentation focused on what the research carried out by UK partners of the FutureLearn platform tells us. I presented a longer version of the talk to the FutureLearn Academic Network (FLAN) later in the week, so it is embedded in a later blog post.
While I was in Seoul in September, I took part in the Asian Learning Analytics Summer Institute (LASI Asia). I was joined there by members of the LACE team, who included the event as part of the LACE tour of Asia, which also took in Japan and Korea.
During LASI Asia, I gave a talk about what is on the horizon for learning analytics. This went into more detail, and was aimed at a more specialist audience, than my talk at e-Learning Korea. I also took part in a couple of panel discussions. The first was on how to build an international community on learning analytics research, and the second was on the achievements of learning analytics research and next steps.
There is general agreement that the importance of learning analytics is likely to increase in the coming decade. However, little guidance for policy makers has been forthcoming from the technologists, educationalists and teachers who are driving the development of learning analytics. The Visions of the Future study was carried out by the LACE project in order to provide some perspectives that could feed into the policy process.
The study took the form of a ‘policy Delphi’, which is to say that it was not concerned with certainty about the future, but rather focused on understanding the trends and issues which will be driving the field forward in the coming years. The project partners developed eight visions of the future of learning analytics in 2025. These visions were shared with invited experts and LACE contacts through an online questionnaire, and consultation with stakeholders was carried out at events. Respondents were asked to rate the visions in terms of their feasibility and desirability, and to indicate the actions which should be taken in the light of their judgements. 487 responses to visions were received from 133 people. The views of the respondents on how the future may evolve are both interesting and entertaining. More significantly, analysis of the ratings and free-text responses showed that for the experts and practitioners who engaged in the study, there was a consensus around a number of points which are shaping the future of learning analytics.
1. There is a lot of enthusiasm for learning analytics, but concern that its potential will not be fulfilled. It is therefore appropriate for policy makers to take a role.
2. Policies and infrastructure are necessary to strengthen the rights of the data subject.
3. Interoperability specifications and open infrastructures are an essential enabling technology. These can support the rights of the data subject, and ensure control of analytics processes at the appropriate level.
4. Learning analytics should not imply automation of teaching and learning.
The full results of the study are published in a report at http://www.laceproject.eu/deliverables/d3-2-visions-of-the-future-2/.
In this session the visions explored by the LACE study will be presented, the conclusions discussed, and the audience will take part in an impromptu mapping of the most desirable and feasible vision of the future for learning analytics in Asia.
Learning analytics involve the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and early adopters around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in April 2006, we had begun developing learning analytics for 2016, we might not have planned specifically for learning with and through social networks (Twitter was launched in July 2006), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). By thinking ahead and by consulting with experts, though, we might have come pretty close by taking into account existing work on networked learning, mobile learning and connectivism. In this talk, Rebecca will introduce a range of different scenarios that explore different ways in which learning analytics could develop in the future. She will share the results of an international Policy Delphi study, which was designed for the systematic solicitation and collation of informed judgments on visions of learning analytics in 2025. The study explored underlying assumptions and information leading to differing judgments on learning analytics, and brought together informed judgments about the field. The findings of the Policy Delphi, together with other studies, are now being used to develop action plans that will help us to develop analytics to support learners and educators in the future.
While at the conference, I also took part in a panel discussion entitled ‘Next Step of e-Learning for Smart, Connected World’.