Archive for category Educational Futures
I visited Bergen in Norway at the end of September to keynote at Nordic LASI. This is one of a series of learning analytics summer institutes run around the world in conjunction with the Society for Learning Analytics Research (SoLAR). The event was well attended, with participants from Russia, Norway, Denmark and Sweden.
Learning analytics involves the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and institutions around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in 2007, we had begun developing learning analytics for 2017, we might not have planned specifically for learning with and through social networks (Twitter was only a year old), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). By thinking ahead and by consulting with experts, though, we might have come pretty close by taking into account existing work on networked learning, mobile learning and connectivism. This talk will examine ways in which learning analytics could develop in the future, highlighting issues that need to be taken into account. In particular, the learning analytics community needs to work together in order to develop a strong evidence base grounded in both research and practice.
Last week, I visited the beautiful town of Bergen to visit the SLATE Centre at the university there. SLATE is a global research centre, designed for the advancement of the learning sciences. Its mission is to advance the frontiers of the science of learning and technology through integrated research. I was able to meet many of the team and talk to them about their research.
While at SLATE, I gave a talk about developing a Vision and an Action Plan for learning analytics – and for other educational innovations. SLATE is well placed to make a difference both nationally and internationally, so their vision has the potential to affect tens of thousands of learners in different countries.
Here is SLATE’s account of my talk.
The promise of learning analytics is that they will enable us to understand and optimize learning and the environments in which it takes place. The intention is to develop models, algorithms, and processes that can be widely used. In order to do this, we need to help people to move from small-scale initiatives towards large-scale implementation. This is a tough challenge, because educational institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires careful consideration of the entire ‘TEL technology complex’. This complex includes the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use, and the specific environments within which they operate. Providing reliable and trustworthy analytics is just one part of implementing analytics at scale. It is also important to develop a clear strategic vision, assess institutional culture critically, identify potential barriers to adoption, develop approaches that can overcome these, and put in place appropriate forms of support, training, and community building.
We have just published an internal report for The Open University. It covers ‘Staff Perspectives on the Value of Involvement with FutureLearn MOOCs’. The report – authored by Tom Coughlan, Thea Herodotou, Alice Peasgood and me – continues our series of reports on different aspects of engagement and research with MOOCs.
We carried out interviews with educators, production staff and facilitators who work on both MOOCs and Open University courses. Analysis of these data identified six forms of value that these MOOCs offer to the university.
- Innovating course production
- Staff development
- Visibility and engagement
- Improved learning journeys
- Research and evaluation
- Income generation
In each case, the report identifies both benefits and challenges.
Open University staff can access the full report.
This event was held at the University of Central Lancashire (UCLan) in Preston, and was organised by the VITAL project (Visualisation tools and analytics to monitor language learning and teaching).
My talk was on ‘Learning analytics: planning for the future’.
What does the future hold for learning analytics? In terms of Europe’s current priorities for education and training, they will need to support relevant and high-quality knowledge, skills and competences developed throughout lifelong learning. More specifically, they should help improve the quality and efficiency of education and training, enhance creativity and innovation, and focus on learning outcomes in areas such as linguistic abilities, cultural awareness and active citizenship. This is a challenging agenda that requires us to look beyond our immediate priorities and institutional goals. In order to address this agenda, we need to consider how our work fits into the larger picture. Drawing on the outcomes of two recent European studies, Rebecca will discuss how we can develop an action plan that will drive the development of analytics that enhance both learning and teaching.
Our LAK Failathon workshop at the start of LAK 17 generated the basic ideas for a poster on how the field of learning analytics can increase its evidence base and avoid failure.
We took the poster to the LAK17 Firehose session, where Doug Clow provided a lightning description of it, and we then used the poster to engage people in discussion about the future of the field.
Despite the low production quality of the poster (two sheets of flip chart paper, some post-it notes and a series of stickers to mark agreement), its interactive quality obviously appealed to participants and we won the best poster award. :-)
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond Failure: The 2nd LAK Failathon Poster. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 540–541.
Monday 13 March was the day of the second LAK Failathon, this time held at the LAK17 conference at Simon Fraser University in Vancouver. This year, we took the theme ‘Beyond Failure’ and the workshop led into a paper later in the conference and then to a crowd-sourced paper on how we can work to avoid failure both on individual projects and across the learning analytics community as a whole.
We also took a consciously international approach, and so workshop leaders included Doug Clow and me from Europe, Mike Sharkey from North America, Cecilia Aguerrebere from South America, Kirsty Kitto from Australia and Yong-Sang Cho from Asia.
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond failure: the 2nd LAK Failathon. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 504–505.
If you can’t access the workshop outline behind the paywall, contact me for a copy.
The 2nd LAK Failathon will build on the successful event in 2016 and extend the workshop beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Thus providing an alternative forum for practitioners and researchers to learn from each other’s failures can be very productive. The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, to help participants learn from each other’s mistakes. It was very successful, and there was strong support for running it as an annual event. This workshop will build on that success, with twin objectives to provide an environment for individuals to learn from each other’s failures, and also to co-develop plans for how we as a field can better build and deploy our evidence base.