Archive for category Presentations
I joined a team of experts from across The Open University to contribute to the BBC Learning English co-production, Go The Distance: ‘a 10-week taste of what distance learning is really like – with real students, real tutors, key study and digital literacy skills and lots of help with your English.’
My contribution was to Academic Insights, ‘the series where we meet real distance learning tutors and get their top tips for successful studying.’
You can watch the video via the BBC site or via OpenLearn.
- My name’s Rebecca Ferguson. I work as a lecturer in distance learning. My field is educational technology.
- There are several reasons for working together. One of them is because it’s a way of learning in itself. You share perspectives and you discuss things. The second reason is it’s a very effective way of learning. And the third reason is employability. You need to be able to work with your team.
- Student collaborative tasks depend on the level of study. They might be contributing to a forum; they might be responding to somebody else in a forum. But when you get to final years you’d be working on a project with others. You might be carrying out research with others.
- Shyness and lack of confidence can be a problem for some students, especially when they’re in video conferences, but forums are a very good way of communicating if you’re shy.
- Something that a tutor can do is to encourage people to introduce themselves and to talk on a safe subject that they don’t feel stressed about, something relatively impersonal.
- A solution for that is to share information about when you can work and for how long you can work. Another solution is to timetable how you’re going to work together.
- Learners feel that it’s very beneficial because it reflects what they’re going to be doing in a working environment. It’s something they felt unconfident about before and they now know how to do it.
Following the retirement of Mike Sharples (who will return to The Open University as an Emeritus Professor in March), I have taken on the role of Academic Coordinator for the FutureLearn Academic Network (FLAN).
The network was established in 2013 to connect academics and research students based at FutureLearn partner institutions, to share research, and to explore joint research opportunities. These include joint research bids and publications, comparative studies using shared FutureLearn data, course designs, and methods to analyse and evaluate courses.
The Network is open to staff and research students based at FutureLearn partner institutions with an interest in research related to the FutureLearn platform.
On 7 November, we held one of our quarterly meetings – this time at the British Council in Central London. Among the many interesting talks:
- Josh Underwood gave a detailed and considered account of the role of a mentor or facilitator within FutureLearn courses.
- Matthew Nicholls and Bunny Waring talked about their use of a virtual reality simulation of Rome in the 4th century CE.
- Phil Tubman introduced a tool for visualising discussion, which is now being used on a course from Lancaster University.
- Eileen Scanlon and I talked about research ethics on the platform and initiated discussion on changes to the terms and conditions.
The next meeting of FLAN is likely to be in Exeter at the end of February 2018. If you are eligible to be part of FLAN and would like to be involved either in person or remotely, do get in touch.
I visited Bergen in Norway at the end of September to keynote at Nordic LASI. This is one of a series of learning analytics summer institutes run around the world in conjunction with the Society for Learning Analytics Research (SoLAR). The event was well attended, with participants from Russia, Norway, Denmark and Sweden.
Learning analytics involves the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and institutions around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in 2007, we had begun developing learning analytics for 2017, we might not have planned specifically for learning with and through social networks (Twitter was only a year old), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). By thinking ahead and by consulting with experts, though, we might have come pretty close by taking into account existing work on networked learning, mobile learning and connectivism. This talk will examine ways in which learning analytics could develop in the future, highlighting issues that need to be taken into account. In particular, the learning analytics community needs to work together in order to develop a strong evidence base grounded in both research and practice.
Last week, I visited the beautiful town of Bergen to visit the SLATE Centre at the university there. SLATE is a global research centre, designed for the advancement of the learning sciences. Its mission is to advance the frontiers of the science of learning and technology through integrated research. I was able to meet many of the team and talk to them about their research.
While at SLATE, I gave a talk about developing a vision and an action plan for learning analytics – and for other educational innovations. SLATE is well placed to make a difference both nationally and internationally, so its vision has the potential to affect tens of thousands of learners in different countries.
Here is SLATE’s account of my talk.
The promise of learning analytics is that they will enable us to understand and optimize learning and the environments in which it takes place. The intention is to develop models, algorithms, and processes that can be widely used. In order to do this, we need to help people to move from small-scale initiatives towards large-scale implementation. This is a tough challenge, because educational institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires careful consideration of the entire ‘TEL technology complex’. This complex includes the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use, and the specific environments within which they operate. Providing reliable and trustworthy analytics is just one part of implementing analytics at scale. It is also important to develop a clear strategic vision, assess institutional culture critically, identify potential barriers to adoption, develop approaches that can overcome these, and put in place appropriate forms of support, training, and community building.
This event was held at the University of Central Lancashire (UCLan) in Preston, and was organised by the VITAL project (Visualisation tools and analytics to monitor language learning and teaching).
My talk was on ‘Learning analytics: planning for the future’.
What does the future hold for learning analytics? In terms of Europe’s current priorities for education and training, they will need to support relevant and high-quality knowledge, skills and competences developed throughout lifelong learning. More specifically, they should help improve the quality and efficiency of education and training, enhance creativity and innovation, and focus on learning outcomes in areas such as linguistic abilities, cultural awareness and active citizenship. This is a challenging agenda that requires us to look beyond our immediate priorities and institutional goals. In order to address this agenda, we need to consider how our work fits into the larger picture. Drawing on the outcomes of two recent European studies, Rebecca will discuss how we can develop an action plan that will drive the development of analytics that enhance both learning and teaching.
Our main paper at the LAK conference looked at the state of evidence in the field. Drawing on the work collated in the LACE project Evidence Hub, it seems that there is, as yet, very little clear evidence that learning analytics improve learning or teaching. The paper concludes with a series of suggestions about how we can work as a community to improve the evidence base of the field.
The room was full to overflowing for our talk and for the other two talks in the session on the ethics of learning analytics. If you weren’t able to get in and you want to understand the links between jelly beans, a dead salmon, Bob Dylan, Buffy the Vampire Slayer and learning analytics, I shall share the link to the recorded session as soon as I have it.
Ferguson, Rebecca and Clow, Doug (2017). Where is the evidence? A call to action for learning analytics. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 56–65.
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.
Monday 13 March was the day of the second LAK Failathon, this time held at the LAK17 conference at Simon Fraser University in Vancouver. This year, we took the theme ‘Beyond Failure’ and the workshop led into a paper later in the conference and then to a crowd-sourced paper on how we can work to avoid failure both on individual projects and across the learning analytics community as a whole.
We also took a consciously international approach, and so workshop leaders included Doug Clow and me from Europe, Mike Sharkey from North America, Cecilia Aguerrebere from South America, Kirsty Kitto from Australia and Yong-Sang Cho from Asia.
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond failure: the 2nd LAK Failathon. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 504–505.
If you can’t access the workshop outline behind the paywall, contact me for a copy.
The 2nd LAK Failathon will build on the successful event in 2016 and extend the workshop beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Thus providing an alternative forum for practitioners and researchers to learn from each other’s failures can be very productive. The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, to help participants learn from each other’s mistakes. It was very successful, and there was strong support for running it as an annual event. This workshop will build on that success, with twin objectives to provide an environment for individuals to learn from each other’s failures, and also to co-develop plans for how we as a field can better build and deploy our evidence base.