Archive for category Presentations
Yesterday, I was keynote speaker at the University of Leeds Centre for Research in Digital Education Research Symposium, hosted by Neil Morris and Bronwen Swinnerton.
Innovating pedagogy
From educational radio and television, through virtual learning environments, to facial recognition of students and hologram lecturers – when people think of innovation in education, they tend to think of the technology used to deliver it. This technology has helped to extend access to education, but technology alone cannot bring about deep and sustained improvements in the quality of learning. The Innovating Pedagogy reports shift the emphasis towards innovations in pedagogy: identifying new forms of teaching, learning and assessment to guide educators. These innovations can help learners deal with a changing world in which they need to make sense of increasing amounts of data and information, and make the most of their opportunities to make global connections.
In her keynote, Rebecca Ferguson will talk about new and updated pedagogies that can be put into practice in the classroom, the ideas that connect them and the skills that support them. Some of these approaches extend current practice, some personalise it, some enrich it and others explore new possibilities that have opened up in the past decade.
You can watch the keynote here.
Another opportunity to talk to OU practitioners about the experience of putting an OU qualification on FutureLearn. This time the event was organised by the STEM (Science, Technology, Engineering and Mathematics) Faculty at The Open University and was the annual meeting of their Taught Postgraduate Group.
Learning analytics have the potential to help us to identify and make sense of patterns in educational data in order to enhance our teaching, our learning, and the student experience. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and institutions around the world are already developing and deploying these new tools. To use analytics effectively, teachers need to take time to reflect on their aims and the relevant skillsets. What does enhancement mean in different contexts, and how can analytics be used to help achieve that goal? Answering this question means looking to the future and considering the changes on the horizon. In her talk, Rebecca will discuss current developments in learning analytics. She will also introduce ‘Analytics in Action’ – a framework that can be used to introduce analytics to support enhancement – and will consider its implications from a teaching perspective.
New horizons in pedagogical research: evidence-based learning and learning analytics – a study day
22 November, 9.30–13.00, Palazzo Malcanton Marcorà – Aula Valent
9.30 Opening of proceedings
Chair: M. Costa (Università Ca’ Foscari Venezia)
9.40 New horizons in pedagogical research
U. Margiotta (Università Ca’ Foscari Venezia)
10.00 The contribution of learning analytics techniques to learning design and self-regulated learning
D. Persico (CNR-Genova)
11.00 Learning analytics futures: a teaching perspective
Guest speaker: R. Ferguson (Open University UK)
Discussant: P. de Waal (Università Ca’ Foscari Venezia)
13.00 Close of proceedings
On 17 June 2018, I gave a keynote to the 15th Enhancement Conference of the Quality Assessment Agency (QAA) at Glasgow Caledonian University. The conference theme was Evaluation, Evidence & Enhancement: Inspiring Staff & Students. I also recorded a short video interview that considers the links between learning analytics and learning enhancement.
Learning analytics help us to identify and make sense of patterns in educational data in order to enhance our teaching, our learning, and the student experience. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and institutions around the world are already developing and deploying these new tools. To use analytics effectively, we need to take time to reflect on our aims. What does enhancement mean in our context, and how can analytics help us to achieve that goal? To answer this, we need to look into the future and consider the changes that are likely to have taken effect by the time our analytics are up and running. In this talk, Rebecca will discuss the current state of learning analytics and the many possibilities on the horizon. She will also introduce ‘Analytics in Action’ – a framework that can be used to introduce analytics to support enhancement.
On 9 May I talked about the Innovating Pedagogy reports to the OpenTEL group at The Open University. My talk focused on six pedagogies that we have covered in the reports, and how these might be used within the university.
For example, the flipped classroom provides a way of preparing students for field trips, residential school, tutorials and forum discussions. Computational thinking is a skill that could be covered in different disciplines and at different levels, as well as included in the professional development of staff. Spaced learning could be incorporated within training for research students, added to study skills support, or investigated further by Psychology students.
On 19 April, I was at St Anne’s College, Oxford, for the Making a Difference: Social Sciences and Impact conference.
My talk focused on the work of the Learning Analytics Community Exchange (LACE) in building a European learning analytics community, and the part that The Open University played in achieving this.
Our university was among the first to engage when learning analytics emerged as a research field. We built up a solid base of expertise, which we wanted to use to achieve a positive impact on learning and teaching in a broader context. We began by holding a networking event for researchers and practitioners. From that emerged the Learning Analytics Community Exchange (LACE) project, designed to build European expertise in this area.
The project included an impact plan, with performance indicators that enabled us to evaluate that impact. The plan included broad groups – higher education, schools, industry and informal learning – as well as different communities: policy makers, researchers, unions, educators and trainers.
We created a networking plan, identifying key events internationally. We also scheduled our own events, building on the one that had formed the basis of the project, and adding events for policy makers. We developed a site for sharing research evidence and aligned it with the major conference in the area. We drew up a detailed social media plan and established a strong presence on YouTube, Twitter and LinkedIn. We also took the lead in developing international standards and ethical practice in learning analytics.
While in Wagga Wagga, I had the opportunity to meet many of the team working on learning initiatives at Charles Sturt University (which must be one of the few universities in the world to have its own vineyard). Cassandra Colvin was my host, and we had some fascinating discussions, but I was also able to meet with many of the learning technology staff and the senior leadership team.
I also gave a presentation on learning analytics, which was more specific than my public talk the previous day and focused on the evidence for learning analytics and how to avoid failure.
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little of it was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.