Archive for category Events
Had a great time on a Writing Week in the Peak District with the Open World Learning PhD students funded by Leverhulme. The project is now in its third year, so all the students are at least a year into their studies. It was an opportunity to catch up with the students and their supervisors, to engage in fascinating discussions about how all this work links up, and to read about the work so far in drafts of chapters and reports. I think everyone in the picture below is from a different country, bringing together diverse perspectives from around the world.
On 9 May I talked about the Innovating Pedagogy reports to the OpenTEL group at The Open University. My talk focused on six pedagogies that we have covered in the reports, and how these might be used within the university.
For example, the flipped classroom provides a way of preparing students for field trips, residential school, tutorials and forum discussions. Computational thinking is a skill that could be covered in different disciplines and at different levels, as well as included in the professional development of staff. Spaced learning could be incorporated within training for research students, added to study skills support, or investigated further by Psychology students.
On 19 April, I was at St Anne's College, Oxford, for the Making a Difference: Social Sciences and Impact conference.
My talk focused on the work of the Learning Analytics Community Exchange (LACE) in building a European learning analytics community, and the part that The Open University played in achieving this.
Our university was among the first to engage when learning analytics emerged as a research field. We built up a solid base of expertise, which we wanted to use to achieve a positive impact on learning and teaching in a broader context. We began by holding a networking event for researchers and practitioners. From that emerged the Learning Analytics Community Exchange (LACE) project, designed to build European expertise in this area.
The project included an impact plan, with performance indicators that enabled us to evaluate that impact. The plan included broad groups – higher education, schools, industry and informal learning – as well as different communities: policy makers, researchers, unions, educators and trainers.
We created a networking plan, identifying key events internationally. We also scheduled our own events, building on the one that had formed the basis of the project, and adding events for policy makers. We developed a site for sharing research evidence and aligned it with the major conference in the area. We drew up a detailed social media plan and established a strong presence on YouTube, Twitter and LinkedIn. We also took the lead in developing international standards and ethical practice in learning analytics.
While in Wagga Wagga, I had the opportunity to meet many of the team working on learning initiatives at Charles Sturt University (which must be one of the only universities in the world to have its own vineyard). Cassandra Colvin was my host, and we had some fascinating discussions, but I was also able to meet with many of the learning technology staff and the senior leadership team.
I also gave a presentation on learning analytics, which was more specific than my public talk the previous day and focused on the evidence for learning analytics and how to avoid failure.
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little of it was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures affecting other areas of research. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and suggest ways for various stakeholders to achieve this.
After Melbourne, my next stop was the town of Wagga Wagga, where Cassandra Colvin had invited me to talk about learning analytics.
I gave a public lecture at Riverina TAFE, which explored the use of learning analytics and the role they may play in the future of learning. This event was organised by u!magine and was open to all educators in the Wagga area – from K-12 through to Higher Education – interested in using learning analytics to inform their practice.
Learning analytics involve the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and institutions around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in 2007, we had begun developing learning analytics for 2017, we might not have planned specifically for learning with and through social networks (Twitter was only a year old), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). This talk will examine ways in which learning analytics could develop in the future, highlighting issues that need to be taken into account.
After the LAK conference in Sydney, I was invited to RMIT University in Melbourne by Pablo Munguia to meet the team and talk about learning analytics.
Belinda Tynan, the former OU pro-vice-chancellor who took the lead on learning analytics at The Open University, is now at RMIT, so it was really interesting to see how a similar implementation strategy is playing out at a different university. The priorities at RMIT are different – student retention is not a significant issue there – so their analytics are focused on students achieving their goals.
I gave a presentation to the team on the future of learning analytics, which complemented a presentation by my colleague, Bart Rienties, on the practicalities and successes of learning analytics implementation.
Learning analytics involve the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and institutions around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in 2008, we had begun developing learning analytics for 2018, we might not have planned specifically for learning with and through social networks (Twitter was still in its infancy), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). By thinking ahead and by consulting with experts, though, we might have come pretty close by taking into account existing work on networked learning, mobile learning and connectivism. This talk will examine ways in which learning analytics could develop in the future, highlighting issues that need to be taken into account. In particular, the learning analytics community needs to work together in order to develop a strong evidence base grounded in both research and practice.
A highlight of my year will surely be LAK18 – the annual Learning Analytics and Knowledge conference run by the Society for Learning Analytics Research (SoLAR). Together with Simon Buckingham Shum, Xavier Ochoa and Agathe Merceron, I was programme chair for the conference.
Our five days in Sydney were the culmination of more than a year’s hard work. We were really pleased with the attendance and the engagement at the conference, and the success of new initiatives such as double-blind peer review and the introduction of discussion around meta-reviews of the papers.
The next steps for us will be two special sections of the Journal of Learning Analytics – one related to the conference theme of human-centred design, and one including extended versions of the best papers. I shall also be ex officio programme chair of the next conference, at Arizona State in 2019.
The conference also provided a chance for many of the Learning Analytics Community Exchange (LACE) organising team to meet up and make plans for the future.