Archive for category Analytics
On 19 April, I was at St Anne's College, Oxford, for the Making a Difference: Social Sciences and Impact conference.
My talk focused on the work of the Learning Analytics Community Exchange (LACE) in building a European learning analytics community, and the part that The Open University played in achieving this.
Our university was among the first to engage when learning analytics emerged as a research field. We built up a solid base of expertise, which we wanted to use to achieve a positive impact on learning and teaching in a broader context. We began by holding a networking event for researchers and practitioners. From that emerged the Learning Analytics Community Exchange (LACE) project, designed to build European expertise in this area.
The project included an impact plan, with performance indicators that enabled us to evaluate that impact. The plan included broad groups – higher education, schools, industry and informal learning – as well as different communities: policy makers, researchers, unions, educators and trainers.
We created a networking plan, identifying key events internationally. We also scheduled our own events, building on the one that had formed the basis of the project, and adding events for policy makers. We developed a site for sharing research evidence and aligned it with the major conference in the area. We drew up a detailed social media plan and established a strong presence on YouTube, Twitter and LinkedIn. We also took the lead in developing international standards and ethical practice in learning analytics.
While in Wagga Wagga, I had the opportunity to meet many of the team working on learning initiatives at Charles Sturt University (which must be one of the few universities in the world to have its own vineyard). Cassandra Colvin was my host, and we had some fascinating discussions; I was also able to meet with many of the learning technology staff and the senior leadership team.
I also gave a presentation on learning analytics, which was more specific than my public talk the previous day and focused on the evidence for learning analytics and how to avoid failure.
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little was negative (7%, N=123), suggesting that learning analytics is not immune to the pressures at work in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.
After Melbourne, my next stop was the town of Wagga Wagga, where Cassandra Colvin had invited me to talk about learning analytics.
I gave a public lecture at Riverina TAFE, which explored the use of learning analytics and the role they may play in the future of learning. This event was organised by u!magine and was open to all educators in the Wagga area – from K-12 through to Higher Education – interested in using learning analytics to inform their practice.
Learning analytics involve the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and institutions around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in 2007, we had begun developing learning analytics for 2017, we might not have planned specifically for learning with and through social networks (Twitter was only a year old), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). This talk will examine ways in which learning analytics could develop in the future, highlighting issues that need to be taken into account.
After the LAK conference in Sydney, I was invited to RMIT University in Melbourne by Pablo Munguia to meet the team and talk about learning analytics.
The former OU pro-vice-chancellor Belinda Tynan, who took the lead on learning analytics at The Open University, is now at RMIT, so it was really interesting to see how a similar implementation strategy is playing out at a different university. The priorities at RMIT are different – student retention is not a significant issue there – so their analytics are focused on students achieving their goals.
I gave a presentation to the team on the future of learning analytics, which complemented a presentation by my colleague, Bart Rienties, on the practicalities and successes of learning analytics implementation.
Learning analytics involve the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and institutions around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in 2008, we had begun developing learning analytics for 2018, we might not have planned specifically for learning with and through social networks (Twitter was still in its infancy), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). By thinking ahead and by consulting with experts, though, we might have come pretty close by taking into account existing work on networked learning, mobile learning and connectivism. This talk will examine ways in which learning analytics could develop in the future, highlighting issues that need to be taken into account. In particular, the learning analytics community needs to work together in order to develop a strong evidence base grounded in both research and practice.
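The cycle described in that definition – measurement and collection of learner data, analysis, and reporting back to inform teaching – can be sketched in a few lines of code. The event log, activity names and threshold below are purely illustrative assumptions for this sketch, not drawn from any real institutional system.

```python
# Minimal sketch of the learning-analytics cycle: collect activity
# events, measure engagement per learner, and report a simple flag.
# All names and the threshold are illustrative assumptions.

from collections import defaultdict

def analyse_engagement(events, low_threshold=3):
    """Count logged activities per learner and flag low engagement."""
    counts = defaultdict(int)
    for learner, activity in events:   # collection: raw activity log
        counts[learner] += 1           # measurement: simple activity count
    # analysis + reporting: flag learners below the threshold
    return {learner: ("low" if n < low_threshold else "ok")
            for learner, n in counts.items()}

# Toy activity log: (learner, activity) pairs
events = [
    ("ana", "forum_post"), ("ana", "quiz"), ("ana", "video"),
    ("ben", "video"),
]
print(analyse_engagement(events))  # → {'ana': 'ok', 'ben': 'low'}
```

A real deployment would of course draw on far richer data and models; the point of the sketch is only that each stage of the definition – collect, measure, analyse, report – maps to a concrete step.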
A highlight of my year will surely be LAK18 – the annual Learning Analytics and Knowledge conference run by the Society for Learning Analytics Research (SoLAR). Together with Simon Buckingham Shum, Xavier Ochoa and Agathe Merceron, I was programme chair for the conference.
Our five days in Sydney were the culmination of more than a year’s hard work. We were really pleased with the attendance and the engagement at the conference, and the success of new initiatives such as double-blind peer review and the introduction of discussion around meta-reviews of the papers.
The next steps for us will be two special sections of the Journal of Learning Analytics – one related to the conference theme of human-centred design, and one including extended versions of the best papers. I shall also be ex officio programme chair of the next conference, at Arizona State in 2019.
The conference also provided a chance for many of the Learning Analytics Community Exchange (LACE) organising team to meet up and make plans for the future.
My first term of office on the Executive Committee of the Society for Learning Analytics Research (SoLAR) came to an end early this year. I have spent the last year working for the society by acting as one of the Programme Chairs of its annual LAK conference, attending monthly online meetings, and contributing to debate about the society’s initiatives.
I was nominated to stand for election as president-in-waiting of the society, but chose not to put myself forward for this demanding post. However, I did stand for the executive once again and was delighted to be notified on my birthday that I had been re-elected by members of the society as a member at large.
I am just back from an expert workshop held at the European Commission’s Joint Research Centre (JRC) in Seville.
The EU has a very large database, covering 12 years, related to a Europe-wide project called eTwinning. This project puts teachers in touch with each other across Europe so that they can share ideas and innovation, develop their professional and digital skills and, specifically, join together to develop and carry out projects involving their pupils. The database covers activity and interactions on that platform by many thousands of individual teachers.
The JRC is interested in using this dataset to generate actionable insights that can help teachers and learners across Europe. The expert workshop brought together researchers from across Europe to discuss different ways of doing this. The participants brought many different perspectives to the event – some had worked with the platform for years, some came from Ministries of Education, others had explored large educational datasets in the past or had organised large studies.
Together, we identified different questions that the database could help to answer, and discussed ways in which it could be related to external data sources.