Archive for category Conferences
While I was in Seoul in September, I took part in the Asian Learning Analytics Summer Institute (LASI Asia). I was joined there by members of the LACE team, who included the event as part of the LACE tour of Asia, which also took in Japan and Korea.
During LASI Asia, I gave a talk about what is on the horizon for learning analytics. This went into more detail, and was aimed at a more specialist audience, than my talk at e-Learning Korea. I also took part in a couple of panel discussions. The first was on how to build an international community on learning analytics research, and the second was on the achievements of learning analytics research and next steps.
There is general agreement that the importance of learning analytics is likely to increase in the coming decade. However, little guidance for policy makers has been forthcoming from the technologists, educationalists and teachers who are driving the development of learning analytics. The Visions of the Future study was carried out by the LACE project in order to provide some perspectives that could feed into the policy process.
The study took the form of a ‘policy Delphi’, which is to say that it was not concerned with certainty about the future, but rather focused on understanding the trends and issues which will be driving the field forward in the coming years. The project partners developed eight visions of the future of learning analytics in 2025. These visions were shared with invited experts and LACE contacts through an online questionnaire, and consultation with stakeholders was carried out at events. Respondents were asked to rate the visions in terms of their feasibility and desirability, and to identify the actions which should be taken in the light of their judgements. 487 responses to visions were received from 133 people. The views of the respondents on how the future may evolve are both interesting and entertaining. More significantly, analysis of the ratings and free-text responses showed that, for the experts and practitioners who engaged in the study, there was a consensus around a number of points which are shaping the future of learning analytics.
1. There is a lot of enthusiasm for Learning Analytics, but concern that its potential will not be fulfilled. It is therefore appropriate for policy makers to take a role.
2. Policies and infrastructure are necessary to strengthen the rights of the data subject.
3. Interoperability specifications and open infrastructures are an essential enabling technology. These can support the rights of the data subject, and ensure control of analytics processes at the appropriate level.
4. Learning analytics should not imply automation of teaching and learning.
The full results of the study are published in a report at http://www.laceproject.eu/deliverables/d3-2-visions-of-the-future-2/.
In this session the visions explored by the LACE study will be presented, the conclusions discussed, and the audience will take part in an impromptu mapping of the most desirable and feasible vision of the future for learning analytics in Asia.
Learning analytics involves the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and early adopters around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in April 2006, we had begun developing learning analytics for 2016, we might not have planned specifically for learning with and through social networks (Twitter was launched in July 2006), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). By thinking ahead and by consulting with experts, though, we might have come pretty close by taking into account existing work on networked learning, mobile learning and connectivism. In this talk, Rebecca will introduce a range of scenarios that explore different ways in which learning analytics could develop in the future. She will share the results of an international Policy Delphi study, which was designed for the systematic solicitation and collation of informed judgements on visions of learning analytics in 2025. The study explored the underlying assumptions and information leading to differing judgements on learning analytics, and brought together informed judgements about the field. The findings of the Policy Delphi, together with other studies, are now being used to develop action plans that will help us to develop analytics to support learners and educators in the future.
While at the conference, I also took part in a panel discussion entitled ‘Next step of e-Learning for Smart, Connected World’.
My final presentation at the LAK16 conference was another session organised by the Learning Analytics Community Exchange (LACE) project that built on our Visions of the Future work. This panel session brought participants together to discuss the next steps for learning analytics and where we are heading as a community.
It is important that the LAK community looks to the future, in order that it can help develop the policies, infrastructure and frameworks that will shape its future direction and activity. Taking as its basis the Visions of the Future study carried out by the Learning Analytics Community Exchange (LACE) project, the panelists will present future scenarios and their implications. The session will include time for the audience to discuss both the findings of the study and actions that could be taken by the LAK community in response to these findings.
Ferguson, Rebecca; Brasher, Andrew; Clow, Doug; Griffiths, Dai and Drachsler, Hendrik (2016). Learning Analytics: Visions of the Future. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
This paper explores the potential of analytics for improving accessibility of e-learning and supporting disabled learners in their studies. A comparative analysis of completion rates of disabled and non-disabled students in a large five-year dataset is presented and a wide variation in comparative retention rates is characterized. Learning analytics enable us to identify and understand such discrepancies and, in future, could be used to focus interventions to improve retention of disabled students. An agenda for onward research, focused on Critical Learning Paths, is outlined. This paper is intended to stimulate a wider interest in the potential benefits of learning analytics for institutions as they try to assure the accessibility of their e-learning and provision of support for disabled students.
Cooper, Martyn; Ferguson, Rebecca and Wolff, Annika (2016). What Can Analytics Contribute to Accessibility in e-Learning Systems and to Disabled Students’ Learning? In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
Our second LACE workshop of LAK16 was the highly successful Failathon. The idea for this workshop emerged from an overview of learning analytics evidence provided by the LACE Evidence Hub. This suggested that the published evidence is skewed towards positive results, so we set out to find out whether this is the case.
A packed workshop discussed past failures. All accounts were governed by the Chatham House Rule – they could be reported outside the workshop as long as the source of the information was neither explicitly nor implicitly identified.
As in many fields, most papers in the learning analytics literature report success or, at least, read as if they are reporting success. This is almost certainly not because learning analytics research and activity are always successful. Generally, we report our successes widely, but keep our failures to ourselves. As Bismarck is alleged to have said: it is wise to learn from the mistakes of others. This workshop offers an opportunity for researchers and practitioners to share their failures in a lower-stakes environment, to help them learn from each other’s mistakes.
Clow, Doug; Ferguson, Rebecca; Macfadyen, Leah and Prinsloo, Paul (2016). LAK Failathon. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
A busy week at the Learning Analytics and Knowledge 2016 (LAK16) conference began with a workshop on Ethics and Privacy Issues in the Design of Learning Analytics. The workshop formed part of the international EP4LA series run by the LACE project.
The workshop included a series of presentations, and I talked briefly about findings related to ethics and privacy that had emerged from the LACE Visions of the Future study.
Issues related to ethics and privacy have become a major stumbling block in the application of learning analytics technologies on a large scale. Recently, the learning analytics community at large has addressed the EP4LA issues more actively, and we are now starting to see learning analytics solutions that are designed with these issues in mind from the outset, rather than as an afterthought. The 2nd EP4LA@LAK16 workshop will take the discussion on ethics and privacy for learning analytics to the next level, helping to build an agenda for the organisational and technical design of LA solutions that addresses the different processes of a learning analytics workflow.
Drachsler, Hendrik; Hoel, Tore; Cooper, Adam; Kismihók, Gábor; Berg, Alan; Scheffel, Maren; Chen, Weiqin and Ferguson, Rebecca (2016). Ethical and Privacy Issues in the Design of Learning Analytics Applications. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
Learning at Scale: Using an Evidence Hub To Make Sense of What We Know (Abstract)
The large datasets produced by learning at scale, and the need for ways of dealing with high learner/educator ratios, mean that MOOCs and related environments are frequently used for the deployment and development of learning analytics. Despite the current proliferation of analytics, there is as yet relatively little hard evidence of their effectiveness. The Evidence Hub developed by the Learning Analytics Community Exchange (LACE) provides a way of collating and filtering the available evidence in order to support the use of analytics and to target future studies to fill the gaps in our knowledge.
Ferguson, Rebecca (2016). Learning at Scale: Using an Evidence Hub To Make Sense of What We Know. In: L@S ’16 Proceedings of the Third (2016) ACM Conference on Learning @ Scale, ACM, New York.
I also took a companion poster to the LAK16 conference, which took place later in the week at the same venue. The posters are designed (by digital media specialist Peter Devine) to stand alone or to work together.
This poster sets out the background and development of the LACE Evidence Hub, a site that gathers evidence about learning analytics in an accessible form. The poster also describes the functionality of the site, and summarises its quantitative and thematic content to date and the current state of the evidence. In addition, it encourages people to add to and make use of the Hub.
Ferguson, Rebecca and Clow, Doug (2016). Learning Analytics Community Exchange: Evidence Hub. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April, 2016, Edinburgh, Scotland.