Posts Tagged LACE project
The European FP7-funded Learning Analytics Community Exchange (LACE) project came to an end last June. Since then, we have become a special interest group (SIG) of the Society for Learning Analytics Research (SoLAR) and we are now the Learning Analytics Community Europe (LACE).
Although the loss of large-scale funding has meant scaling down our activities, we have still been active and our Twitter account reflects some of that work – including presentations on European learning analytics work in China, Japan and South Korea.
The LAK17 conference provided a chance for eight of the international team to get together and plan our next event, a workshop in our ethics and privacy in learning analytics series (EP4LA) that we are submitting to this year’s ECTEL conference.
While I was in Seoul in September, I took part in the Asian Learning Analytics Summer Institute (LASI Asia). I was joined there by members of the LACE team, who included the event as part of the LACE tour of Asia, which also took in Japan and Korea.
During LASI Asia, I gave a talk about what is on the horizon for learning analytics. This went into more detail, and was aimed at a more specialist audience, than my talk at e-Learning Korea. I also took part in a couple of panel discussions. The first was on how to build an international community on learning analytics research, and the second was on the achievements of learning analytics research and next steps.
There is general agreement that the importance of learning analytics is likely to increase in the coming decade. However, little guidance for policy makers has been forthcoming from the technologists, educationalists and teachers who are driving the development of learning analytics. The Visions of the Future study was carried out by the LACE project in order to provide some perspectives that could feed into the policy process.
The study took the form of a ‘policy Delphi’, which is to say that it was not concerned with certainty about the future, but rather focused on understanding the trends and issues that will be driving the field forward in the coming years. The project partners developed eight visions of the future of learning analytics in 2025. These visions were shared with invited experts and LACE contacts through an online questionnaire, and consultation with stakeholders was carried out at events. Respondents were asked to rate the visions in terms of their feasibility and desirability, and to identify the actions that should be taken in the light of their judgements. In total, 487 responses to the visions were received from 133 people. The views of the respondents on how the future may evolve are both interesting and entertaining. More significantly, analysis of the ratings and free-text responses showed that, for the experts and practitioners who engaged in the study, there was consensus around a number of points which are shaping the future of learning analytics.
1. There is a lot of enthusiasm for learning analytics, but concern that its potential will not be fulfilled. It is therefore appropriate for policy makers to take a role.
2. Policies and infrastructure are necessary to strengthen the rights of the data subject.
3. Interoperability specifications and open infrastructures are an essential enabling technology. These can support the rights of the data subject, and ensure control of analytics processes at the appropriate level.
4. Learning analytics should not imply automation of teaching and learning.
The full results of the study are published in a report at http://www.laceproject.eu/deliverables/d3-2-visions-of-the-future-2/.
In this session, the visions explored in the LACE study will be presented, the conclusions will be discussed, and the audience will take part in an impromptu mapping of the most desirable and feasible vision of the future for learning analytics in Asia.
While I was in Montevideo, at the invitation of Plan Ceibal, I was interviewed about learning analytics. This playlist of four short videos (subtitled in Spanish) deals with the potential of Big Data to improve learning, how The Open University has used learning analytics, and the work of the LACE and LAEP projects.
I talk about how analytics can be used to identify when students are dropping behind, how they can be used to identify successful routes through courses, and how they can identify types of learning design that lead to student success.
I note that the supply of learning analytics is growing, but it is not clear that the demand is growing in the same way. Researchers and developers need to engage more with educators at every stage, in order to identify the problems that educators need solved and the questions they need answered.
I also talk about the need to align learning analytics with strategic priorities for education and training, not only at institutional level, but also at national and international level.
My videos are followed in the playlist by videos from Professor Dragan Gasevic, chair of the Society for Learning Analytics Research (SoLAR).
Learning at Scale: Using an Evidence Hub To Make Sense of What We Know (abstract)
The large datasets produced by learning at scale, and the need for ways of dealing with high learner/educator ratios, mean that MOOCs and related environments are frequently used for the deployment and development of learning analytics. Despite the current proliferation of analytics, there is as yet relatively little hard evidence of their effectiveness. The Evidence Hub developed by the Learning Analytics Community Exchange (LACE) provides a way of collating and filtering the available evidence in order to support the use of analytics and to target future studies to fill the gaps in our knowledge.
Ferguson, Rebecca (2016). Learning at Scale: Using an Evidence Hub To Make Sense of What We Know. In: L@S ’16 Proceedings of the Third (2016) ACM Conference on Learning @ Scale, ACM, New York.
I also took a companion poster to the LAK16 conference, which took place later in the week in the same venue. The posters are designed (by digital media specialist Peter Devine) to stand alone or to work together.
This poster sets out the background and development of the LACE Evidence Hub, a site that gathers evidence about learning analytics in an accessible form. The poster also describes the functionality of the site, summarises its quantitative and thematic content to date, and reflects on the current state of the evidence. In addition, it encourages people to add to and make use of the Hub.
Ferguson, Rebecca and Clow, Doug (2016). Learning Analytics Community Exchange: Evidence Hub. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April, 2016, Edinburgh, Scotland.
On 15 April, the LACE project held a one-day briefing and workshop in Brussels on Policies for Educational Data Mining and Learning Analytics. Originally planned to take place in the European Parliament, the event was moved to the nearby Thon Hotel following a security alert.
The day began with a welcome from Julie Ward, MEP for the North West of England and member of the Culture and Education Committee. She was followed by Robert Madelin (Director-General of DG Connect) and Dragan Gašević (president-elect of SoLAR). Their talks were followed by overviews of the current European-funded learning analytics projects: LACE, Lea’s Box, PELARS and WatchMe.
During the afternoon discussion and review session, participants from across Europe worked together in three separate discussion groups to review specific issues related to the use of learning analytics in schools, universities and workplace training.
I worked as rapporteur in the universities workshop, which had 186 participants, including people from England, Estonia, Germany, the Netherlands, Norway, Scotland and Sweden. Our policy recommendations included:
- Privacy and ethical issues are important. Encourage institutions to develop policies covering privacy, ethics and data protection. However, this is a broader issue than educational policy making and legislation, so we should aim to influence the wider debate.
- Guard against data degradation: develop and make available methods of retaining data over time.
- Develop data standards and encourage their use, so that data are standardised across institutions.
- Address the problem of over-claiming and mis-selling by vendors: institutions do not necessarily have access to the expertise that allows them to interpret and assess these claims.
- Identify procedures for due diligence around intervention strategies, the competencies staff need, and certification opportunities relating to these.
- Identify requirements for data collection, and structures for doing this on a sector-wide or national basis.
- Support the development of standard datasets at national or international level, against which other data can be compared to see whether performance is above or below the norm.
- Identify behaviours in the field of education that regional or national governments should support and encourage.
- Identify ways of preventing the providers of educational tools from selling our own data back to us.
- Take into account that it is not just the data we are concerned about: once data are removed from their context, they do not necessarily make sense. Data need to be associated with metadata that is produced using standardised conventions.
On 24 October 2014, the Learning Analytics Community Exchange (LACE) project invited everyone interested in the research and use of learning analytics to a one-day networking event at the Open University in Milton Keynes (UK).
This SoLAR Flare event – co-chaired by Doug Clow, Simon Cross and me – formed part of an international series coordinated by the Society for Learning Analytics Research (SoLAR). SoLAR Flares provide opportunities to learn what’s going on in learning analytics research and practice, to share resources and experience, and to forge valuable new connections within the education sector and beyond.
Around 50 people attended in person, with another 356 from around the world tuning in via the livestream.
There were two keynotes: one from Alan Berg, talking about the Apereo learning analytics initiative, and another from Chris Lowis, talking about learning analytics on the FutureLearn MOOC platform. In addition, there were 13 lightning presentations from people working with learning analytics in multiple countries and contexts including the UK, France and Spain. My lightning presentation focused on patterns of engagement identified in FutureLearn MOOCs from a variety of different universities. In the afternoon, participants split into four sub-groups that discussed evidence about learning analytics that can be added to the LACE Learning Analytics Evidence Hub.
Recordings of all the LACE SoLAR Flare presentations are available online.
On 16-17 September, I was in Graz with the Learning Analytics Community Exchange (LACE). Before our consortium meeting, we held the 1st Learning Analytics Data Sharing Workshop. This brought people together from across Europe to discuss possibilities for data sharing.
The workshop was designed to act as a bridge between research and practical action. It also dealt with the technical, operational, business, policy and governance challenges involved with data sharing – with a particular focus on privacy issues.
The workshop was followed by a consortium meeting, and by planning for the further development of this Europe-wide learning analytics community.