Archive for category Events
Scattered between my research presentations at LAK17 was my work as a member of the executive of the Society for Learning Analytics Research (SoLAR). The executive met daily during the conference – it is the only chance we have each year for face-to-face meetings. The LAK conferences also provide a venue for the AGM of the society and, despite the size of the room where the AGM was held, it was standing room only for most of the meeting.
The executive also have a role to play in decisions about the conference itself, as well as acting as reviewers on the programme committee and chairs for the different sessions. Next year, at LAK18 in Vancouver, I shall be taking on a bigger role as one of the programme chairs for the conference.
The picture shows me with half the SoLAR Executive at the post-LAK17 review meeting.
The European FP7-funded Learning Analytics Community Exchange (LACE) project came to an end last June. Since then, we have become a special interest group (SIG) of the Society for Learning Analytics Research (SoLAR) and we are now the Learning Analytics Community Europe (LACE).
Although the loss of large-scale funding has meant scaling down our activities, we have still been active and our Twitter account reflects some of that work – including presentations on European learning analytics work in China, Japan and South Korea.
The LAK17 conference provided a chance for eight of the international team to get together and plan our next event, a workshop in our ethics and privacy in learning analytics series (EP4LA) that we are submitting to this year’s ECTEL conference.
Our LAK Failathon workshop at the start of LAK17 generated the basic ideas for a poster on how the field of learning analytics can increase its evidence base and avoid failure.
We took the poster to the LAK17 Firehose session, where Doug Clow provided a lightning description of it, and we then used the poster to engage people in discussion about the future of the field.
Despite the low production quality of the poster (two sheets of flip chart paper, some post-it notes and a series of stickers to mark agreement), its interactive quality obviously appealed to participants and we won the best poster award. :-)
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond Failure: The 2nd LAK Failathon Poster. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 540–541.
Our main paper at the LAK conference looked at the state of evidence in the field. Drawing on the work collated in the LACE project Evidence Hub, it seems that there is, as yet, very little clear evidence that learning analytics improve learning or teaching. The paper concludes with a series of suggestions about how we can work as a community to improve the evidence base of the field.
The room was full to overflowing for our talk and for the other two talks in the session on the ethics of learning analytics. If you weren’t able to get in and you want to understand the links between jelly beans, a dead salmon, Bob Dylan, Buffy the Vampire Slayer and learning analytics, I shall share the link to the recorded session as soon as I have it.
Ferguson, Rebecca and Clow, Doug (2017). Where is the evidence? A call to action for learning analytics. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 56–65.
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.
Monday 13 March was the day of the second LAK Failathon, this time held at the LAK17 conference at Simon Fraser University in Vancouver. This year, we took the theme ‘Beyond Failure’ and the workshop led into a paper later in the conference and then to a crowd-sourced paper on how we can work to avoid failure both on individual projects and across the learning analytics community as a whole.
We also took a consciously international approach, and so workshop leaders included Doug Clow and me from Europe, Mike Sharkey from North America, Cecilia Aguerrebere from South America, Kirsty Kitto from Australia and Yong-Sang Cho from Asia.
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond failure: the 2nd LAK Failathon. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 504–505.
If you can’t access the workshop outline behind the paywall, contact me for a copy.
The 2nd LAK Failathon will build on the successful event in 2016 and extend the workshop beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Thus providing an alternative forum for practitioners and researchers to learn from each other’s failures can be very productive. The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, to help participants learn from each other’s mistakes. It was very successful, and there was strong support for running it as an annual event. This workshop will build on that success, with twin objectives to provide an environment for individuals to learn from each other’s failures, and also to co-develop plans for how we as a field can better build and deploy our evidence base.
A very busy week in Vancouver at the LAK17 (learning analytics and knowledge) conference kicked off with the all-day doctoral consortium on 14 March (funded by SoLAR and the NSF). I joined Bodong Chen and Ani Aghababyan as an organiser this year and we enjoyed working with the ten talented doctoral students from across the world who gained a place in the consortium.
- Alexander Whitelock-Wainwright: Students’ intentions to use technology in their learning: The effects of internal and external conditions
- Alisa Acosta: The design of learning analytics to support a knowledge community and inquiry approach to secondary science
- Daniele Di Mitri: Digital learning shadow: digital projection, state estimation and cognitive inference for the learning self
- Danielle Hagood: Learning analytics in non-cognitive domains
- Justian Knobbout: Designing a learning analytics capabilities model
- Leif Nelson: The purpose of higher education in the discourse of learning analytics
- Quan Nguyen: Unravelling the dynamics of learning design within and between disciplines in higher education using learning analytics
- Stijn Van Laer: Design guidelines for blended learning environments to support self-regulation: event sequence analysis for investigating learners’ self-regulatory behavior
- Tracie Farrell Frey: Seeking relevance: affordances of learning analytics for self-regulated learning
- Ye Xiong: Write-and-learn: promoting meaningful learning through concept map-based formative feedback on writing assignments
The intention of the doctoral consortium was to support and inspire doctoral students in their ongoing research efforts. The objectives were to:
- Provide a setting for mutual feedback on participants’ current research and guidance on future research directions from a mentor panel
- Create a forum for dialogue aimed at building capacity in the field with respect to current issues in learning analytics, ranging from methods of gathering analytics and interpreting them in relation to learning, to considering ethical issues and relaying the meaning of analytics in ways that impact teaching and learning
- Develop a supportive, multidisciplinary community of learning analytics scholars
- Foster a spirit of collaborative research across countries, institutions and disciplinary backgrounds
- Enhance participating students’ conference experience by connecting participants to other LAK attendees
Just back from a couple of trips to Luxembourg, where I was one of the team carrying out final reviews for the Lea’s Box and Eco projects. This was my third year reviewing Lea’s Box, but the final review was my first involvement with Eco.
Lea’s Box was ‘a 3-year research and development project (running from March 2014 to [January] 2016) funded by the European Commission. The project focussed on (a) making educational assessment and appraisal more goal-oriented, proactive, and beneficial for students, and (b) on enabling formative support of teachers and other educational stakeholders on a solid basis of a wide range of information about learners.’
Eco was ‘a European project based on Open Educational Resources (OER) that gives free access to a list of MOOC (Massive Open Online Courses) in 6 languages […] The main goal of this project is to broaden access to education and to improve the quality and cost-effectiveness of teaching and learning in Europe.’