Archive for category Conferences
European Distance Learning Week kicked off today with a panel on the challenges and opportunities of innovation. The week is organised by the European Distance and E-learning Network (EDEN) in collaboration with the United States Distance Learning Association.
You can watch the panel here.
As one of the panelists, I talked about our work on the Innovating Pedagogy reports, identifying ten pedagogies each year that have the potential to change practice. This year’s report goes to the printers at the end of this week, and will be out on 7 December.
“At first glance, the speed of developments in Europe is overwhelming. Pre-existing conditions created in education established immense possibilities for innovations on the continent. Very complex and concise solutions are already in place. If we think about Open Education, we have a variety of forms on offer (MOOCs, OER, open online learning, virtual mobility, remote experiments and science education, to name a few), as well as regulations facilitating collaboration of education providers on all levels of education (Bologna process, credit transfer, prior and non-formal learning recognition).
“ET2020 open coordination groups already proved their important role in fostering developments in member states. The working group on Digital Skills and Competences addressed transversal issues and collaboration on innovation development and implementation through all levels of education. New instruments and tools were established to agree upon digitally competent organizations; citizens, teachers and learners can suggest new training schemes and certification possibilities, as well as recognition of digitally skilled employees in companies.
“The opening panel of EDLW addresses these speedy developments, unbundling solutions, micro-, meso-, and macro-level discussions and the complexity of Europe.”
Moderator: Airina Volungevičienė, EDEN President
- Sumathi Subramaniam, European Commission, DG Education, Youth, Sport and Culture, Innovation and EIT
- Brikena Xhomaqi, Director – Lifelong Learning Platform
- Rebecca Ferguson, Senior Lecturer, Institute of Educational Technology, The Open University
- Sharon Goldstein, Berkeley College Online
- Marci Powell, USDLA
- Timothy Read, Associate Pro-Vice Chancellor of Methodology & Technology, National Distance Education University (UNED), Spain
I received an open badge for my participation – an EDLW facilitator badge (below).
Together with Liz FitzGerald and Eileen Scanlon, I chaired the 38th annual conference of the Computers and Learning research group (CALRG), which took place at The Open University on 16–18 June 2017. We enjoyed keynote presentations from Siân Bayne, Jenny Preece and Ben Shneiderman.
Full details of the conference, together with links to all the abstracts and to many of the presentations, are available on Cloudworks.
The third day of the conference was FutureLearn Academic Network (FLAN) day. This annual conference event prioritises the work of doctoral students within FLAN. This year, it brought together presenters from Bath, Lancaster, Purdue (USA), Sheffield, Southampton, The Open University, and Warwick.
Our discussant was Professor Rupert Wegerif, University of Cambridge.
Members of FLAN can access the video of the event.
Scattered between my research presentations at LAK17 was my work as a member of the executive for the Society for Learning Analytics Research (SoLAR). The executive met daily during the conference – it is the only chance we have each year for face-to-face meetings. The LAK conferences also provide a venue for the AGM of the society and, despite the size of the room where the AGM was held, it was standing room only for most of the meeting.
The executive also have a role to play in decisions about the conference itself, as well as acting as reviewers on the programme committee and chairs for the different sessions. Next year, at LAK18 in Vancouver, I shall be taking on a bigger role, as one of the programme chairs for the conference.
The picture shows me with half the SoLAR Executive at the post-LAK17 review meeting.
The European FP7-funded Learning Analytics Community Exchange (LACE) project came to an end last June. Since then, we have become a special interest group (SIG) of the Society for Learning Analytics Research (SoLAR) and we are now the Learning Analytics Community Europe (LACE).
Although the loss of large-scale funding has meant scaling down our activities, we have still been active and our Twitter account reflects some of that work – including presentations on European learning analytics work in China, Japan and South Korea.
The LAK17 conference provided a chance for eight of the international team to get together and plan our next event, a workshop in our ethics and privacy in learning analytics series (EP4LA) that we are submitting to this year’s ECTEL conference.
Our LAK Failathon workshop at the start of LAK17 generated the basic ideas for a poster on how the field of learning analytics can increase its evidence base and avoid failure.
We took the poster to the LAK17 Firehose session, where Doug Clow provided a lightning description of it, and we then used the poster to engage people in discussion about the future of the field.
Despite the low production quality of the poster (two sheets of flip chart paper, some post-it notes and a series of stickers to mark agreement), its interactive quality obviously appealed to participants and we won best poster award. :-)
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond Failure: The 2nd LAK Failathon Poster. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 540–541.
Our main paper at the LAK conference looked at the state of evidence in the field. Drawing on the work collated in the LACE project Evidence Hub, it seems that there is, as yet, very little clear evidence that learning analytics improve learning or teaching. The paper concludes with a series of suggestions about how we can work as a community to improve the evidence base of the field.
The room was full to overflowing for our talk and for the other two talks in the session on the ethics of learning analytics. If you weren’t able to get in and you want to understand the links between jelly beans, a dead salmon, Bob Dylan, Buffy the Vampire Slayer and learning analytics, I shall share the link to the recorded session as soon as I have it.
Ferguson, Rebecca and Clow, Doug (2017). Where is the evidence? A call to action for learning analytics. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 56–65.
Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.
Monday 13 March was the day of the second LAK Failathon, this time held at the LAK17 conference at Simon Fraser University in Vancouver. This year, we took the theme ‘Beyond Failure’ and the workshop led into a paper later in the conference and then to a crowd-sourced paper on how we can work to avoid failure both on individual projects and across the learning analytics community as a whole.
We also took a consciously international approach, and so workshop leaders included Doug Clow and me from Europe, Mike Sharkey from North America, Cecilia Aguerrebere from South America, Kirsty Kitto from Australia and Yong-Sang Cho from Asia.
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond failure: the 2nd LAK Failathon. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 504–505.
If you can’t access the workshop outline behind the paywall, contact me for a copy.
The 2nd LAK Failathon will build on the successful event in 2016 and extend the workshop beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Thus providing an alternative forum for practitioners and researchers to learn from each other’s failures can be very productive. The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, to help participants learn from each other’s mistakes. It was very successful, and there was strong support for running it as an annual event. This workshop will build on that success, with twin objectives to provide an environment for individuals to learn from each other’s failures, and also to co-develop plans for how we as a field can better build and deploy our evidence base.