Archive for category Educational Futures
Our LAK Failathon workshop at the start of LAK17 generated the basic ideas for a poster on how the field of learning analytics can increase its evidence base and avoid failure.
We took the poster to the LAK17 Firehose session, where Doug Clow provided a lightning description of it, and we then used the poster to engage people in discussion about the future of the field.
Despite the low production quality of the poster (two sheets of flip chart paper, some post-it notes and a series of stickers to mark agreement), its interactive quality obviously appealed to participants and we won the best poster award. :-)
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond Failure: The 2nd LAK Failathon Poster. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 540–541.
Monday 13 March was the day of the second LAK Failathon, this time held at the LAK17 conference at Simon Fraser University in Vancouver. This year, we took the theme ‘Beyond Failure’ and the workshop led into a paper later in the conference and then to a crowd-sourced paper on how we can work to avoid failure both on individual projects and across the learning analytics community as a whole.
We also took a consciously international approach, and so workshop leaders included Doug Clow and me from Europe, Mike Sharkey from North America, Cecilia Aguerrebere from South America, Kirsty Kitto from Australia and Yong-Sang Cho from Asia.
Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond failure: the 2nd LAK Failathon. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 504–505.
If you can’t access the workshop outline behind the paywall, contact me for a copy.
The 2nd LAK Failathon will build on the successful event in 2016 and extend the workshop beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence.

Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Providing an alternative forum for practitioners and researchers to learn from each other’s failures can therefore be very productive.

The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, helping participants learn from each other’s mistakes. It was very successful, and there was strong support for running it as an annual event. This workshop will build on that success, with twin objectives: to provide an environment for individuals to learn from each other’s failures, and to co-develop plans for how we as a field can better build and deploy our evidence base.
The Innovating Pedagogy 2016 report. Now in Chinese.
On 27 January, I travelled to Pompeu Fabra university in Barcelona for a meeting of the FutureLearn Academic Network (FLAN) on The Educator Experience. This was the first FLAN meeting to take place outside the UK and it was held at UPF’s Poblenou Campus. The event was organised by CLIK (Center for Learning, Innovation and Knowledge) and the members of the Educational Technologies section within the Interactive Technologies Research Group of UPF.
During the meeting, FutureLearn partners reflected on the impact and research possibilities of MOOCs in the field of education. Sir Timothy O’Shea, Principal and Vice-Chancellor of the University of Edinburgh, gave the keynote speech, describing Edinburgh’s developing MOOC strategy, including the production of 64 online master’s courses.
I talked about our recent report MOOCs: What the Research of FutureLearn’s UK Partners Tells Us.
If you have access to the FutureLearn Partners’ blog, a video of the meeting and summary notes of the sessions are available.
On 25 January, I presented at the BETT trade show on An action plan for learning analytics. If you would like to introduce learning analytics at your institution, where should you start? Drawing on recent studies that consulted experts worldwide, I outlined an action plan for analytics and identified the key points to keep in mind.
My talk formed part of the HE Leaders Summit, a section of the event that was designed to address some of the most significant challenges currently facing senior leaders across Higher Education.
Research Evidence on the Use of Learning Analytics: Implications for Education Policy brings together the findings of a literature review; case studies; an inventory of tools, policies and practices; and an expert workshop.
The report also provides an Action List for policymakers, practitioners, researchers and industry members to guide work in Europe.
Learning Analytics: Action List
Policy leadership and governance practices
- Develop common visions of learning analytics that address strategic objectives and priorities
- Develop a roadmap for learning analytics within Europe
- Align learning analytics work with different sectors of education
- Develop frameworks that enable the development of analytics
- Assign responsibility for the development of learning analytics within Europe
- Continuously work on reaching common understanding and developing new priorities
Institutional leadership and governance practices
- Create organisational structures to support the use of learning analytics and help educational leaders to implement these changes
- Develop practices that are appropriate to different contexts
- Develop and employ ethical standards, including data protection
Collaboration and networking
- Identify and build on work in related areas and other countries
- Engage stakeholders throughout the process to create learning analytics that have useful features
- Support collaboration with commercial organisations
Teaching and learning practices
- Develop learning analytics that make good use of pedagogy
- Align analytics with assessment practices
Quality assessment and assurance practices
- Develop a robust quality assurance process to ensure the validity and reliability of tools
- Develop evaluation checklists for learning analytics tools
Capacity building
- Identify the skills required in different areas
- Train and support researchers and developers to work in this field
- Train and support educators to use analytics to support achievement
Infrastructure
- Develop technologies that enable development of analytics
- Adapt and employ interoperability standards
Other resources related to the LAEP project – including the LAEP Inventory of learning analytics tools, policies and practices – are available on Cloudworks.
‘Developing a strategic approach to MOOCs’ uses the work carried out at these universities to identify nine priority areas for MOOC research and to show how these can be developed in the future: