Archive for category Workshops
On 13 December, I joined a Foresight Workshop on Learning Technologies in Luxembourg. The workshop was designed to help the European Commission to set and define future European strategic research and innovation priorities.
The workshop began with a series of ‘Moonshots’. Individual experts presented ambitious, yet realistic, targets for EU-funded learning technology research and innovation up to 2025. For each of these, we considered: What is the problem? How is it dealt with now? What difference would it make if this problem were addressed successfully?
We went on to merge our individual Moonshots into Constellations and then into Galaxies. We made links between the different ideas, connecting them with other international activities and trends, as well as with previous EU-funded work. I was interested to see that many of the experts from across Europe presented ideas associated with blockchain for learning, a development that was picked up in our recent Innovating Pedagogy report.
My moonshot focused on a series of problems: access to tertiary education is unequal, most people in Europe do not complete tertiary education and many people in Europe need to develop new skills. Massive open online courses (MOOCs) offer a potential solution, but these new approaches to learning require new approaches to teaching. Teachers need training and support to work effectively in these new environments. They also need proven models of good practice. Improving educator effectiveness on these courses has the potential to increase Europe’s capacity to respond to its priority areas. It also has the potential to open up education for millions by developing and sharing knowledge of how to teach at scale.
I visited the University of Deusto in Bilbao, Spain, to give a keynote at the learning analytics summer institute there (LASI Bilbao 2016) on 28 June 2016. The event brought people together from the Spanish Network of Learning Analytics (SNOLA), which was responsible for organising the event, in conjunction with the international Society for Learning Analytics Research (SoLAR).
What does the future hold for learning analytics? In terms of Europe’s priorities for learning and training, analytics will need to support relevant, high-quality knowledge, skills and competences developed throughout lifelong learning. More specifically, they should improve the quality and efficiency of education and training, enhance creativity and innovation, and focus on learning outcomes in areas such as employability, active citizenship and well-being. This is a tall order and, in order to achieve it, we need to consider how our work fits into the larger picture. Drawing on the outcomes of two recent European studies, Rebecca will discuss how we can avoid potential pitfalls and develop an action plan that will drive the development of analytics that enhance both learning and teaching.
The series of LACE workshops on Ethics and Privacy in Learning Analytics (EP4LA) keeps expanding.
I worked with María Jesús Rodríguez-Triana on the programme for one of these events, which she ran with Denis Gillet at the 12th Joint European Summer School on Technology Enhanced Learning (JTEL Summer School) in Estonia, on 20 June.
This 90-minute workshop aims to give participants an overview of the ethical and privacy issues in Learning Analytics. Furthermore, the workshop allows participants to increase their awareness of how to implement LA solutions as researchers, practitioners or developers. It will consist of three parts:
Part 1 – Introduction: presentation of LA frameworks and guidelines for Learning Analytics regarding ethics and privacy.
Part 2 – Framework analyses: participants will be grouped to work on a specific framework. The teams will categorise those ethical and privacy issues that the participants are currently addressing in their practice, those that could be covered with low-to-medium effort, and those that constitute a challenge.
Part 3 – Discussion: An open discussion will follow, exploring the complexity of each framework and looking for potential ways of addressing these issues.
Next stop after the LAK conference was Kuala Lumpur in Malaysia. There I took part in an expert workshop from 2-3 May 2016, organised by the Commonwealth of Learning, developing guidelines for the quality assurance and accreditation of massive open online courses.
The purpose of this review is to identify quality measures and to highlight some of the tensions surrounding notions of quality, as well as the need for new ways of thinking about and approaching quality in MOOCs. It draws on the literature on both MOOCs and quality in education more generally in order to provide a framework for thinking about quality and the different variables and questions that must be considered when conceptualising quality in MOOCs. The review adopts a relativist approach, positioning quality as a measure for a specific purpose. The review draws upon Biggs’s (1993) 3P model to explore notions and dimensions of quality in relation to MOOCs — presage, process and product variables — which correspond to an input–environment–output model. The review brings together literature examining how quality should be interpreted and assessed in MOOCs at a more general and theoretical level, as well as empirical research studies that explore how these ideas about quality can be operationalised, including the measures and instruments that can be employed. What emerges from the literature are the complexities involved in interpreting and measuring quality in MOOCs and the importance of both context and perspective to discussions of quality.
Australia: Adam Brimo
Japan: Paul Kawachi
New Zealand: Nina Hood
Our second LACE workshop of LAK16 was the highly successful Failathon. The idea for this workshop emerged from an overview of learning analytics evidence provided by the LACE Evidence Hub. This suggested that the published evidence is skewed towards positive results, so we set out to find out whether this is the case.
A packed workshop discussed past failures. All accounts were governed by the Chatham House Rule – they could be reported outside the workshop as long as the source of the information was neither explicitly nor implicitly identified.
As in many fields, most papers in the learning analytics literature report success or, at least, read as if they are reporting success. This is almost certainly not because learning analytics research and activity are always successful. Generally, we report our successes widely, but keep our failures to ourselves. As Bismarck is alleged to have said: it is wise to learn from the mistakes of others. This workshop offers an opportunity for researchers and practitioners to share their failures in a lower-stakes environment, to help them learn from each other’s mistakes.
Clow, Doug; Ferguson, Rebecca; Macfadyen, Leah and Prinsloo, Paul (2016). LAK Failathon. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
Fifty people attended the workshop, including invited experts (expert presentations), representatives of current European-funded projects in the field of learning analytics (project presentations), and representatives of the European Commission.
The workshop dealt with the current state of the art in learning analytics, the prospects for the implementation of learning analytics in the next decade, and the potential for European policy to guide and support the take-up and adaptation of learning analytics to enhance education.
The workshop began with a review of current learning analytics work by participants and went on to consider how learning analytics work can be taken forward in Europe (presentation on the LAEP project).
Participants at the workshop identified immediate issues for learning analytics in Europe. They set out considerations to be taken into account when developing learning analytics, made recommendations for learning analytics work in Europe and then identified both short- and long-term policy priorities in the area.
Immediate issues for LA in Europe
Framework for development: A European roadmap for learning analytics development would help us to build and develop a set of interoperable learning analytics tools that are tailored for the needs of Europe and that have been shown to work in practice.
Stakeholder involvement: There is a need to bring a wider range of stakeholders on board by reaching out to groups including teachers, students, staff, employers and parents. Our current engagement with stakeholders is too limited.
Data protection and surveillance: As legislation changes and individuals become more aware of data use, institutions need to understand their responsibilities and obligations with regard to data privacy and data protection.
Empirical evidence and quality assurance: More empirical evidence is needed about the effects of learning analytics, in order to support a process of quality assurance.
Considerations for the development of LA
- Learning analytics can change or reinforce the status quo
- Learning analytics should enhance teaching, not replace it
- It is our duty to act upon the data we possess
- Desirable learning outcomes must be identified
- Be clear why we are collecting and analysing data
- Bring the data back to the learner
- Intelligent systems need human and cultural awareness
- Impressive data are not enough
Recommendations for LA work in Europe
- Undertake qualitative studies to understand how learning analytics can be aligned with the perceived purpose of education in different contexts, and which aspects of different educational contexts will support or constrain the use of learning analytics.
- Publicise existing evaluation frameworks for learning analytics and develop case studies that can be used to enrich and refine these frameworks.
- Develop forms of quality assurance for learning analytics tools and for the evidence that is shared about these tools.
- Identify the limitations of different datasets and analytics and share this information clearly with end users.
- Explore ways of combining different datasets to increase the value of learning analytics for learners and teachers.
- Extend to different sectors of education the work currently being carried out in the higher education sector to identify the different elements that need to be taken into account when deploying learning analytics.
- Develop analytics, and uses for analytics, that delight and empower users.
Short-term policy priorities
Innovative pedagogy: Top priority is the need for novel pedagogy that drives innovation and the use of data to solve practical problems.
Evidence hub: Second priority is to secure continuing funding for a site that brings together evidence of what works and what does not in the field of learning analytics.
Data privacy: Participants considered that a clear statement is needed from privacy commissioners about controls to protect learners, teachers and society.
Orchestration of grants: The European grants system could better support the development of learning analytics if grants were orchestrated around an agreed reference model.
Crowd-sourced funding support: Set up a system for crowd-sourcing funding of tools teachers need, with EU top-up funding available for successful candidates.
21st-century skills: Focus on developing learning analytics for important skills and competencies that are difficult to measure, particularly 21st-century skills.
Open access standards: Standards need to be put into practice for analytics across Europe, with an open access forum that will enable the creation of standards from practice.
Ambassadors: We need more outreach, with ministries and politicians spreading the word and encouraging local communities and schools to engage.
Long-term policy priorities
Teacher education: The top priority in the longer term is for media competencies and learning analytics knowledge to be built into training for both new and existing teachers.
Decide which problems we want to solve: In order to develop the field of learning analytics we need to have collective discussions on the directions in which we want to go.
Facilitate data amalgamation: More consideration is needed of how to combine data sources to provide multi-faceted insights into the problems we seek to solve.
Identify success cases and methodologies that give us a solid foundation: We need a coordinated approach to quality assurance and to the identification of successful work.
Several accounts of the workshop are available online, dealing with the morning of day one, the afternoon of day one, day one as a whole, the morning of day two, the afternoon of day two and day two as a whole.
On 28 October I ran a pre-conference workshop at the 14th European Conference on e-Learning (held at the University of Hertfordshire) on ‘Learning design and learning analytics: building the links with MOOCs’.
To give a focus to the workshop, I aimed to choose a FutureLearn MOOC on a subject that everyone would know a little about and no one would know a lot about. As it was three days after the 600th anniversary of Agincourt (a famous battle in English history that fans of Shakespeare may know of through his play, Henry V) I picked the University of Southampton’s MOOC on the subject, ‘Agincourt 1415: Myth and Reality’.
I had reckoned without the international scope of the ECEL conference – I had picked a subject that most of my audience knew nothing about, and that held little interest for them. Nevertheless, they bravely grappled with issues of learning design related to medieval muster rolls, ancient armour and the question of whether war crimes existed before they were defined in law.
This hands-on workshop will work with learning design tools and with massive open online courses (MOOCs) on the FutureLearn platform to explore how learning design can be used to influence the choice and design of learning analytics. This workshop will be of interest to people who are involved in the design or presentation of online courses, and to those who want to find out more about learning design, learning analytics or MOOCs.