Archive for category Analytics
Research Evidence on the Use of Learning Analytics: Implications for Education Policy brings together the findings of a literature review; case studies; an inventory of tools, policies and practices; and an expert workshop.
The report also provides an Action List for policymakers, practitioners, researchers and industry members to guide work in Europe.
Learning Analytics: Action List
Policy leadership and governance practices
- Develop common visions of learning analytics that address strategic objectives and priorities
- Develop a roadmap for learning analytics within Europe
- Align learning analytics work with different sectors of education
- Develop frameworks that enable the development of analytics
- Assign responsibility for the development of learning analytics within Europe
- Continuously work on reaching common understanding and developing new priorities
Institutional leadership and governance practices
- Create organisational structures to support the use of learning analytics and help educational leaders to implement these changes
- Develop practices that are appropriate to different contexts
- Develop and employ ethical standards, including data protection
Collaboration and networking
- Identify and build on work in related areas and other countries
- Engage stakeholders throughout the process to create learning analytics that have useful features
- Support collaboration with commercial organisations
Teaching and learning practices
- Develop learning analytics that makes good use of pedagogy
- Align analytics with assessment practices
Quality assessment and assurance practices
- Develop a robust quality assurance process to ensure the validity and reliability of tools
- Develop evaluation checklists for learning analytics tools
- Identify the skills required in different areas
- Train and support researchers and developers to work in this field
- Train and support educators to use analytics to support achievement
- Develop technologies that enable development of analytics
- Adapt and employ interoperability standards
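The last point, on interoperability standards, can be made concrete. One widely used specification in this space is the Experience API (xAPI), which records learning events as actor–verb–object statements. As a minimal illustration (the learner, activity and identifiers below are invented, not drawn from the LAEP report):

```python
import json

# A minimal xAPI-style learning-record statement. The field names follow
# the Experience API specification; the actor and activity values are
# made up purely for illustration.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.org",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.org/courses/analytics-101",
    },
}

# Statements are exchanged as JSON between tools and learning record stores.
print(json.dumps(statement, indent=2))
```

Because any tool that emits statements in this shape can feed any compliant learning record store, standards like this are what make analytics portable across institutions and vendors.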
Other resources related to the LAEP project – including the LAEP Inventory of learning analytics tools, policies and practices – are available on Cloudworks.
Twitter identifies my top tweet, my top mention and my top media tweet. My followers appear to be most interested in globalised online learning.
Great to see this year’s Innovating Pedagogy 2016 report out. This report, which I co-authored with colleagues at The Open University, highlights ten trends that will impact education over the next decade. These include Design Thinking, Productive Failure, Formative Analytics and Translanguaging. The report also presents evidence to inform decisions about which pedagogies to adopt. The pedagogies range from ones already being tested in classrooms, such as learning through video games, to ideas for the future, like adapting blockchain technology for trading educational reputation.
This year, the report has been written in collaboration with the Learning Sciences Lab, National Institute of Education, Singapore.
The ten trends covered this year are:
- Learning through social media: Using social media to offer long-term learning opportunities
- Productive failure: Drawing on experience to gain deeper understanding
- Teachback: Learning by explaining what we have been taught
- Design thinking: Applying design methods in order to solve problems
- Learning from the crowd: Using the public as a source of knowledge and opinion
- Learning through video games: Making learning fun, interactive and stimulating
- Formative analytics: Developing analytics that help learners to reflect and improve
- Learning for the future: Preparing students for work and life in an unpredictable future
- Translanguaging: Enriching learning through the use of multiple languages
- Blockchain for learning: Storing, validating and trading educational reputation
The PELARS project (Practice-based Experiential Learning Analytics Research And Support) invited me to Brussels for their Policies for using Big Data event on 9 November. The workshop aimed to raise awareness of the potential of data produced by learning technologies to catalyse the effective design of adaptive teaching, learning and assessment at scale. It also aimed to bring together people interested in exploring the state of the art of learning analytics, and to inform them about opportunities and barriers to adoption.
I chaired the panel discussion at the event, and was also able to talk to participants about the LACE project, following a presentation on LACE by Hendrik Drachsler.
Il-Hyun talked about the problems associated with learning analytics in a country where grades are allocated in relation to a normal distribution curve – so if one student’s grades go up, another student’s grades will go down – and where competition to enter universities is so intense that retention is not viewed as a problem.
While I was in Seoul in September, I took part in the Asian Learning Analytics Summer Institute (LASI Asia). I was joined there by members of the LACE team, who included the event as part of the LACE tour of Asia, which also took in Japan and Korea.
During LASI Asia, I gave a talk about what is on the horizon for learning analytics. This went into more detail, and was aimed at a more specialist audience, than my talk at e-Learning Korea. I also took part in a couple of panel discussions. The first was on how to build an international community on learning analytics research, and the second was on the achievements of learning analytics research and next steps.
There is general agreement that the importance of learning analytics is likely to increase in the coming decade. However, little guidance for policy makers has been forthcoming from the technologists, educationalists and teachers who are driving the development of learning analytics. The Visions of the Future study was carried out by the LACE project in order to provide some perspectives that could feed into the policy process.
The study took the form of a ‘policy Delphi’, which is to say that it was not concerned with certainty about the future, but rather with understanding the trends and issues that will drive the field forward in the coming years. The project partners developed eight visions of the future of learning analytics in 2025. These visions were shared with invited experts and LACE contacts through an online questionnaire, and stakeholders were also consulted at events. Respondents were asked to rate the visions in terms of their feasibility and desirability, and to identify the actions that should be taken in the light of their judgements. In total, 487 responses to the visions were received from 133 people. The views of the respondents on how the future may evolve are both interesting and entertaining. More significantly, analysis of the ratings and free-text responses showed that, among the experts and practitioners who engaged in the study, there was consensus around a number of points which are shaping the future of learning analytics.
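The core of aggregating a policy Delphi of this kind is straightforward: each vision accumulates feasibility and desirability ratings, which are then summarised to locate points of consensus. A minimal sketch of that step (the vision names, rating scale and numbers below are invented for illustration, not taken from the LACE data):

```python
from statistics import mean, stdev

# Hypothetical ratings on a 1-5 scale (1 = low, 5 = high). The LACE study
# collected 487 responses from 133 people; these values are invented.
ratings = {
    "Vision A": {"feasibility": [4, 5, 3, 4], "desirability": [5, 5, 4, 5]},
    "Vision B": {"feasibility": [2, 1, 3, 2], "desirability": [4, 3, 5, 4]},
}

for vision, scores in ratings.items():
    f, d = scores["feasibility"], scores["desirability"]
    # A low standard deviation on a dimension is one simple proxy for
    # consensus among respondents; the mean shows the overall judgement.
    print(f"{vision}: feasibility {mean(f):.2f} (sd {stdev(f):.2f}), "
          f"desirability {mean(d):.2f} (sd {stdev(d):.2f})")
```

In a real policy Delphi the quantitative summary would sit alongside coding of the free-text responses, which is where the reasoning behind the ratings emerges.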
1. There is a lot of enthusiasm for Learning Analytics, but concern that its potential will not be fulfilled. It is therefore appropriate for policy makers to take a role.
2. Policies and infrastructure are necessary to strengthen the rights of the data subject.
3. Interoperability specifications and open infrastructures are an essential enabling technology. These can support the rights of the data subject, and ensure control of analytics processes at the appropriate level.
4. Learning analytics should not imply automation of teaching and learning.
The full results of the study are published in a report at http://www.laceproject.eu/deliverables/d3-2-visions-of-the-future-2/.
In this session the visions explored by the LACE study will be presented, the conclusions discussed, and the audience will take part in an impromptu mapping of the most desirable and feasible vision of the future for learning analytics in Asia.
Learning analytics involves the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and early adopters around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in April 2006, we had begun developing learning analytics for 2016, we might not have planned specifically for learning with and through social networks (Twitter was launched in July 2006), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). By thinking ahead and by consulting with experts, though, we might have come pretty close by taking into account existing work on networked learning, mobile learning and connectivism.

In this talk, Rebecca will introduce a range of scenarios that explore different ways in which learning analytics could develop in the future. She will share the results of an international Policy Delphi study, which was designed for the systematic solicitation and collation of informed judgements on visions of learning analytics in 2025. The study explored the underlying assumptions and information leading to differing judgements on learning analytics, and brought those judgements together. The findings of the Policy Delphi, together with other studies, are now being used to develop action plans that will help us to develop analytics to support learners and educators in the future.
While at the conference, I also took part in a panel discussion entitled ‘Next step of e-Learning for Smart, Connected World’.