On 23 January I presented at a joint symposium involving The Open University and the University of Gothenburg. Eleven participants from Gothenburg met with ten Open University researchers. Eight presentations, four from Gothenburg and four from The Open University, allowed discussion of areas of mutual interest.
My presentation focused on what research carried out by UK partners on the FutureLearn platform tells us. I presented a longer version of the talk to the FutureLearn Academic Network (FLAN) later in the week, so it is embedded in a later blog post.
Research Evidence on the Use of Learning Analytics: Implications for Education Policy brings together the findings of a literature review; case studies; an inventory of tools, policies and practices; and an expert workshop.
The report also provides an Action List for policymakers, practitioners, researchers and industry members to guide work in Europe.
Learning Analytics: Action List
Policy leadership and governance practices
- Develop common visions of learning analytics that address strategic objectives and priorities
- Develop a roadmap for learning analytics within Europe
- Align learning analytics work with different sectors of education
- Develop frameworks that enable the development of analytics
- Assign responsibility for the development of learning analytics within Europe
- Continuously work on reaching common understanding and developing new priorities
Institutional leadership and governance practices
- Create organisational structures to support the use of learning analytics and help educational leaders to implement these changes
- Develop practices that are appropriate to different contexts
- Develop and employ ethical standards, including data protection
Collaboration and networking
- Identify and build on work in related areas and other countries
- Engage stakeholders throughout the process to create learning analytics that have useful features
- Support collaboration with commercial organisations
Teaching and learning practices
- Develop learning analytics that make good use of pedagogy
- Align analytics with assessment practices
Quality assessment and assurance practices
- Develop a robust quality assurance process to ensure the validity and reliability of tools
- Develop evaluation checklists for learning analytics tools
Capacity building
- Identify the skills required in different areas
- Train and support researchers and developers to work in this field
- Train and support educators to use analytics to support achievement
Infrastructure development
- Develop technologies that enable development of analytics
- Adapt and employ interoperability standards
Other resources related to the LAEP project – including the LAEP Inventory of learning analytics tools, policies and practices – are available on Cloudworks.
Twitter identifies my top tweet, my top mention and my top media tweet. My followers appear to be most interested in globalised online learning.
‘Developing a strategic approach to MOOCs’ uses the work carried out at FutureLearn partner universities to identify nine priority areas for MOOC research and to consider how these can be developed in the future.
I was one of the editors of a special issue of the Journal of Interactive Media in Education (JIME) on Researching MOOCs. The special issue draws on the work of the FutureLearn Academic Network (FLAN), which is made up of academics at universities that are FutureLearn partners.
The special issue contains five papers.
On 14 December, Duygu Bektik successfully defended her thesis, and now only minor corrections stand between her and her doctorate.
Learning Analytics for Academic Writing through Automatic Identification of Meta-Discourse
When assessing student writing, tutors look for the ability to present well-reasoned arguments, which is signalled by elements of meta-discourse. Some natural language processing systems can detect rhetorical moves in scholarly texts, but no previous work had investigated whether these tools can analyse student writing reliably. Duygu’s thesis evaluates the Xerox Incremental Parser (XIP), sets out ways in which it could be changed to support the analysis of student writing, and proposes how its output could be delivered to tutors. It also investigates how tutors define the quality of undergraduate writing and identifies key elements that can be used to recognise good student writing in the social sciences.
On 13 December, I joined a Foresight Workshop on Learning Technologies in Luxembourg. The workshop was designed to help the European Commission to set and define future European strategic research and innovation priorities.
The workshop began with a series of ‘Moonshots’. Individual experts presented ambitious, yet realistic, targets for EU-funded learning technology research and innovation up to 2025. For each of these, we considered: What is the problem? How is it dealt with now? What difference would it make if this problem were addressed successfully?
We went on to merge our individual Moonshots into Constellations and then into Galaxies. We made links between the different ideas, connecting them with other international activities and trends, as well as with previous EU-funded work. I was interested to see that many of the experts from across Europe presented ideas associated with blockchain for learning, an innovation that was picked up in our recent Innovating Pedagogy report.
My moonshot focused on a series of problems: access to tertiary education is unequal, most people in Europe do not complete tertiary education and many people in Europe need to develop new skills. Massive open online courses (MOOCs) offer a potential solution, but these new approaches to learning require new approaches to teaching. Teachers need training and support to work effectively in these new environments. They also need proven models of good practice. Improving educator effectiveness on these courses has the potential to increase Europe’s capacity to respond to its priority areas. It also has the potential to open up education for millions by developing and sharing knowledge of how to teach at scale.