Archive for category Funding

Research Evidence on the Use of Learning Analytics: Implications for Education Policy

The final report on our study of learning analytics for European educational policy (LAEP) is now out.

Research Evidence on the Use of Learning Analytics: Implications for Education Policy brings together the findings of a literature review; case studies; an inventory of tools, policies and practices; and an expert workshop.

The report also provides an Action List for policymakers, practitioners, researchers and industry members to guide work in Europe.

Learning Analytics: Action List

Policy leadership and governance practices

  • Develop common visions of learning analytics that address strategic objectives and priorities
  • Develop a roadmap for learning analytics within Europe
  • Align learning analytics work with different sectors of education
  • Develop frameworks that enable the development of analytics
  • Assign responsibility for the development of learning analytics within Europe
  • Continuously work on reaching common understanding and developing new priorities

Institutional leadership and governance practices

  • Create organisational structures to support the use of learning analytics and help educational leaders to implement these changes
  • Develop practices that are appropriate to different contexts
  • Develop and employ ethical standards, including data protection

Collaboration and networking

  • Identify and build on work in related areas and other countries
  • Engage stakeholders throughout the process to create learning analytics that have useful features
  • Support collaboration with commercial organisations

Teaching and learning practices

  • Develop learning analytics that make good use of pedagogy
  • Align analytics with assessment practices

Quality assessment and assurance practices

  • Develop a robust quality assurance process to ensure the validity and reliability of tools
  • Develop evaluation checklists for learning analytics tools

Capacity building

  • Identify the skills required in different areas
  • Train and support researchers and developers to work in this field
  • Train and support educators to use analytics to support achievement

Infrastructure development
  • Develop technologies that enable development of analytics
  • Adapt and employ interoperability standards
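As a concrete illustration of the interoperability point above: one widely adopted standard for exchanging learning-activity data is the Experience API (xAPI), which records learning events as actor–verb–object statements. The sketch below (plain Python; the learner, course and helper names are illustrative, not drawn from the report) assembles and checks a statement in that basic shape.

```python
# Minimal sketch of an xAPI-style learning-activity statement.
# The actor-verb-object shape follows the xAPI specification; the
# specific learner and course identifiers here are invented examples.

def make_statement(actor_email, verb_id, activity_id):
    """Assemble a dictionary in the basic xAPI statement shape."""
    return {
        "actor": {"mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id},
        "object": {"id": activity_id},
    }

def is_valid(statement):
    """Check that the three required top-level parts are present."""
    return all(k in statement for k in ("actor", "verb", "object"))

stmt = make_statement(
    "learner@example.org",
    "http://adlnet.gov/expapi/verbs/completed",
    "http://example.org/courses/laep-101",
)
print(is_valid(stmt))  # True
```

Because every system emits the same actor–verb–object shape, tools from different vendors can pool their activity records — which is the point of the "adapt and employ interoperability standards" action.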

Other resources related to the LAEP project – including the LAEP Inventory of learning analytics tools, policies and practices – are available on Cloudworks.


Funded studentship opportunity – researching MOOCs at the OU

MOOC educator poster

The Open University is advertising six Leverhulme doctoral scholarships in open world learning with a closing date for applications of Monday 9 March 2015. These are full-time, fully funded studentships, leading to a PhD.

One of the named topics is ‘Educator roles in open online courses’ and the description is:

“What roles do educators play in massive open online courses (MOOCs)? How can they be most effective in supporting learners to achieve their learning goals? In these open online settings, teaching is carried out by a team of educators, including academic lead, course presenter, moderator, facilitator and the learners themselves. These roles are still being developed, and there is a pressing need to identify evidence-based good practice. The successful candidate will use data from a range of MOOCs to answer the questions above, and will have opportunities to work with the FutureLearn Academic Network, an international team of MOOC researchers.”

If you are interested in applying, you need to provide a short research proposal explaining how this area fits the overall theme of Open World Learning and how you intend to conduct research on the topic selected. See the website for more specific details about applying.

When putting together an application, you may find it useful to take a look at these two papers: Taking on different roles: how educators position themselves in MOOCs and Innovative pedagogy at massive scale: teaching and learning in MOOCs.



Learning analytics data sharing workshop

Concept mapping at the LACE workshop

On 16-17 September, I was in Graz with the Learning Analytics Community Exchange (LACE). Before our consortium meeting, we held the 1st Learning Analytics Data Sharing Workshop. This brought people together from across Europe to discuss possibilities for data sharing.

The workshop was designed to act as a bridge between research and practical action. It also dealt with the technical, operational, business, policy and governance challenges involved with data sharing – with a particular focus on privacy issues.

The workshop was followed by a consortium meeting, and plans for developing this Europe-wide learning analytics community further.


LACE – Learning Analytics Community Exchange

The LACE team

From 20-22 January, I was in Brussels for the kick-off meeting of the Learning Analytics Community Exchange (LACE).

The LACE project brings together existing key European players in the field of learning analytics and educational data mining (EDM), who are committed to building communities of practice and sharing emerging best practice in order to make progress towards four objectives:

1. Promote knowledge creation and exchange
2. Increase the evidence base
3. Contribute to the definition of future directions
4. Build consensus on interoperability and data sharing

This will involve organising a range of activities designed to integrate people carrying out or making use of learning analytics and EDM research and development. LACE will also develop an ‘evidence hub’ that will bring together a knowledge base of evidence in the field, and members will explore plausible futures for the field.

Core partners

Open Universiteit Nederland, Netherlands
Cetis, the Centre for Educational Technology and Interoperability Standards at the University of Bolton, UK
The Open University, UK
Infinity Technology Solutions, Italy
Skolverket, the Swedish National Agency for Education, Sweden
Kennisnet, Netherlands
Høgskolen i Oslo og Akershus, Norway
ATiT, Audiovisual Technologies, Informatics and Telecommunications, Belgium
EDEN, the European Distance Education Network, Hungary


LASI: Panel on analytics for 21st-century skills

While at the Learning Analytics Summer Institute at Stanford, I participated in a panel on Analytics for 21st-century Skills. The panel was chaired by Caroline Haythornthwaite and my fellow panellists were Ruth Deakin-Crick (University of Bristol) and Peter Foltz (Pearson).

My section of the panel focused on our work with EnquiryBlogger. This tool, built on the WordPress blogging platform, can be used to help structure knowledge construction, and to reflect on the emotions and dispositions that form part of the learning process.

You can read Doug Clow’s liveblog of the panel – one of a series of posts covering the whole of LASI13 – or watch a replay of the event.


LASI: Workshop on social learning analytics

I spent last week in California at the Center for Educational Research at Stanford (CERAS), attending the Learning Analytics Summer Institute. This was a strategic five-day event, July 1-5, 2013, co-organized by SoLAR and Stanford University. The twin objectives of the event were to build the field of learning analytics and to develop the skills and knowledge of participants so that they can go on to research and teach in the field.

Together with Caroline Haythornthwaite, Stephanie Teasley, Shane Dawson and Dan Suthers, I ran an afternoon workshop on Social Learning Analytics. My section focused on discourse analytics and disposition analytics.

I returned to a recurrent theme of my analytics presentations – don’t start with the data, start with the pedagogy. In this case, starting points could be:

  • How do people learn socially and in social situations?
  • How can we use big data to facilitate that process?
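To make the "start with the pedagogy" point concrete: one simple social learning analytic is to count how many replies each participant's forum posts attract, as a rough proxy for who is sparking social interaction. A minimal sketch in plain Python — the data and the choice of metric are invented for illustration, not taken from the workshop:

```python
from collections import Counter

# Each (replier, original_poster) pair records one reply in a course forum.
# The participants and replies here are invented for illustration.
replies = [
    ("bea", "ali"), ("cam", "ali"), ("ali", "bea"),
    ("dee", "ali"), ("cam", "bea"),
]

# In-degree: how many replies each poster attracted -- a crude
# indicator of who is prompting discussion in the group.
in_degree = Counter(poster for _, poster in replies)

for poster, count in in_degree.most_common():
    print(poster, count)  # ali 3, then bea 2
```

The pedagogical question ("who is prompting social learning?") comes first; the data and the metric are chosen to serve it, rather than the other way round.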

Our workshop discussion focused on how social learning analytics might be implemented in a physical space. My presentation mentions a research project in the building where I work, which influenced people’s behaviour in the building through the use of twinkly lights and real-time displays of behavioural data. If you are interested in finding out more, the researchers have published details of the project.

, , , ,

1 Comment

Checking grant bids

This week I shall be one of the presenters at a university Bidding for Funding workshop on ‘Building your track record with funders’. Below is the checklist for grant applications that I will be circulating to participants.

Quick-check questions

  1. What story am I telling?
  2. Who is the audience?
  3. Why does it matter?
  4. Why now?
  5. Why me / us?

Have I:

  • Formulated the problem clearly?
  • Established appropriate intellectual aims?
  • Set the problem in the context of contemporary scientific and theoretical debates?
  • Explained what the research will do – to whom or what – and why?
  • Justified my selection of staff and / or collaborators?
  • Demonstrated the ways in which this work will build on existing research?
  • Clearly and concisely set out appropriate, practical and attainable aims / objectives?
  • Shown how my research will relate to and deliver these aims and objectives?
  • Developed a well thought-out research design in which there is a reasoned and realistic explanation of the scale, timing and resources necessary?
  • Provided a full and detailed description of the proposed research methods?
  • Defended my research design and shown why others are not appropriate?
  • Highlighted any innovation in the methodology I am planning to use?
  • Justified the quality, validity, reliability and relevance of this research?
  • Considered the possibility of using existing data sources?
  • Set out a clear and systematic approach to the analysis of data?
  • Shown how my approach to analysis fits the research design?
  • Thought about the ethics of what I plan to do?
  • Addressed any sensitive issues or potential problems?
  • Fully consulted on these issues and obtained approval if required?
  • Provided written confirmation that access will be given where necessary?
  • Identified and planned for the skills and competencies required?
  • Highlighted potential difficulties and discussed how they will be handled?
  • Demonstrated the ways in which this research will make a contribution to the area?
  • Identified people outside the academic community who might use this research?
  • Involved / consulted potential users of this research?
  • Arranged for those users to continue to be involved in an appropriate way?
  • Explained why this research will be of interest to this funder?
  • Investigated possibilities for co-funding the research?
  • Provided a good quality, up-to-date bibliography?
  • Provided a clear dissemination strategy that will engage all interested parties?
  • Demonstrated ways in which my research will make an impact?
  • Considered ways of making my data and my publications open access?
  • Checked the spelling, grammar and style of my proposal?
  • Identified potential referees, and justified their selection?
  • Conveyed my genuine interest in, understanding of and enthusiasm for the research?
