Archive for category Educational Futures

Vital learning analytics

On 17 July, I presented at ‘Analytics in learning and teaching: the role of big data, personalized learning and the future of the teacher’.

This event was held at the University of Central Lancashire (UCLAN) in Preston, and was organised by the VITAL project (Visualisation tools and analytics to monitor language learning and teaching).

My talk was on ‘Learning analytics: planning for the future’.

Abstract

What does the future hold for learning analytics? In terms of Europe’s current priorities for education and training, learning analytics will need to support relevant and high-quality knowledge, skills and competences developed throughout lifelong learning. More specifically, they should help improve the quality and efficiency of education and training, enhance creativity and innovation, and focus on learning outcomes in areas such as linguistic abilities, cultural awareness and active citizenship. This is a challenging agenda that requires us to look beyond our immediate priorities and institutional goals. To address it, we need to consider how our work fits into the larger picture. Drawing on the outcomes of two recent European studies, Rebecca will discuss how we can develop an action plan that will drive the development of analytics that enhance both learning and teaching.


LAK Failathon poster

Our LAK Failathon workshop at the start of LAK17 generated the basic ideas for a poster on how the field of learning analytics can increase its evidence base and avoid failure.

We took the poster to the LAK17 Firehose session, where Doug Clow provided a lightning description of it, and we then used the poster to engage people in discussion about the future of the field.

Despite the low production quality of the poster (two sheets of flip chart paper, some post-it notes and a series of stickers to mark agreement), its interactive quality clearly appealed to participants and we won the best poster award. :-)

Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond Failure: The 2nd LAK Failathon Poster. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 540–541.

 


LAK17: Failathon

Monday 13 March was the day of the second LAK Failathon, this time held at the LAK17 conference at Simon Fraser University in Vancouver. This year, we took the theme ‘Beyond Failure’, and the workshop led into a paper later in the conference and then to a crowd-sourced paper on how we can work to avoid failure, both on individual projects and across the learning analytics community as a whole.

We also took a consciously international approach, so the workshop leaders included Doug Clow and me from Europe, Mike Sharkey from North America, Cecilia Aguerrebere from South America, Kirsty Kitto from Australia and Yong-Sang Cho from Asia.

Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond failure: the 2nd LAK Failathon. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 504–505.

If you can’t access the workshop outline behind the paywall, contact me for a copy.

Abstract

The 2nd LAK Failathon will build on the successful event in 2016 and extend the workshop beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Thus providing an alternative forum for practitioners and researchers to learn from each other’s failures can be very productive. The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, to help participants learn from each other’s mistakes. It was very successful, and there was strong support for running it as an annual event. This workshop will build on that success, with twin objectives to provide an environment for individuals to learn from each other’s failures, and also to co-develop plans for how we as a field can better build and deploy our evidence base.


Innovating pedagogy: China

The Innovating Pedagogy 2016 report. Now in Chinese.

First page in Chinese


Barcelona: FutureLearn Academic Network

On 27 January, I travelled to Pompeu Fabra University (UPF) in Barcelona for a meeting of the FutureLearn Academic Network (FLAN) on ‘The Educator Experience’. This was the first FLAN meeting to take place outside the UK, and it was held at UPF’s Poblenou Campus. The event was organised by CLIK (Center for Learning, Innovation and Knowledge) and the members of the Educational Technologies section within the Interactive Technologies Research Group of UPF.

During the meeting, FutureLearn partners reflected on the impact and research possibilities of MOOCs in the field of education. Sir Timothy O’Shea, Principal and Vice-Chancellor of the University of Edinburgh, gave the keynote speech, describing Edinburgh’s developing MOOC strategy, including the production of 64 online master’s courses.

I talked about our recent report, MOOCs: What the Research of FutureLearn’s UK Partners Tells Us.

If you have access to the FutureLearn Partners’ blog, a video of the meeting and summary notes of the sessions are available.


BETT 2017: learning analytics

On 25 January, I presented at the BETT trade show on ‘An action plan for learning analytics’. If you would like to introduce learning analytics at your institution, where should you start? Drawing on recent studies that consulted experts worldwide, I outlined an action plan for analytics and identified the key points to keep in mind.

My talk formed part of the HE Leaders Summit, a section of the event designed to address some of the most significant challenges currently facing senior leaders across higher education.


Research Evidence on the Use of Learning Analytics: Implications for Education Policy

The final report on our study of learning analytics for European educational policy (LAEP) is now out.

Research Evidence on the Use of Learning Analytics: Implications for Education Policy brings together the findings of a literature review; case studies; an inventory of tools, policies and practices; and an expert workshop.

The report also provides an Action List for policymakers, practitioners, researchers and industry members to guide work in Europe.

Learning Analytics: Action List

Policy leadership and governance practices

  • Develop common visions of learning analytics that address strategic objectives and priorities
  • Develop a roadmap for learning analytics within Europe
  • Align learning analytics work with different sectors of education
  • Develop frameworks that enable the development of analytics
  • Assign responsibility for the development of learning analytics within Europe
  • Continuously work on reaching common understanding and developing new priorities

Institutional leadership and governance practices

  • Create organisational structures to support the use of learning analytics and help educational leaders to implement these changes
  • Develop practices that are appropriate to different contexts
  • Develop and employ ethical standards, including data protection

Collaboration and networking

  • Identify and build on work in related areas and other countries
  • Engage stakeholders throughout the process to create learning analytics that have useful features
  • Support collaboration with commercial organisations

Teaching and learning practices

  • Develop learning analytics that make good use of pedagogy
  • Align analytics with assessment practices

Quality assessment and assurance practices

  • Develop a robust quality assurance process to ensure the validity and reliability of tools
  • Develop evaluation checklists for learning analytics tools

Capacity building

  • Identify the skills required in different areas
  • Train and support researchers and developers to work in this field
  • Train and support educators to use analytics to support achievement

Infrastructure

  • Develop technologies that enable development of analytics
  • Adapt and employ interoperability standards

Other resources related to the LAEP project – including the LAEP Inventory of learning analytics tools, policies and practices – are available on Cloudworks.
