Archive for category Publications

Enhancing Learning and Teaching with Technology

Enhancing Learning and Teaching with Technology: What the Research Says was published by Institute of Education Press on 24 January 2018. The book was inspired by a seminar series run at the Institute that focused on research findings about educational technology. It was officially launched at the BETT Show in London.

Our chapter focuses on MOOCs and was based on the research-based publications of UK-based partners of the FutureLearn platform.

Introduction to the chapter

Free online courses that provide learning at scale have the potential to open up education around the world. MOOCs now engage millions of learners. For example, FutureLearn, the UK's largest MOOC provider, passed six million registered learners in 2017, 75% of them outside the UK. Coursera, the world's largest platform, claimed 24 million learners worldwide in March 2017, of whom over half a million were in the UK.

In this section, we explore what the research tells us about how MOOCs need to be developed in order to help provide education for all. This research was carried out at UK universities partnered with the FutureLearn MOOC platform. When it was carried out, FutureLearn had 64 university partners, including 29 within the UK, all linked by the FutureLearn Academic Network (FLAN).

Reference

Ferguson, Rebecca; Herodotou, Christothea; Coughlan, Tim; Scanlon, Eileen and Sharples, Mike (2018). MOOC development: priority areas. In: Luckin, Rosemary ed. Enhancing Learning and Teaching with Technology: What the Research Says. London: UCL IOE Press.


Innovating Pedagogy 2017

On 7 December 2017 we launched Innovating Pedagogy 2017. This is the sixth in a series of reports that explore new forms of teaching, learning and assessment. It is the first in the series on which I have been lead author, taking over from Mike Sharples, who initiated the series and remains an author. This year, the report was produced by The Open University in collaboration with the Learning In a NetworKed Society (LINKS) Israeli Center of Research Excellence (I-CORE).

All the Innovating Pedagogy reports are released under a Creative Commons licence and can be downloaded free of charge.

The ten innovative pedagogies proposed in this year’s report are:

  •     Big-data inquiry: thinking with data
  •     Learners making science
  •     Navigating post-truth societies
  •     Immersive learning
  •     Learning with internal values
  •     Student-led analytics
  •     Intergroup empathy
  •     Humanistic knowledge-building communities
  •     Open textbooks
  •     Spaced learning

Our fellow authors at LINKS worked on a translation, and a Hebrew version of the report is now available to download from the Innovating Pedagogy website.



Learning analytics: where is the evidence?

Our main paper at the LAK conference looked at the state of evidence in the field. Drawing on the work collated in the LACE project Evidence Hub, it seems that there is, as yet, very little clear evidence that learning analytics improve learning or teaching. The paper concludes with a series of suggestions about how we can work as a community to improve the evidence base of the field.

The room was full to overflowing for our talk and for the other two talks in the session on the ethics of learning analytics. If you weren’t able to get in and you want to understand the links between jelly beans, a dead salmon, Bob Dylan, Buffy the Vampire Slayer and learning analytics, I shall share the link to the recorded session as soon as I have it.

Ferguson, Rebecca and Clow, Doug (2017). Where is the evidence? A call to action for learning analytics. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 56–65.

Abstract

Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.


LAK17: Failathon

Monday 13 March was the day of the second LAK Failathon, this time held at the LAK17 conference at Simon Fraser University in Vancouver. This year, we took the theme ‘Beyond Failure’ and the workshop led into a paper later in the conference and then to a crowd-sourced paper on how we can work to avoid failure both on individual projects and across the learning analytics community as a whole.

We also took a consciously international approach, so workshop leaders included Doug Clow and me from Europe, Mike Sharkey from North America, Cecilia Aguerrebere from South America, Kirsty Kitto from Australia and Yong-Sang Cho from Asia.

Clow, Doug; Ferguson, Rebecca; Kitto, Kirsty; Cho, Yong-Sang; Sharkey, Mike and Aguerrebere, Cecilia (2017). Beyond failure: the 2nd LAK Failathon. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 504–505.

If you can’t access the workshop outline behind the paywall, contact me for a copy.

Abstract

The 2nd LAK Failathon will build on the successful event in 2016 and extend the workshop beyond discussing individual experiences of failure to exploring how the field can improve, particularly regarding the creation and use of evidence. Failure in research is an increasingly hot topic, with high-profile crises of confidence in the published research literature in medicine and psychology. Among the major factors in this research crisis are the many incentives to report and publish only positive findings. These incentives prevent the field in general from learning from negative findings, and almost entirely preclude the publication of mistakes and errors. Thus providing an alternative forum for practitioners and researchers to learn from each other’s failures can be very productive. The first LAK Failathon, held in 2016, provided just such an opportunity for researchers and practitioners to share their failures and negative findings in a lower-stakes environment, to help participants learn from each other’s mistakes. It was very successful, and there was strong support for running it as an annual event. This workshop will build on that success, with twin objectives to provide an environment for individuals to learn from each other’s failures, and also to co-develop plans for how we as a field can better build and deploy our evidence base.


CBA: impact on student engagement

Our new paper – lead author Quan Nguyen – is now available online in Computers in Human Behavior. It examines the designs of computer-based assessment and their impact on student engagement, student satisfaction and pass rates.

Computers in Human Behavior is locked behind a paywall, so contact me for a copy if you can’t get access to the paper.

Abstract

Many researchers who study the impact of computer-based assessment (CBA) focus on the affordances or complexities of CBA approaches in comparison to traditional assessment methods. This study examines how CBA approaches were configured within and between modules, and the impact of assessment design on students’ engagement, satisfaction, and pass rates. The analysis was conducted using a combination of longitudinal visualisations, correlational analysis, and fixed-effect models on 74 undergraduate modules and their 72,377 students. Our findings indicate that educators designed very different assessment strategies, which significantly influenced student engagement as measured by time spent in the virtual learning environment (VLE). Weekly analyses indicated that assessment activities were balanced with other learning activities, which suggests that educators tended to aim for a consistent workload when designing assessment strategies. Since most of the assessments were computer-based, students spent more time on the VLE during assessment weeks. By controlling for heterogeneity within and between modules, learning design could explain up to 69% of the variability in students’ time spent on the VLE. Furthermore, assessment activities were significantly related to pass rates, but no clear relation with satisfaction was found. Our findings highlight the importance of CBA and learning design to how students learn online.

Nguyen, Quan; Rienties, Bart; Toetenel, Lisette; Ferguson, Rebecca and Whitelock, Denise (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior (Early access).


Innovating pedagogy: China

The Innovating Pedagogy 2016 report. Now in Chinese.

First page in Chinese


Dimensions of personalisation in TEL

New paper out in the British Journal of Educational Technology, co-authored with a host of people: lead author Liz FitzGerald, plus Natalia Kucirkova, Ann Jones, Simon Cross, Christothea Herodotou, Garron Hillaire and Eileen Scanlon.

The framework proposed in the paper has six dimensions:

  1. what is being personalised
  2. type of learning
  3. personal characteristics of the learner
  4. who/what is doing the personalisation
  5. how personalisation is carried out
  6. impact / beneficiaries

Abstract

Personalisation of learning is a recurring trend in our society, referred to in government speeches, popular media, conference and research papers and technological innovations. This latter aspect – of using personalisation in technology-enhanced learning (TEL) – has promised much but has not always lived up to the claims made. Personalisation is often perceived to be a positive phenomenon, but it is often difficult to know how to implement it effectively within educational technology.

In order to address this problem, we propose a framework for the analysis and creation of personalised TEL. This article outlines and explains this framework with examples from a series of case studies. The framework serves as a valuable resource in order to change or consolidate existing practice and suggests design guidelines for effective implementations of future personalised TEL.

FitzGerald, Elizabeth; Kucirkova, Natalia; Jones, Ann; Cross, Simon; Ferguson, Rebecca; Herodotou, Christothea; Hillaire, Garron and Scanlon, Eileen (2017). Dimensions of personalisation in technology-enhanced learning: a framework and implications for design. British Journal of Educational Technology (early view).
