Archive for category Papers

Learning analytics: where is the evidence?

Our main paper at the LAK conference looked at the state of evidence in the field. Drawing on the work collated in the LACE project Evidence Hub, we found that there is, as yet, very little clear evidence that learning analytics improve learning or teaching. The paper concludes with a series of suggestions about how we can work as a community to improve the evidence base of the field.

The room was full to overflowing for our talk and for the other two talks in the session on the ethics of learning analytics. If you weren’t able to get in and you want to understand the links between jelly beans, a dead salmon, Bob Dylan, Buffy the Vampire Slayer and learning analytics, I shall share the link to the recorded session as soon as I have it.

Ferguson, Rebecca and Clow, Doug (2017). Where is the evidence? A call to action for learning analytics. In: LAK ’17 Proceedings of the Seventh International Learning Analytics & Knowledge Conference, ACM International Conference Proceeding Series, ACM, New York, USA, pp. 56–65.

Abstract

Where is the evidence for learning analytics? In particular, where is the evidence that it improves learning in practice? Can we rely on it? Currently, there are vigorous debates about the quality of research evidence in medicine and psychology, with particular issues around statistical good practice, the ‘file drawer effect’, and ways in which incentives for stakeholders in the research process reward the quantity of research produced rather than the quality. In this paper, we present the Learning Analytics Community Exchange (LACE) project’s Evidence Hub, an effort to relate research evidence in learning analytics to four propositions about learning analytics: whether they support learning, support teaching, are deployed widely, and are used ethically. Surprisingly little evidence in this strong, specific sense was found, and very little was negative (7%, N=123), suggesting that learning analytics is not immune from the pressures in other areas. We explore the evidence in one particular area in detail (whether learning analytics improve teaching and learner support in the university sector), and set out some of the weaknesses of the evidence available. We conclude that there is considerable scope for improving the evidence base for learning analytics, and set out some suggestions of ways for various stakeholders to achieve this.

CBA: impact on student engagement

Our new paper – lead author Quan Nguyen – is now available online in Computers in Human Behavior. It examines the design of computer-based assessment and its impact on student engagement, student satisfaction and pass rates.

Computers in Human Behavior is locked behind a paywall, so contact me for a copy if you can’t get access to the paper.

Abstract

Many researchers who study the impact of computer-based assessment (CBA) focus on the affordances or complexities of CBA approaches in comparison to traditional assessment methods. This study examines how CBA approaches were configured within and between modules, and the impact of assessment design on students’ engagement, satisfaction, and pass rates. The analysis was conducted using a combination of longitudinal visualisations, correlational analysis, and fixed-effect models on 74 undergraduate modules and their 72,377 students. Our findings indicate that educators designed very different assessment strategies, which significantly influenced student engagement as measured by time spent in the virtual learning environment (VLE). Weekly analyses indicated that assessment activities were balanced with other learning activities, which suggests that educators tended to aim for a consistent workload when designing assessment strategies. Since most of the assessments were computer-based, students spent more time on the VLE during assessment weeks. By controlling for heterogeneity within and between modules, learning design could explain up to 69% of the variability in students’ time spent on the VLE. Furthermore, assessment activities were significantly related to pass rates, but no clear relation with satisfaction was found. Our findings highlight the importance of CBA and learning design to how students learn online.

Nguyen, Quan; Rienties, Bart; Toetenel, Lisette; Ferguson, Rebecca and Whitelock, Denise (2017). Examining the designs of computer-based assessment and its impact on student engagement, satisfaction, and pass rates. Computers in Human Behavior (Early access).
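
For readers curious about the kind of analysis described in the abstract, here is a minimal sketch of a module fixed-effects regression of weekly VLE time on the hours of assessment and other learning activity in the design. It is illustrative only: the data, variable names and coefficients are invented, and it is not the pipeline used in the paper.

```python
# Illustrative sketch (invented data): module fixed-effects regression of weekly
# VLE time on hours of assessment and other learning activity in the design.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_modules, n_weeks = 10, 30
rows = []
for m in range(n_modules):
    base = rng.normal(3, 0.5)          # module-specific baseline VLE hours
    for w in range(n_weeks):
        assess = rng.uniform(0, 4)      # hours of assessment activity that week
        other = rng.uniform(0, 6)       # hours of other learning activity
        vle = base + 0.8 * assess + 0.3 * other + rng.normal(0, 0.5)
        rows.append({"module": m, "week": w, "assessment_hours": assess,
                     "other_hours": other, "vle_time": vle})
df = pd.DataFrame(rows)

# C(module) adds a dummy variable per module, i.e. module fixed effects that
# absorb between-module differences before estimating the design effects.
model = smf.ols("vle_time ~ assessment_hours + other_hours + C(module)", data=df).fit()
print(model.summary().tables[1])
print("R-squared:", round(model.rsquared, 2))
```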

Dimensions of personalisation in TEL

New paper out in the British Journal of Educational Technology, co-authored with a host of people: lead author Liz FitzGerald, plus Natalia Kucirkova, Ann Jones, Simon Cross, Thea Herodotou, Garron Hillaire and Eileen Scanlon.

The framework proposed in the paper has six dimensions:

  1. what is being personalised
  2. type of learning
  3. personal characteristics of the learner
  4. who/what is doing the personalisation
  5. how personalisation is carried out
  6. impact / beneficiaries

Abstract

Personalisation of learning is a recurring trend in our society, referred to in government speeches, popular media, conference and research papers and technological innovations. This latter aspect – of using personalisation in technology-enhanced learning (TEL) – has promised much but has not always lived up to the claims made. Personalisation is often perceived to be a positive phenomenon, but it is often difficult to know how to implement it effectively within educational technology.

In order to address this problem, we propose a framework for the analysis and creation of personalised TEL. This article outlines and explains this framework with examples from a series of case studies. The framework serves as a valuable resource in order to change or consolidate existing practice and suggests design guidelines for effective implementations of future personalised TEL.

FitzGerald, Elizabeth; Kucirkova, Natalia; Jones, Ann; Cross, Simon; Ferguson, Rebecca; Herodotou, Christothea; Hillaire, Garron and Scanlon, Eileen (2017). Dimensions of personalisation in technology-enhanced learning: a framework and implications for design. British Journal of Educational Technology (early view).

Developing a strategic approach to MOOCs

Our introductory article for the JIME special issue on MOOCs focused on the research work carried out in the area by UK universities that are FutureLearn partners.

‘Developing a strategic approach to MOOCs’ uses the work carried out at these universities to identify nine priority areas for MOOC research and how these can be developed in the future:

  1. Develop a strategic approach to MOOCs.
  2. Expand the benefits of teaching and learning in MOOCs.
  3. Offer well-designed assessment and accreditation.
  4. Widen participation and extend access.
  5. Develop and make effective use of appropriate pedagogies.
  6. Support the development of educators.
  7. Make effective use of learning design.
  8. Develop methods of quality assurance.
  9. Address issues related to privacy and ethics.

Ferguson, Rebecca; Scanlon, Eileen and Harris, Lisa (2016). Developing a strategic approach to MOOCs. Journal of Interactive Media in Education, 2016(1), article no. 21.

Abstract

During the last eight years, interest in massive open online courses (MOOCs) has grown fast and continuously worldwide. Universities that had never engaged with open or online learning have begun to run courses in these new environments. Millions of learners have joined these courses, many of them new to learning at this level. Amid all this learning and teaching activity, researchers have been busy investigating different aspects of this new phenomenon. In this contribution we look at one substantial body of work, publications on MOOCs that were produced at the 29 UK universities connected to the FutureLearn MOOC platform. Bringing these papers together, and considering them as a body of related work, reveals a set of nine priority areas for MOOC research and development. We suggest that these priority areas could be used to develop a strategic approach to learning at scale. We also show how the papers in this special issue align with these priority areas, forming a basis for future work.

Possibilities and challenges of augmented learning

I was invited to write a paper for Distance Education in China, a journal which reaches out to Western academics and is willing to take on the task of translating papers from English. My paper was based on Augmented Education, a book I wrote with Kieron Sheehy and Gill Clough, published by Palgrave in 2014.

Abstract

Digital technologies are becoming cheaper, more powerful and more widely used in daily life. At the same time, opportunities are increasing for making use of them to augment learning by extending learners’ interactions with and perceptions of their environment. Augmented learning can make use of augmented reality and virtual reality, as well as a range of technologies that extend human awareness. This paper introduces some of the possibilities opened up by augmented learning and examines one area in which they are currently being employed: the use of virtual realities and tools to augment formal learning. It considers the elements of social presence that are employed when augmenting learning in this way, and discusses different approaches to augmentation.

Ferguson, Rebecca (2016). 增强学习的可能性与挑战 [Possibilities and challenges of augmented learning]. Distance Education in China, 6, pp. 5–13.

Learning analytics and accessibility

On the first main day of the LAK16 conference, Annika Wolff presented a paper on accessibility and learning analytics that we had authored together with Martyn Cooper.

Abstract

This paper explores the potential of analytics for improving accessibility of e-learning and supporting disabled learners in their studies. A comparative analysis of completion rates of disabled and non-disabled students in a large five-year dataset is presented and a wide variation in comparative retention rates is characterized. Learning analytics enable us to identify and understand such discrepancies and, in future, could be used to focus interventions to improve retention of disabled students. An agenda for onward research, focused on Critical Learning Paths, is outlined. This paper is intended to stimulate a wider interest in the potential benefits of learning analytics for institutions as they try to assure the accessibility of their e-learning and provision of support for disabled students.

Cooper, Martyn; Ferguson, Rebecca and Wolff, Annika (2016). What Can Analytics Contribute to Accessibility in e-Learning Systems and to Disabled Students’ Learning? In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.
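
As a rough illustration of the comparative analysis the abstract describes, the sketch below computes completion rates for two groups of students and tests the gap with a two-proportion z-test. The counts are invented and this is not the study’s analysis.

```python
# Illustrative sketch (invented counts): compare completion rates of disabled and
# non-disabled students and test the difference between the two proportions.
import numpy as np
from statsmodels.stats.proportion import proportions_ztest, proportion_confint

completed = np.array([310, 4200])   # [disabled, non-disabled] completions
enrolled = np.array([520, 6100])    # [disabled, non-disabled] enrolments

for name, c, n in zip(("disabled", "non-disabled"), completed, enrolled):
    low, high = proportion_confint(c, n, method="wilson")
    print(f"{name:13s}: completion rate {c / n:.1%} (95% CI {low:.1%}-{high:.1%})")

z, p = proportions_ztest(completed, enrolled)
print(f"two-proportion z-test: z={z:.2f}, p={p:.4f}")
```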

Patterns of engagement across time in MOOCs

New paper out in the Journal of Learning Analytics, building on our previous papers dealing with how learners engage with MOOCs.

Abstract

Massive open online courses (MOOCs) are being used across the world to provide millions of learners with access to education. Many who begin these courses complete them successfully, or to their own satisfaction, but the high numbers who do not finish remain a subject of concern. In 2013, a team from Stanford University analysed engagement patterns on three MOOCs run on the Coursera platform. They found four distinct patterns of engagement that emerged from MOOCs based on videos and assessments. Subsequent studies on the FutureLearn platform, which is underpinned by social-constructivist pedagogy, indicate that patterns of engagement in these massive learning environments are influenced by decisions about pedagogy and learning design. This paper reports on two of these studies of learner engagement with FutureLearn courses. Study One first tries, not wholly successfully, to replicate the findings of the Coursera study in a new context. It then uses the same methodological approach to identify patterns of learner engagement on the FutureLearn platform, and indicates how these patterns are influenced by pedagogy and elements of learning design. Study Two investigates whether these patterns of engagement are stable on subsequent presentations of the same courses. Two patterns are found consistently in this and other work: samplers who visit briefly, and completers who fully engage with the course. The paper concludes by exploring the implications for both research and practice.

Ferguson, Rebecca, & Clow, Doug. (2016). Consistent commitment: patterns of engagement across time in massive open online courses (MOOCs). Journal of Learning Analytics, 2(3), 63-88.
