1000 citations and counting

1,002 citations according to Google Scholar, and 20,548 downloads from the university’s Open Research Online repository (ORO).



LACE Spring Briefing

On 15 April, the LACE project held a one-day briefing and workshop in Brussels on Policies for Educational Data Mining and Learning Analytics. Originally planned to take place in the European Parliament, the event moved to the nearby Thon Hotel after a security alert.

The day began with a welcome from Julie Ward, MEP for the North West of England and member of the Culture and Education Committee. She was followed by Robert Madelin (Director-General of DG Connect) and Dragan Gašević (president-elect of SoLAR), and then by overviews of the current European-funded learning analytics projects: LACE, Lea’s Box, PELARS and WatchMe.

During the afternoon discussion and review session, participants from across Europe worked together in three separate discussion groups to review specific issues related to the use of learning analytics in schools, universities and workplace training.

I worked as rapporteur in the universities workshop (pictured), which had 186 participants, including people from England, Estonia, Germany, the Netherlands, Norway, Scotland and Sweden. Our policy recommendations included:

  • Privacy and ethical issues are important. Encourage institutions to develop policies covering privacy, ethics and data protection. However, this is a broader issue than educational policy making and legislation. We should aim to influence the wider debate.
  • Guard against data degradation – develop and make available methods of retaining data over time
  • Develop data standards and encourage their use, so that data is standardised
  • Address the problem of over-claiming and mis-selling by vendors – institutions do not necessarily have access to the expertise that would allow them to interpret and assess these claims
  • Identify procedures for due diligence around intervention strategies, the competencies staff need, and certification opportunities relating to these
  • Identify requirements for data collection, and structures for doing this on a sector or national basis
  • Support the development of standard datasets at national or international level, against which other data can be compared to see if performance is above or below the norm
  • Identify behaviours in the field of education that regional or national governments should support and encourage
  • Identify ways of preventing the providers of educational tools from selling our own data back to us
  • Take into account that it is not just the data we are concerned about: once data is removed from its context, it does not necessarily make sense. Data needs to be associated with metadata produced using standardised conventions




Teacher-led inquiry and learning design: BJET special issue

In mid-March, the British Journal of Educational Technology (BJET) published our special issue on learning design, learning analytics and teacher inquiry.

This special issue, edited by Yishay Mor, Barbara Wasson and myself, developed from an Alpine Rendezvous workshop we ran in 2013 that dealt with the connections between learning design, learning analytics and teacher inquiry.

This special issue deals with three areas. Learning design is the practice of devising effective learning experiences aimed at achieving defined educational objectives in a given context. Teacher inquiry is an approach to professional development and capacity building in education in which teachers study their own and their peers’ practice. Learning analytics use data about learners and their contexts to understand and optimise learning and the environments in which it takes place. Typically, these three—design, inquiry and analytics—are seen as separate areas of practice and research. In this issue, we show that the three can work together to form a virtuous circle. Within this circle, learning analytics offers a powerful set of tools for teacher inquiry, feeding back into improved learning design. Learning design provides a semantic structure for analytics, whereas teacher inquiry defines meaningful questions to analyse.

Contents

BRITISH JOURNAL OF EDUCATIONAL TECHNOLOGY

Vol. 46, No. 2 (2015)
ISSN 0007-1013

Mor, Yishay, Ferguson, Rebecca, & Wasson, Barbara. (2015). Editorial: learning design, teacher inquiry into student learning and learning analytics: a call for action. British Journal of Educational Technology, 46(2), 221-229.



Rhetorical analysis and tutors’ grades

One of my doctoral students, Duygu Simsek (now Duygu Bektik), presented her work at LAK15.

Simsek, Duygu; Sandor, Ágnes; Buckingham Shum, Simon; Ferguson, Rebecca; De Liddo, Anna and Whitelock, Denise (2015). Correlations between automated rhetorical analysis and tutors’ grades on student essays. In: 5th International Learning Analytics & Knowledge Conference (LAK15), 16-20 March 2015, Poughkeepsie, NY, USA, ACM.

When assessing student essays, educators look for the students’ ability to present and pursue well-reasoned and strong arguments. Such scholarly argumentation is often articulated by rhetorical metadiscourse. Educators will necessarily be examining metadiscourse in students’ writing as signals of the intellectual moves that make their reasoning visible. Students and educators could therefore benefit from powerful automated textual analysis that is able to detect rhetorical metadiscourse. However, there is a need to validate such technologies in higher education contexts, since they were originally developed for non-educational applications. This paper describes an evaluation study of a particular language analysis tool, the Xerox Incremental Parser (XIP), on undergraduate social science student essays, using the mark awarded as a measure of the quality of the writing. As part of this exploration, the study presented in this paper seeks to assess the quality of the XIP through correlational studies and multiple regression analysis.
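To make the analysis concrete, here is a minimal sketch of the kind of correlational and multiple-regression study described above. The rhetorical-move categories, counts and grades are invented for illustration, and the code does not use XIP itself; it simply shows how per-essay counts of detected rhetorical moves could be related to tutor marks.

```python
# Illustrative sketch only: feature names and numbers are hypothetical, not taken
# from the paper. It shows the general shape of a correlation + multiple-regression
# analysis of rhetorical-move counts against tutor grades.
import pandas as pd
from scipy.stats import pearsonr
import statsmodels.api as sm

# Hypothetical per-essay counts of rhetorical moves detected by a parser,
# together with the tutor's mark for each essay.
essays = pd.DataFrame({
    "summarising": [3, 1, 4, 0, 2, 5, 1, 3],
    "contrasting": [2, 0, 3, 1, 1, 4, 0, 2],
    "emphasising": [5, 2, 6, 1, 3, 7, 2, 4],
    "grade":       [68, 52, 74, 45, 60, 80, 50, 66],
})

# Simple correlations: does each rhetorical-move count track the tutor's grade?
for move in ["summarising", "contrasting", "emphasising"]:
    r, p = pearsonr(essays[move], essays["grade"])
    print(f"{move:12s} r = {r:.2f}  p = {p:.3f}")

# Multiple regression: how much grade variance do the counts explain jointly?
X = sm.add_constant(essays[["summarising", "contrasting", "emphasising"]])
model = sm.OLS(essays["grade"], X).fit()
print(model.summary())
```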

Duygu’s slides



Twitter stream

Always good to have a presentation tweeted by your pro-vice-chancellor :-)
Tweet by Belinda Tynan



European perspectives on learning analytics

As part of the Learning Analytics Community Exchange (LACE) project’s engagement with LAK15, we brought participants from across Europe together to talk about European perspectives on learning analytics.

Alejandra Martínez Monés from Spain talked about past work carried out as part of the European Kaleidoscope Network of Excellence that has implications for the development of learning analytics internationally. Alan Berg from the Netherlands provided links to a series of initiatives designed to bring researchers and practitioners together across national boundaries. Kairit Tammets introduced learning analytics work in Estonia, and Anne Boyer offered a French perspective. Members of the LACE project talked about their work to pull together research, practice and evidence across Europe.

Ferguson, Rebecca; Cooper, Adam; Drachsler, Hendrik; Kismihók, Gábor; Boyer, Anne; Tammets, Kairit, & Martínez Monés, Alejandra. (2015). Learning Analytics: European Perspectives. Paper presented at LAK15, Poughkeepsie, NY, USA.

Since the emergence of learning analytics in North America, researchers and practitioners have worked to develop an international community. The organization of events such as SoLAR Flares and LASI Locals, as well as the move of LAK in 2013 from North America to Europe, has supported this aim. There are now thriving learning analytics groups in North America, Europe and Australia, with smaller pockets of activity emerging on other continents. Nevertheless, much of the work carried out outside these forums, or published in languages other than English, is still inaccessible to most people in the community. This panel, organized by Europe’s Learning Analytics Community Exchange (LACE) project, brings together researchers from five European countries to examine the field from European perspectives. In doing so, it will identify the benefits and challenges associated with sharing and developing practice across national boundaries.

LACE project involvement in LAK15 also included a workshop on Ethical and Privacy Issues in the Application of Learning Analytics.



Examining engagement in MOOCs

My main paper at LAK15 analysed engagement patterns in FutureLearn MOOCs. In it, Doug Clow and I began by carrying out a replication study, building on an earlier study of Coursera MOOCs by Kizilcec and his colleagues. Although our cluster analysis found two clusters that were very similar to those found in the earlier study, our other clusters did not match theirs. The different clusters of learners on the two platforms appeared to relate to the pedagogy (approach to learning and teaching) underlying the courses.

Ferguson, Rebecca, & Clow, Doug. (2015). Examining engagement: analysing learner subpopulations in massive open online courses (MOOCs). Paper presented at LAK15, 16-20 March 2015, Poughkeepsie, NY, USA.

Abstract

Massive open online courses (MOOCs) are now being used across the world to provide millions of learners with access to education. Many learners complete these courses successfully, or to their own satisfaction, but the high numbers who do not finish remain a subject of concern for platform providers and educators. In 2013, a team from Stanford University analysed engagement patterns on three MOOCs run on the Coursera platform. They found four distinct patterns of engagement that emerged from MOOCs based on videos and assessments. However, not all platforms take this approach to learning design. Courses on the FutureLearn platform are underpinned by a social-constructivist pedagogy, which includes discussion as an important element. In this paper, we analyse engagement patterns on four FutureLearn MOOCs and find that only two clusters identified previously apply in this case. Instead, we see seven distinct patterns of engagement: Samplers, Strong Starters, Returners, Mid-way Dropouts, Nearly There, Late Completers and Keen Completers. This suggests that patterns of engagement in these massive learning environments are influenced by decisions about pedagogy. We also make some observations about approaches to clustering in this context.
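For readers interested in how this kind of analysis works in practice, the sketch below illustrates the general approach: each learner is represented by a vector of weekly engagement values, and k-means groups learners with similar week-by-week trajectories. The engagement coding, the number of weeks and the choice of seven clusters are assumptions made for illustration only; this is not the pipeline used in the paper.

```python
# Minimal sketch, not the paper's actual pipeline: learners are described by one
# value per course week (e.g. 0 = no activity, 1 = visited steps, 2 = also
# commented/completed), and k-means groups learners with similar week-by-week
# engagement trajectories. The coding scheme and k are assumptions for illustration.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical engagement matrix: 500 learners x 8 course weeks.
engagement = rng.integers(0, 3, size=(500, 8)).astype(float)

# Cluster learners by their engagement trajectory; the studies discussed above
# compared solutions with different numbers of clusters.
kmeans = KMeans(n_clusters=7, n_init=10, random_state=0).fit(engagement)

# Inspect each cluster's mean week-by-week engagement to interpret it
# (e.g. steady decline, late start, consistently high).
for label in range(kmeans.n_clusters):
    profile = engagement[kmeans.labels_ == label].mean(axis=0)
    print(f"cluster {label}: " + " ".join(f"{v:.1f}" for v in profile))
```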


