On 15 April, the LACE project held a one-day briefing and workshop in Brussels on Policies for Educational Data Mining and Learning Analytics. Originally planned to take place in the European Parliament, a security alert required a move to the nearby Thon Hotel.
The day began with a welcome from Julie Ward, MEP for the North West of England and member of the Culture and Education Committee. She was followed by Robert Madelin (Director-General of DG Connect) and Dragan Gašević (president-elect of SoLAR). Their talks were followed by overviews of the current European-funded learning analytics projects: LACE, Lea’s Box, PELARS and WatchMe.
During the afternoon discussion and review session, participants from across Europe worked together in three separate discussion groups to review specific issues related to the use of learning analytics in schools, universities and workplace training.
I worked as rapporteur in the universities workshop (pictured), which had 186 participants, including people from England, Estonia, Germany, the Netherlands, Norway, Scotland and Sweden. Our policy recommendations included:
- Privacy and ethical issues are important. Encourage institutions to develop policies covering privacy, ethics and data protection. However, this is a broader issue than educational policy making and legislation, so we should aim to influence the wider debate.
- Guard against data degradation by developing, and making available, methods of retaining data over time.
- Develop data standards and encourage their use, so that data is standardised.
- Address the problem of over-claiming and mis-selling by vendors; institutions do not necessarily have access to the expertise that allows them to interpret and assess these claims.
- Identify procedures for due diligence around intervention strategies, the competencies staff need, and certification opportunities relating to these.
- Identify requirements for data collection, and structures for doing this on a sector or national basis.
- Support the development of standard datasets at national or international level, against which other data can be compared to see whether performance is above or below the norm.
- Identify behaviours in the field of education that regional or national governments should support and encourage.
- Identify ways of preventing the providers of educational tools from selling our own data back to us.
- Take into account that it is not just the data we are concerned about: once data is removed from its context, it does not necessarily make sense. Data needs to be associated with metadata that is produced using standardised conventions.
This special issue, edited by Yishay Mor, Barbara Wasson and myself, developed from an Alpine Rendezvous workshop we ran in 2013 that dealt with the connections between learning design, learning analytics and teacher inquiry.
This special issue deals with three areas. Learning design is the practice of devising effective learning experiences aimed at achieving defined educational objectives in a given context. Teacher inquiry is an approach to professional development and capacity building in education in which teachers study their own and their peers’ practice. Learning analytics use data about learners and their contexts to understand and optimise learning and the environments in which it takes place. Typically, these three—design, inquiry and analytics—are seen as separate areas of practice and research. In this issue, we show that the three can work together to form a virtuous circle. Within this circle, learning analytics offers a powerful set of tools for teacher inquiry, feeding back into improved learning design. Learning design provides a semantic structure for analytics, whereas teacher inquiry defines meaningful questions to analyse.
British Journal of Educational Technology, Vol 46, No 2 (2015)
Editorial: Learning design, teacher inquiry into student learning and learning analytics: A call for action
Mor, Y.; Ferguson, R.; Wasson, B.
Informing learning design with learning analytics to improve teacher inquiry
Persico, D.; Pozzi, F.
A method for teacher inquiry in cross-curricular projects: Lessons from a case study
Avramides, K.; Hunter, J.; Oliver, M.; Luckin, R.
Supporting teachers in data-informed educational design
McKenney, S.; Mor, Y.
Forward-oriented designing for learning as a means to achieve educational quality
Ghislandi, P. M.; Raffaghelli, J. E.
Analysing content and patterns of interaction for improving the learning design of networked learning environments
Haya, P. A.; Daems, O.; Malzahn, N.; Castellanos, J.; Hoppe, H. U.
How was the activity? A visualization support for a case of location-based learning design
Melero, J.; Hernández-Leo, D.; Sun, J.; Santos, P.; Blat, J.
Scripting and monitoring meet each other: Aligning learning analytics and learning design to support teachers in orchestrating CSCL situations
Rodríguez-Triana, M. J.; Martínez-Monés, A.; Asensio-Pérez, J. I.; Dimitriadis, Y.
Mor, Yishay, Ferguson, Rebecca, & Wasson, Barbara. (2015). Editorial: learning design, teacher inquiry into student learning and learning analytics: a call for action. British Journal of Educational Technology, 46(2), 221-229.
As part of the Learning Analytics Community Exchange (LACE) project’s engagement with LAK15, we brought participants from across Europe together to talk about European perspectives on learning analytics.
Alejandra Martínez Monés from Spain talked about past work carried out as part of the European Kaleidoscope Network of Excellence that has implications for the development of learning analytics internationally. Alan Berg from The Netherlands provided links to a series of initiatives designed to bring researchers and practitioners together across national boundaries. Kairit Tammets introduced learning analytics work in Estonia, and Anne Boyer offered a French perspective. Members of the LACE project talked about their work to pull together research, practice and evidence across Europe.
Ferguson, Rebecca; Cooper, Adam; Drachsler, Hendrik; Kismihók, Gábor; Boyer, Anne; Tammets, Kairit, & Martínez Monés, Alejandra. (2015). Learning Analytics: European Perspectives. Paper presented at LAK15, Poughkeepsie, NY, USA.
Since the emergence of learning analytics in North America, researchers and practitioners have worked to develop an international community. The organisation of events such as SoLAR Flares and LASI Locals, as well as the move of LAK in 2013 from North America to Europe, has supported this aim. There are now thriving learning analytics groups in North America, Europe and Australia, with smaller pockets of activity emerging on other continents. Nevertheless, much of the work carried out outside these forums, or published in languages other than English, is still inaccessible to most people in the community. This panel, organised by Europe's Learning Analytics Community Exchange (LACE) project, brings together researchers from five European countries to examine the field from European perspectives. In doing so, it will identify the benefits and challenges associated with sharing and developing practice across national boundaries.
LACE project involvement in LAK15 also included a workshop on Ethical and Privacy Issues in the Application of Learning Analytics.
My main paper at LAK15 analysed engagement patterns in FutureLearn MOOCs. In it, Doug Clow and I began by carrying out a replication study, building on an earlier study of Coursera MOOCs by Kizilcec and his colleagues. Although our cluster analysis found two clusters that were very similar to those found in the earlier study, our other clusters did not match theirs. The different clusters of learners on the two platforms appeared to relate to the pedagogy (approach to learning and teaching) underlying the courses.
Ferguson, Rebecca, & Clow, Doug. (2015). Examining engagement: analysing learner subpopulations in massive open online courses (MOOCs). Paper presented at LAK15 (March 16-20, 2015), Poughkeepsie, NY, USA.
Massive open online courses (MOOCs) are now being used across the world to provide millions of learners with access to education. Many learners complete these courses successfully, or to their own satisfaction, but the high numbers who do not finish remain a subject of concern for platform providers and educators. In 2013, a team from Stanford University analysed engagement patterns on three MOOCs run on the Coursera platform. They found four distinct patterns of engagement that emerged from MOOCs based on videos and assessments. However, not all platforms take this approach to learning design. Courses on the FutureLearn platform are underpinned by a social-constructivist pedagogy, which includes discussion as an important element. In this paper, we analyse engagement patterns on four FutureLearn MOOCs and find that only two clusters identified previously apply in this case. Instead, we see seven distinct patterns of engagement: Samplers, Strong Starters, Returners, Mid-way Dropouts, Nearly There, Late Completers and Keen Completers. This suggests that patterns of engagement in these massive learning environments are influenced by decisions about pedagogy. We also make some observations about approaches to clustering in this context.
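The cluster analysis described above can be illustrated with a minimal sketch. This is not the paper's actual pipeline: the state coding, the synthetic data, the choice of k-means and the number of clusters are all illustrative assumptions introduced here, in the spirit of the approach being replicated from the earlier Stanford study.

```python
# Hypothetical sketch: clustering learners by weekly engagement states.
# The state coding, data and cluster count are illustrative assumptions,
# not the actual method or data from the FutureLearn study.
import numpy as np
from sklearn.cluster import KMeans

# Each learner is described by one engagement state per course week:
# 0 = absent, 1 = sampled some steps, 2 = completed the week (illustrative).
rng = np.random.default_rng(0)
n_learners, n_weeks = 500, 8
engagement = rng.integers(0, 3, size=(n_learners, n_weeks))

# Cluster the per-week engagement vectors. In practice the number of
# clusters would be chosen with validity measures (e.g. silhouette scores).
kmeans = KMeans(n_clusters=7, n_init=10, random_state=0)
labels = kmeans.fit_predict(engagement)

# Inspect each cluster's mean weekly engagement to characterise the pattern
# (e.g. a "Strong Starters"-like profile declines after the early weeks).
for k in range(kmeans.n_clusters):
    profile = engagement[labels == k].mean(axis=0)
    print(k, np.round(profile, 2))
```

Naming the resulting clusters (Samplers, Strong Starters and so on) is then an interpretive step: each centroid's weekly profile is read as a trajectory of engagement over the course.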