Archive for category Chairing and co-chairing

LAK17: it’s a wrap

Scattered between my research presentations at LAK17 was my work as a member of the executive of the Society for Learning Analytics Research (SoLAR). The executive met daily during the conference – it is the only chance we have each year for face-to-face meetings. The LAK conferences also provide a venue for the AGM of the society and, despite the size of the room where the AGM was held, it was standing room only for most of the meeting.

The executive also has a role to play in decisions about the conference itself, as well as acting as reviewers on the programme committee and as chairs for the different sessions. Next year, at LAK18 in Sydney, I shall be taking on a bigger role as one of the programme chairs for the conference.

The picture shows me with half the SoLAR Executive at the post-LAK17 review meeting.

SoLAR women


LAK17: doctoral consortium

A very busy week in Vancouver at the LAK17 (Learning Analytics and Knowledge) conference kicked off with the all-day doctoral consortium on 14 March (funded by SoLAR and the NSF). I joined Bodong Chen and Ani Aghababyan as an organiser this year and we enjoyed working with the ten talented doctoral students from across the world who gained a place in the consortium.

  1. Alexander Whitelock-Wainwright: Students’ intentions to use technology in their learning: The effects of internal and external conditions
  2. Alisa Acosta: The design of learning analytics to support a knowledge community and inquiry approach to secondary science
  3. Daniele Di Mitri: Digital learning shadow: digital projection, state estimation and cognitive inference for the learning self
  4. Danielle Hagood: Learning analytics in non-cognitive domains
  5. Justian Knobbout: Designing a learning analytics capabilities model
  6. Leif Nelson: The purpose of higher education in the discourse of learning analytics
  7. Quan Nguyen: Unravelling the dynamics of learning design within and between disciplines in higher education using learning analytics
  8. Stijn Van Laer: Design guidelines for blended learning environments to support self-regulation: event sequence analysis for investigating learners’ self-regulatory behavior
  9. Tracie Farrell Frey: Seeking relevance: affordances of learning analytics for self-regulated learning
  10. Ye Xiong: Write-and-learn: promoting meaningful learning through concept map-based formative feedback on writing assignments

The intention of the doctoral consortium was to support and inspire doctoral students in their ongoing research efforts. The objectives were to:

  • Provide a setting for mutual feedback on participants’ current research and guidance on future research directions from a mentor panel
  • Create a forum for engaging in dialogue aimed at building capacity in the field with respect to current issues in learning analytics, including methods of gathering analytics, interpreting analytics with respect to learning issues, considering ethical issues, and relaying the meaning of analytics so that it impacts teaching and learning
  • Develop a supportive, multidisciplinary community of learning analytics scholars
  • Foster a spirit of collaborative research across countries, institutions and disciplinary backgrounds
  • Enhance participating students’ conference experience by connecting participants to other LAK attendees


Learning analytics: visions of the future

My final presentation at the LAK16 conference was another session organised by the Learning Analytics Community Exchange (LACE) project that built on our Visions of the Future work. This panel session brought participants together to discuss the next steps for learning analytics and where we are heading as a community.

Abstract

It is important that the LAK community looks to the future, in order that it can help develop the policies, infrastructure and frameworks that will shape its future direction and activity. Taking as its basis the Visions of the Future study carried out by the Learning Analytics Community Exchange (LACE) project, the panelists will present future scenarios and their implications. The session will include time for the audience to discuss both the findings of the study and actions that could be taken by the LAK community in response to these findings.

Ferguson, Rebecca; Brasher, Andrew; Clow, Doug; Griffiths, Dai and Drachsler, Hendrik (2016). Learning Analytics: Visions of the Future. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.

 


Failathon: LAK16

Our second LACE workshop of LAK16 was the highly successful Failathon. The idea for this workshop emerged from an overview of learning analytics evidence provided by the LACE Evidence Hub. This suggested that the published evidence is skewed towards positive results, so we set out to find out whether this is the case.

A packed workshop discussed past failures. All accounts were governed by the Chatham House Rule – they could be reported outside the workshop as long as the source of the information was neither explicitly nor implicitly identified.

Abstract

As in many fields, most papers in the learning analytics literature report success or, at least, read as if they are reporting success. This is almost certainly not because learning analytics research and activity are always successful. Generally, we report our successes widely, but keep our failures to ourselves. As Bismarck is alleged to have said: it is wise to learn from the mistakes of others. This workshop offers an opportunity for researchers and practitioners to share their failures in a lower-stakes environment, to help them learn from each other’s mistakes.

Clow, Doug; Ferguson, Rebecca; Macfadyen, Leah and Prinsloo, Paul (2016). LAK Failathon. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.


Ethics and privacy at LAK16

A busy week at the Learning Analytics and Knowledge 2016 (LAK16) conference began with a workshop on Ethics and Privacy Issues in the Design of Learning Analytics. The workshop formed part of the international EP4LA series run by the LACE project.

The workshop included a series of presentations, and I talked briefly about findings related to ethics and privacy that had emerged from the LACE Visions of the Future study.

Abstract

Issues related to Ethics and Privacy have become a major stumbling block in the application of Learning Analytics technologies on a large scale. Recently, the learning analytics community at large has more actively addressed the EP4LA issues, and we are now starting to see learning analytics solutions that are designed not as an afterthought, but with these issues in mind. The 2nd EP4LA@LAK16 workshop will bring the discussion on ethics and privacy for learning analytics to the next level, helping to build an agenda for organizational and technical design of LA solutions, addressing the different processes of a learning analytics workflow.

Drachsler, Hendrik; Hoel, Tore; Cooper, Adam; Kismihók, Gábor; Berg, Alan; Scheffel, Maren; Chen, Weiqin and Ferguson, Rebecca (2016). Ethical and Privacy Issues in the Design of Learning Analytics Applications. In: 6th International Learning Analytics and Knowledge (LAK) Conference, 25-29 April 2016, Edinburgh, Scotland.


LAK16: Practitioner Proceedings

Together with Mike Sharkey (Blackboard) and Negin Mirriahi (University of New South Wales), I chaired the Practitioner Track of the LAK16 conference in Edinburgh and edited the Practitioner Track proceedings.

Practitioners spearhead a significant portion of learning analytics, relying on implementation and experimentation rather than on traditional academic research. The primary goal of the LAK practitioner track is to share thoughts and findings that stem from learning analytics project implementations. The proceedings of the practitioner track from LAK’16 contain 12 short papers that report on the piloting and deployment of new and emerging learning analytics tools and initiatives.

Papers accepted in 2016 fell into two categories.

  • Practitioner Presentations: Presentation sessions are designed to focus on the deployment of a single learning analytics tool or initiative.
  • Technology Showcase: The Technology Showcase event enables practitioners to demonstrate new and emerging learning analytics technologies that they are piloting or deploying.

Both types of paper are included in the proceedings.

 


Learning analytics expert workshop: Amsterdam

On 15-16 March 2016, I co-ordinated a Learning Analytics Expert Workshop in Amsterdam, jointly run by the LAEP project and the Learning Analytics Community Exchange (LACE).

Fifty people attended the workshop, including invited experts (expert presentations), representatives of current European-funded projects in the field of learning analytics (project presentations), and representatives of the European Commission.

The workshop dealt with the current state of the art in learning analytics, the prospects for the implementation of learning analytics in the next decade, and the potential for European policy to guide and support the take-up and adaptation of learning analytics to enhance education.

The workshop began with a review of current learning analytics work by participants and went on to consider how learning analytics work can be taken forward in Europe (presentation on the LAEP project).

Participants at the workshop identified immediate issues for learning analytics in Europe. They set out considerations to be taken into account when developing learning analytics, made recommendations for learning analytics work in Europe and then identified both short- and long-term policy priorities in the area.

Immediate issues for LA in Europe

Framework for development: A European roadmap for learning analytics development would help us to build and develop a set of interoperable learning analytics tools that are tailored for the needs of Europe and that have been shown to work in practice.

Stakeholder involvement: There is a need to bring different people and stakeholders on board by reaching out to groups including teachers, students, staff, employers and parents. Our current engagement with stakeholders is too limited.

Data protection and surveillance: As legislation changes and individuals become more aware of data use, institutions need to understand their responsibilities and obligations with regard to data privacy and data protection.

Empirical evidence and quality assurance: More empirical evidence is needed about the effects of learning analytics, in order to support a process of quality assurance.

Considerations for the development of LA

  1. Learning analytics can change or reinforce the status quo
  2. Learning analytics should enhance teaching, not replace it
  3. It is our duty to act upon the data we possess
  4. Desirable learning outcomes must be identified
  5. Be clear why we are collecting and analysing data
  6. Bring the data back to the learner
  7. Intelligent systems need human and cultural awareness
  8. Impressive data are not enough

Recommendations for LA work in Europe

  1. Undertake qualitative studies to understand how learning analytics can be aligned with the perceived purpose of education in different contexts, and which aspects of different educational contexts will support or constrain the use of learning analytics.
  2. Publicise existing evaluation frameworks for learning analytics and develop case studies that can be used to enrich and refine these frameworks.
  3. Develop forms of quality assurance for learning analytics tools and for the evidence that is shared about these tools.
  4. Identify the limitations of different datasets and analytics and share this information clearly with end users.
  5. Explore ways of combining different datasets to increase the value of learning analytics for learners and teachers.
  6. Extend to different sectors of education the work currently being carried out in the higher education sector to identify the different elements that need to be taken into account when deploying learning analytics.
  7. Develop analytics, and uses for analytics, that delight and empower users.

Short-term policy priorities


Innovative pedagogy: The top priority is novel pedagogy that drives innovation and the use of data to solve practical problems.

Evidence hub: The second priority is to secure continuing funding for a site that brings together evidence of what works and what does not in the field of learning analytics.

Data privacy: Participants considered that a clear statement is needed from privacy commissioners about controls to protect learners, teachers and society.

Orchestration of grants: The European grants system could better support the development of learning analytics if grants were orchestrated around an agreed reference model.

Crowd-sourced funding support: A system should be set up for crowd-sourced funding of the tools teachers need, with EU top-up funding available for successful candidates.

21st-century skills: Focus on developing learning analytics for important skills and competencies that are difficult to measure, particularly 21st-century skills.

Open access standards: Standards need to be put into practice for analytics across Europe, with an open access forum that will enable the creation of standards from practice.

Ambassadors: We need more outreach, with ministries and politicians spreading the word and encouraging local communities and schools to engage.

Long-term policy priorities

Teacher education: The top priority in the longer term is for media competencies and learning analytics knowledge to be built into training for both new and existing teachers.

Decide which problems we want to solve: In order to develop the field of learning analytics we need to have collective discussions on the directions in which we want to go.

Facilitate data amalgamation: More consideration is needed of how to combine data sources to provide multi-faceted insights into the problems we seek to solve.

Identify success cases and methodologies that give us a solid foundation: We need a coordinated approach to quality assurance and to the identification of successful work.

Several accounts of the workshop are available online, dealing with the morning of day one, the afternoon of day one, day one as a whole, the morning of day two, the afternoon of day two and day two as a whole.
