A very busy week in Vancouver at the LAK17 (Learning Analytics and Knowledge) conference kicked off with the all-day doctoral consortium on 14 March (funded by SoLAR and the NSF). I joined Bodong Chen and Ani Aghababyan as an organiser this year, and we enjoyed working with the ten talented doctoral students from across the world who gained a place in the consortium.
- Alexander Whitelock-Wainwright: Students’ intentions to use technology in their learning: The effects of internal and external conditions
- Alisa Acosta: The design of learning analytics to support a knowledge community and inquiry approach to secondary science
- Daniele Di Mitri: Digital learning shadow: digital projection, state estimation and cognitive inference for the learning self
- Danielle Hagood: Learning analytics in non-cognitive domains
- Justian Knobbout: Designing a learning analytics capabilities model
- Leif Nelson: The purpose of higher education in the discourse of learning analytics
- Quan Nguyen: Unravelling the dynamics of learning design within and between disciplines in higher education using learning analytics
- Stijn Van Laer: Design guidelines for blended learning environments to support self-regulation: event sequence analysis for investigating learners’ self-regulatory behavior
- Tracie Farrell Frey: Seeking relevance: affordances of learning analytics for self-regulated learning
- Ye Xiong: Write-and-learn: promoting meaningful learning through concept map-based formative feedback on writing assignments
The intention of the doctoral consortium was to support and inspire doctoral students in their ongoing research efforts. The objectives were to:
- Provide a setting for mutual feedback on participants’ current research and guidance on future research directions from a mentor panel
- Create a forum for dialogue aimed at building capacity in the field with respect to current issues in learning analytics, ranging from methods of gathering analytics and interpreting them in relation to learning, to considering ethical issues and conveying the meaning of analytics in ways that impact teaching and learning
- Develop a supportive, multidisciplinary community of learning analytics scholars
- Foster a spirit of collaborative research across countries, institutions and disciplinary backgrounds
- Enhance participating students’ conference experience by connecting participants to other LAK attendees
The report ‘Research Evidence on the Use of Learning Analytics: Implications for Education Policy’ brings together the findings of a literature review; case studies; an inventory of tools, policies and practices; and an expert workshop. It also provides an Action List to guide the work of policymakers, practitioners, researchers and industry members in Europe.
Learning Analytics: Action List
Policy leadership and governance practices
- Develop common visions of learning analytics that address strategic objectives and priorities
- Develop a roadmap for learning analytics within Europe
- Align learning analytics work with different sectors of education
- Develop frameworks that enable the development of analytics
- Assign responsibility for the development of learning analytics within Europe
- Continuously work on reaching common understanding and developing new priorities
Institutional leadership and governance practices
- Create organisational structures to support the use of learning analytics and help educational leaders to implement these changes
- Develop practices that are appropriate to different contexts
- Develop and employ ethical standards, including data protection
Collaboration and networking
- Identify and build on work in related areas and other countries
- Engage stakeholders throughout the process to create learning analytics that have useful features
- Support collaboration with commercial organisations
Teaching and learning practices
- Develop learning analytics that make good use of pedagogy
- Align analytics with assessment practices
Quality assessment and assurance practices
- Develop a robust quality assurance process to ensure the validity and reliability of tools
- Develop evaluation checklists for learning analytics tools
Capacity building
- Identify the skills required in different areas
- Train and support researchers and developers to work in this field
- Train and support educators to use analytics to support achievement
Infrastructure development
- Develop technologies that enable the development of analytics
- Adapt and employ interoperability standards
Other resources related to the LAEP project – including the LAEP Inventory of learning analytics tools, policies and practices – are available on Cloudworks.
The Open University is advertising six Leverhulme doctoral scholarships in open world learning with a closing date for applications of Monday 9 March 2015. These are full-time, fully funded studentships, leading to a PhD.
One of the named topics is ‘Educator roles in open online courses’ and the description is:
“What roles do educators play in massive open online courses (MOOCs)? How can they be most effective in supporting learners to achieve their learning goals? In these open online settings, teaching is carried out by a team of educators, including academic lead, course presenter, moderator, facilitator and the learners themselves. These roles are still being developed, and there is a pressing need to identify evidence-based good practice. The successful candidate will use data from a range of MOOCs to answer the questions above, and will have opportunities to work with the FutureLearn Academic Network, an international team of MOOC researchers.”
If you are interested in applying, you need to provide a short research proposal explaining how this area fits the overall theme of Open World Learning and how you intend to conduct research on the topic selected. See the website for more specific details about applying.
When putting together an application, you may find it useful to take a look at these two papers: ‘Taking on different roles: how educators position themselves in MOOCs’ and ‘Innovative pedagogy at massive scale: teaching and learning in MOOCs’.
On 16-17 September, I was in Graz with the Learning Analytics Community Exchange (LACE). Before our consortium meeting, we held the 1st Learning Analytics Data Sharing Workshop. This brought people together from across Europe to discuss possibilities for data sharing.
The workshop was designed to act as a bridge between research and practical action. It also dealt with the technical, operational, business, policy and governance challenges involved with data sharing – with a particular focus on privacy issues.
The workshop was followed by a consortium meeting, where we made plans to develop this Europe-wide learning analytics community further.
From 20-22 January, I was in Brussels for the kick-off meeting of the Learning Analytics Community Exchange (LACE).
The LACE project brings together existing key European players in the field of learning analytics and educational data mining (EDM), who are committed to building communities of practice and sharing emerging best practice in order to make progress towards four objectives:
1. Promote knowledge creation and exchange
2. Increase the evidence base
3. Contribute to the definition of future directions
4. Build consensus on interoperability and data sharing
This will involve organising a range of activities designed to bring together people carrying out, or making use of, learning analytics and EDM research and development. LACE will also develop an ‘evidence hub’ that will bring together a knowledge base of evidence in the field, and members will explore plausible futures for the field. The project partners are:
- Open Universiteit Nederland, Netherlands
- Cetis, the Centre for Educational Technology and Interoperability Standards at the University of Bolton, UK
- The Open University, UK
- Infinity Technology Solutions, Italy
- Skolverket, the Swedish National Agency for Education, Sweden
- Høgskolen i Oslo og Akershus, Norway
- ATiT, Audiovisual Technologies, Informatics and Telecommunications, Belgium
- EDEN, the European Distance Education Network, Hungary
While at the Learning Analytics Summer Institute at Stanford, I participated in a panel on Analytics for 21st-century Skills. The panel was chaired by Caroline Haythornthwaite and my fellow panellists were Ruth Deakin-Crick (University of Bristol) and Peter Foltz (Pearson).
My section of the panel focused on our work with EnquiryBlogger. This tool, built on the WordPress blogging platform, can be used to help structure knowledge construction, and to reflect on the emotions and dispositions that form part of the learning process.