Archive for category Analytics

eTwinning: Seville

I am just back from an expert workshop held at the European Commission’s Joint Research Centre (JRC) in Seville.

The EU has a very large database, covering 12 years, related to a Europe-wide project called eTwinning. This project puts teachers in touch with each other across Europe so that they can share ideas and innovation, develop their professional and digital skills and, specifically, join together to develop and carry out projects involving their pupils. The database covers activity and interactions on that platform by many thousands of individual teachers.

The JRC is interested in using this dataset to generate actionable insights that can help teachers and learners across Europe. The expert workshop brought together researchers from across Europe to discuss different ways of doing this. The participants brought many different perspectives to the event – some had worked with the platform for years, some came from Ministries of Education, others had explored large educational datasets in the past or had organised large studies.

Together, we identified different questions that the database could help to answer, and discussed ways in which it could be related to external data sources.


PhD examining: open learner models

On 26 October, I was at the University of Birmingham for the viva of Matthew Johnson. His focus was on ‘The Impact of Technology on Metacognition in Computer-mediated Learning’ and, more specifically, on the use of open learner models.

For those who haven’t encountered open learner models before, they begin with a domain model. This takes a subject area and sets out which knowledge underpins other knowledge. For example, in primary-school mathematics, pupils will struggle to understand multiplication if they haven’t first understood addition, and they will struggle to understand addition if they haven’t first understood number. Multiple-choice tests can be used to assess where a learner is in terms of the domain model. The result of these tests is a learner model, which can be used to make automated decisions about which subject knowledge a student should cover next. An open learner model exposes the logic behind this model to the learner. For example, a learner might wonder why they have been given work to do on simple multiplication, and they could explore the open learner model to find out it was because they had got three specific test questions wrong. This would provide a basis for reflection on their learning and on the subject area, and could also give an opportunity to challenge the learner model.
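To make the idea concrete, here is a minimal, hypothetical sketch in Python of how a domain model, a learner model and the ‘open’ explanation might fit together. The topics, question identifiers and mastery rule are illustrative assumptions of mine, not taken from Matthew’s thesis or from any particular open learner model system.

```python
# Domain model: each topic lists its prerequisite topics (illustrative example).
DOMAIN_MODEL = {
    "number": [],
    "addition": ["number"],
    "multiplication": ["addition"],
}

# Test bank: each multiple-choice question assesses one topic (hypothetical).
QUESTIONS = {
    "q1": "number", "q2": "number",
    "q3": "addition", "q4": "addition",
    "q5": "multiplication", "q6": "multiplication", "q7": "multiplication",
}


def build_learner_model(answers):
    """Derive per-topic evidence from question results (the learner model)."""
    model = {}
    for question, topic in QUESTIONS.items():
        entry = model.setdefault(topic, {"correct": [], "wrong": []})
        entry["correct" if answers.get(question) else "wrong"].append(question)
    return model


def mastered(model, topic):
    """Simple assumed mastery rule: some correct evidence and no wrong answers."""
    entry = model.get(topic, {"correct": [], "wrong": []})
    return entry["correct"] and not entry["wrong"]


def next_topic(model):
    """Recommend the first topic that is not yet mastered but whose prerequisites are."""
    for topic, prereqs in DOMAIN_MODEL.items():
        if not mastered(model, topic) and all(mastered(model, p) for p in prereqs):
            return topic
    return None


def explain(model, topic):
    """The 'open' part: expose to the learner the evidence behind the recommendation."""
    wrong = model.get(topic, {}).get("wrong", [])
    if wrong:
        return f"You were given work on '{topic}' because you answered {', '.join(wrong)} incorrectly."
    return f"You were given work on '{topic}' because there is no evidence yet that you have mastered it."


if __name__ == "__main__":
    # A learner who is secure on number and addition but got three multiplication questions wrong.
    answers = {"q1": True, "q2": True, "q3": True, "q4": True,
               "q5": False, "q6": False, "q7": False}
    model = build_learner_model(answers)
    topic = next_topic(model)      # -> 'multiplication'
    print(explain(model, topic))   # shows the learner why that topic was chosen
```

In this sketch, the explanation points the learner at the three specific questions they got wrong, mirroring the example above; a real open learner model would of course use a richer domain model and assessment evidence.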

A thesis abstract remains a work in progress until the final version is printed and agreed, but this paragraph from it gives a sense of what Matthew has been working on:

The thesis finds it is possible to measure metacognition using indirect methods that correspond to post-hoc learner accounts, and that technology does not influence metacognition for all learners. Evidence supports claims that: technology can support elements of interaction important to the regulation of cognition; significant themes of metacognition transfer to OLMs; defining a profile for those identifying as stronger self-assessors is possible; and that OLMs remain relevant in metacognition research.

Matthew was originally supervised by Susan Bull and later by Chris Baber.


Nordic LASI

I visited Bergen in Norway at the end of September to keynote at Nordic LASI. This is one of a series of learning analytics summer institutes run around the world in conjunction with the Society for Learning Analytics Research (SoLAR). The event was well attended, with participants from Russia, Norway, Denmark and Sweden.

Abstract

Learning analytics involve the measurement, collection, analysis and reporting of data about learners and their contexts, in order to understand and optimise learning and the environments in which it occurs. Since emerging as a distinct field in 2011, learning analytics has grown rapidly, and institutions around the world are already developing and deploying these new tools. However, it is not enough for us to develop analytics for our educational systems as they are now – we need to take into account how teaching and learning will take place in the future. The current fast pace of change means that if, in 2007, we had begun developing learning analytics for 2017, we might not have planned specifically for learning with and through social networks (Twitter was only a year old), with smartphones (the first iPhone was released in 2007), or learning at scale (the term MOOC was coined in 2008). By thinking ahead and by consulting with experts, though, we might have come pretty close by taking into account existing work on networked learning, mobile learning and connectivism. This talk will examine ways in which learning analytics could develop in the future, highlighting issues that need to be taken into account. In particular, the learning analytics community needs to work together in order to develop a strong evidence base grounded in both research and practice.


Centre for the Science of Learning & Technology (SLATE)

Last week, I was in the beautiful city of Bergen to visit the SLATE Centre at the university there. SLATE is a global research centre, designed for the advancement of the learning sciences. Its mission is to advance the frontiers of the science of learning and technology through integrated research. I was able to meet many of the team and talk to them about their research.

While at SLATE, I gave a talk about developing a Vision and an Action Plan for learning analytics – and for other educational innovations. SLATE is well placed to make a difference both nationally and internationally, so their vision has the potential to affect tens of thousands of learners in different countries.

Here is SLATE’s account of my talk.

Abstract

The promise of learning analytics is that they will enable us to understand and optimize learning and the environments in which it takes place. The intention is to develop models, algorithms, and processes that can be widely used. In order to do this, we need to help people to move from small-scale initiatives towards large-scale implementation. This is a tough challenge, because educational institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires careful consideration of the entire ‘TEL technology complex’. This complex includes the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use, and the specific environments within which they operate. Providing reliable and trustworthy analytics is just one part of implementing analytics at scale. It is also important to develop a clear strategic vision, assess institutional culture critically, identify potential barriers to adoption, develop approaches that can overcome these, and put in place appropriate forms of support, training, and community building.


Dr Scheffel: examining Maren’s viva

Yesterday I was at the Open University of the Netherlands (OUNL), in Heerlen, as one of the viva examiners for Maren Scheffel. Maren wrote an excellent thesis, The Evaluation Framework for Learning Analytics, gave a strong defence and was awarded her doctorate.

Maren viva

As may be obvious from the picture, vivas in the Netherlands aren’t exactly the same as vivas in the UK. For one thing, the team wear gowns, caps and shirt fronts that make them look as if they have strayed from a painting on the walls of the Rijksmuseum or maybe Hogwarts. Well, not the entire team. You have to have attained professorial status to wear the extremely warm clothing. The reason I look photoshopped in is that, as a lowly doctor, I had to wear normal clothing.

Another difference is the size of the Doctoral Board. In the picture, from left to right, are Professor Delgado Kloos, Professor Griffiths, Professor Drachsler (supervisor), Professor Kalz, (newly declared) Dr Scheffel, Professor Specht (supervisor), me, Professor Brand-Gruwel, and Professor Boshuizen (chair – indicated by the chain around her neck). That’s two internal examiners and three external examiners, two from the UK and one from Spain. For a more informal take on the Board, I have linked all their official titles to their Twitter handles.

The viva takes place in public, in front of family, friends and fellow academics. It is also live-streamed as it takes place, and a recording is presented to the candidate afterwards on a USB stick. The event begins with a short presentation by the candidate on her work, followed by the defence itself.

The decision is made there and then. No stringing it out for months of corrections and bureaucracy as in the UK. There is a clear point for celebration. The announcement is made, the signed certificate is formally handed over, the candidate is formally addressed as doctor for the first time, and then it is time for happiness, congratulations and a reception.

This also means that the candidate can ceremonially be sworn in. The main supervisor says:

By virtue of the powers vested in us by Dutch law, in accordance with the decision of the Doctorate Board, I confer on you, Maren Scheffel, the title of doctor and all the rights and all duties to science and society associated by Dutch law or custom to a PhD degree at the Open University of the Netherlands. Do you promise to work in accordance with the principles of academic integrity at all times, to be careful and honest, critical and transparent, independent and impartial?

I like this formal indication that the award of doctor is not just an honour – it is associated with responsibilities and with standards of behaviour.

I also like the appearance of the thesis as a formal document. It doesn’t appear as a large, unwieldy hardback tome, bound at the student’s expense, as it does in the UK. Instead, it is an attractive paperback book, available in advance of the viva. A book you would want to read, rather than a decorative item to sit on a shelf.

Of course, to be available in print before the viva, the thesis must already be done and dusted. While I like all the differences between the UK and Dutch procedure that I have mentioned above, this one seems strange. I’m used to the examiners having some influence on the thesis. The Dutch system is more akin to our PhD by publication. Most elements of it have already appeared in peer-reviewed journals, and the thesis links and supplements these in a coherent manuscript, which is checked by the supervisors. So the work of assessment is done by the peer reviewers, without their awareness, and by the supervisors. The Doctoral Board and the viva serve to validate a decision that has already been made. The examiners’ first job is to decide whether the thesis, as presented, is ready for submission. There is no option to suggest corrections or amendments – it is either ready to go or it isn’t. If it is, then the viva is largely a formality. There is a formal meeting after the defence, but the situation would have to be very extreme for the doctorate not to be granted at that point.

Another aspect that seems strange from the point of view of a UK academic is how the defence takes place. In the UK, this takes as long as it takes. An hour, two, maybe even three. Yesterday, the time was defined in advance. The defence was to begin at 1.45pm. At 2.30pm the beadle (also in a gown) comes to the front of the room, pounds the ceremonial mace on the floor and declares ‘hora est!’ The candidate can finish a sentence at that point, but otherwise that is it, the defence is over. With five examiners, that meant nine minutes of questions each, asked in strict rotation, so some of us asked two questions and some only one. When you’ve travelled for eight hours to be there, that means thinking very carefully about which single question will make the journey worthwhile.

And did I mention that the event takes place in English (except for a brief foray into Latin by the beadle)? In day-to-day life, Maren speaks German or Dutch, so she was demonstrating not only her academic prowess and her ability to think on her feet but also her language skills.

If I were coming up with a viva system, it’s not quite how I’d do it (I would prefer to see some amendment of the thesis in the light of the examiners’ feedback), but I do feel that many aspects of the Dutch system are an improvement on our current approach in the UK.

 


Learning analytics – how not to fail

At the LAK17 conference, a group of us held a Failathon workshop and brought its findings to the main conference as a poster. We asked conference-goers to help us to identify ways to avoid failure, and they responded enthusiastically with comments and conversation and sticky notes.

Back at The Open University, Doug Clow and I carried out a lightweight analysis of all the contributions, investigating how experts from around the world proposed to avoid failure.

We pulled the findings together into an article published in Educause Review on 31 July: Learning analytics – avoiding failure.

The article is full of suggestions, but the headline news is presented at the beginning: ‘In order not to fail, it is necessary to have a clear vision of what you want to achieve with learning analytics, a vision that closely aligns with institutional priorities.’


Vital learning analytics

On 17 July, I presented at ‘Analytics in learning and teaching: the role of big data, personalized learning and the future of the teacher’.

This event was held at the University of Central Lancashire (UCLAN) in Preston, and was organised by the VITAL project (Visualisation tools and analytics to monitor language learning and teaching).

My talk was on ‘Learning analytics: planning for the future’.

Abstract

What does the future hold for learning analytics? In terms of Europe’s current priorities for education and training, they will need to support relevant and high-quality knowledge, skills and competences developed throughout lifelong learning. More specifically, they should help improve the quality and efficiency of education and training, enhance creativity and innovation, and focus on learning outcomes in areas such as linguistic abilities, cultural awareness and active citizenship. This is a challenging agenda that requires us to look beyond our immediate priorities and institutional goals. In order to address this agenda, we need to consider how our work fits into the larger picture. Drawing on the outcomes of two recent European studies, Rebecca will discuss how we can develop an action plan that will drive the development of analytics that enhance both learning and teaching.
