I don’t engage very heavily with either ResearchGate or academia.edu, for several reasons:
(1) Time is limited, and there are only so many networks I can engage with
(2) All my work is available via my institutional repository (ORO) or via this blog
(3) Neither ResearchGate nor academia.edu seems to be particularly open about its business model. How are they making money out of my time and my resources?
I thought for a while that ResearchGate might be making money via targeted job ads, but they’re currently suggesting I might be interested in the post of associate dean for veterinary research at Ross University, Saint Kitts and Nevis. As my only qualifications for that job are that I once had a pet cat and I like visiting tropical beaches, I don’t think their targeting algorithms are very sophisticated.
Despite my overall lack of engagement, both sites now know a fair amount about me and my work, and my co-authors often upload papers. This means I sometimes get email updates on my downloads. This week, apparently, my work was downloaded 101 times, with 72 people downloading a technical report on social learning analytics and 16 downloading an editorial that came out this week. I even get a national breakdown of downloads (see pic). In addition to those shown, my work was accessed from Taiwan (3), Italy (3), Canada (2), Finland (2), South Korea (2), New Zealand (2), Indonesia (1), Romania (1) and Ecuador (1). That’s 20 countries this week.
Meanwhile, back at the institutional repository, my work has been downloaded over 1000 times this month. I’m not sure what to make of this. If these figures are typical (I’ve no idea if they’re high or low), then there are an enormous number of scholars out there doing an enormous amount of reading. And it also looks as if the digital divide is growing – I see no African countries at all on that download list, and this reflects my experience at conferences.
I’m pleased that our paper, Learning Analytics in Context: Overcoming the Barriers to Large-Scale Adoption, has just been published by the Journal of Learning Analytics.
The paper begins by looking at why introducing learning analytics within an institution often proves to be difficult. It goes on to set out a framework that offers a step-by-step approach to the introduction of learning analytics, and shows how this can work in practice by focusing on developments in two very different institutions: a distance university in the UK and a university of technology in Australia.
The paper’s authors bring together a wealth of experience that is grounded in strategy, research and practice. Co-authors with a strategic perspective are Belinda Tynan, pro vice chancellor at the UK’s Open University who is leading on the development and roll-out of a programme of learning analytics across the institution, and Shirley Alexander, the deputy vice chancellor taking the lead on developing University of Technology Sydney as a data-intensive university. Leah Macfadyen from the University of British Columbia and Shane Dawson from the University of South Australia bring a research perspective that draws on an intensive study of the roll-out of analytics at an institutional scale, while Doug Clow from The Open University draws on his practitioner experience as a data wrangler, as well as his research experience in this area.
A core goal for most learning analytic projects is to move from small-scale research towards broader institutional implementation, but this introduces a new set of challenges because institutions are stable systems, resistant to change. To avoid failure and maximize success, implementation of learning analytics at scale requires explicit and careful consideration of the entire TEL technology complex: the different groups of people involved, the educational beliefs and practices of those groups, the technologies they use and the specific environments within which they operate. It is crucial to provide not only the analytics and their associated tools, but also to begin with a clear strategic vision, to critically assess institutional culture, to identify potential barriers to adoption, to develop approaches to overcome these and to put in place appropriate forms of support, training and community building. In this paper, we provide tools and case studies that will support educational institutions in deploying learning analytics at scale with the goal of achieving specified learning and teaching objectives. The ROMA Framework offers a step-by-step approach to the institutional implementation of learning analytics and this approach is grounded by case studies of practice from the UK and Australia.
The Open University is advertising six Leverhulme doctoral scholarships in open world learning with a closing date for applications of Monday 9 March 2015. These are full-time, fully funded studentships, leading to a PhD.
One of the named topics is ‘Educator roles in open online courses’ and the description is:
“What roles do educators play in massive open online courses (MOOCs)? How can they be most effective in supporting learners to achieve their learning goals? In these open online settings, teaching is carried out by a team of educators, including academic lead, course presenter, moderator, facilitator and the learners themselves. These roles are still being developed, and there is a pressing need to identify evidence-based good practice. The successful candidate will use data from a range of MOOCs to answer the questions above, and will have opportunities to work with the FutureLearn Academic Network, an international team of MOOC researchers.”
If you are interested in applying, you need to provide a short research proposal explaining how this area fits the overall theme of Open World Learning and how you intend to conduct research on the topic selected. See the website for more specific details about applying.
When putting together an application, you may find it useful to take a look at these two papers: Taking on different roles: how educators position themselves in MOOCs and Innovative pedagogy at massive scale: teaching and learning in MOOCs.
Last weekend, I was with the LACE project team at London’s ExCeL Centre for the BETT Show. For an enormous show with a substantial web presence, BETT is surprisingly cagey about defining itself or explaining what its name means. Fortunately, Wikipedia comes to the rescue: ‘BETT or The Bett Show (formerly known as the British Educational Training and Technology Show) is an annual Trade show in the United Kingdom that showcases the use of information technology in education.’
I was there not as a speaker but to engage in ‘event amplification’ – taking photos and tweeting about the event.
The LACE project had organised two sessions. Doug Clow talked about Creating a Learning Plan for Learning Analytics in the higher-education-focused section of the LearnLive theatre. This was followed, in the secondary area of LearnLive, by a panel discussion, Learning Analytics: Making Learning Better? Both these events were packed out, with standing room only at the back, and people peering in through the doorways.
We followed these with the LACE Annual Meeting, with participants from across Europe getting together to discuss learning analytics over lunch in a nearby restaurant. The discussion was excellent, but I wouldn’t recommend the restaurant.
According to the latest set of Open Research Online (ORO) figures, I now have 15,391 total downloads. This makes me the 46th most downloaded researcher of the thousands on the OU system.
Checking back in my blog, my work had been downloaded 8,780 times last March. The change in the figures suggests my work is downloaded from ORO 80 times a day on average. This seems surprisingly high, and underlines the benefits of having research easily searchable and downloadable online.
Meanwhile, over on Google Scholar, all this downloading activity translates into 768 citations to date, or one citation for every 20 downloads. That rate has remained steady since March. I’m also surprised at that consistency – I would have expected the rate to vary because the download numbers are so very different.
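For what it’s worth, the arithmetic behind those rates is simple enough to sketch. The elapsed-time figure here is an assumption: I’m working back from the quoted 80-downloads-a-day average, since the posts give running totals rather than exact dates.

```python
# Rough arithmetic behind the download and citation figures quoted above.
total_downloads = 15391   # current ORO download total
march_downloads = 8780    # total reported last March
citations = 768           # Google Scholar citations to date

new_downloads = total_downloads - march_downloads   # downloads since March

# At the quoted average of 80 downloads a day, that many downloads
# corresponds to roughly this many days (an implied figure, not a known one):
implied_days = round(new_downloads / 80)

# Downloads per citation – the 'one citation for every 20 downloads' rate:
downloads_per_citation = round(total_downloads / citations)

print(new_downloads, implied_days, downloads_per_citation)
# → 6611 83 20
```

So the steady one-in-twenty ratio is just the citation total tracking the download total; what varies is how quickly both accumulate.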
I’m pleased to see that my thesis has now been downloaded 912 times. Open access makes it so much easier to open doctoral research up to the world, instead of leaving it languishing on the shelves of the author’s family.
Back in 2011, I was part of a group of practitioners and researchers that published a proposal for an open and modularised platform for open learning analytics. In it, we outlined the development of an integrated and extensible toolset that would help academics and organisations to evaluate learner activity, determine needed interventions, and improve advancement of learning opportunities.
Siemens, G., Gašević, D., Haythornthwaite, C., Dawson, S., Buckingham Shum, S., Ferguson, R., Duval, E., Verbert, K., and Baker, R. S. J. d. (2011). Open Learning Analytics: An Integrated and Modularized Platform (Concept Paper). SoLAR.
We moved forward on this idea in spring this year when, following the LAK14 conference, I was invited to spend a weekend on the outskirts of Indianapolis, at the Open Learning Analytics (OLA) summit. One outcome of that event was the identification of domains in which future work may be conducted: open research, institutional strategy and policy issues, and learning sciences/learning design and open standards/open-source software – and ethical issues related to all of these.
At the start of December, there was another meet-up, this time in Europe and organised by the LACE project, together with the Apereo Foundation and the University of Amsterdam. In a room littered with classical sculpture, at Amsterdam’s Allard Pierson Museum, participants from across Europe gave presentations on their interests in, and vision for, Open Learning Analytics and its application to education or training.
- Niall Sclater, from JISC in the UK, talked about developing the infrastructure for basic learning analytics systems.
- Thieme Hennis, from DelftX, talked about an open research framework – a standard instrument for collaborative research that can give access to data, people and instruments.
- Erwin Bomas, from Kennisnet and the LACE project, is working on a system to gain informed consent for access to data.
- Wolfgang Müller, University of Education Weingarten, is focusing on providing analytics to teachers so they can give better formative feedback. The aim is to provide informative data on learning processes.
- Patrick Lynch, University of Hull, introduced Apereo’s work on connecting up data, analytics and data presentation.
- Vladimer Kobayashi is researching ways of linking the skills of graduates with the labour market.
- Adam Cooper is leading a strand of the LACE project that is focused on interoperability and data sharing.
- Niels Smits, VU University Amsterdam, uses cluster analysis to understand student activity on the Blackboard virtual learning environment. This analysis accounts for 50% of the variance in later exam performance.
- Alan Berg, of the Apereo Foundation, talked about the grand challenge of learning analytics.
Following these presentations, we brainstormed ideas for action, exploring objectives for collaborative projects that could be achievable in 2–4 years, the methods to achieve these objectives, and the nature of an Open Learning Analytics Framework as a means of coordinating action. A next step will be to work together on bids to Europe’s Horizon 2020 funding programme in order to make these ideas a reality.