Learning Analytics – full article

Learning analytics is defined as “the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” [1]. “It is about collecting traces that learners leave behind and using those traces to improve learning” [2].

The analysis of education data can be of great value to various groups at universities [3]:

  • Students can view information on their own study habits and compare it with that of other students. They can also benchmark themselves against empirical values and observe their own progress.
  • Lecturers can see how, and how often, their course material is used. For example, they can deduce what material is preferred, and when students take part in which activities.
  • The educational institution gains insight into the use of the educational software purchased.
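The student-facing comparison described above can be sketched in a few lines. Everything here is illustrative: the student names, the activity metric (weekly forum posts), and the counts are invented, not drawn from any real platform.

```python
from statistics import mean

# Hypothetical weekly forum-post counts per student (invented values,
# standing in for whatever activity metric a platform actually records).
activity = {"anna": 3, "ben": 7, "carla": 5, "dora": 1, "emil": 4}

def compare_to_peers(student, counts):
    """Return a student's own count and the mean across all other students."""
    own = counts[student]
    peers = [v for name, v in counts.items() if name != student]
    return own, mean(peers)

own, peer_avg = compare_to_peers("anna", activity)
print(f"anna: {own} posts vs. peer average {peer_avg:.2f}")
```

A real deployment would of course aggregate peer data so that no individual classmate is identifiable, which connects directly to the data-protection concerns discussed below.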

Learning analytics is described in the current NMC Horizon Report [4] as a tool which increases our understanding of teaching and learning (more on this in [3]). It produces teaching which is better aligned with learners' needs and facilitates a more rapid response to problems.

There are two problems with conducting data analytics at traditional (as opposed to virtual) universities in particular. First, not all student learning activities can be recorded (e.g., study sessions or exercises). Second, even for recorded learning activities it is often difficult or impossible to connect the available data. In addition, caution is always required when data is interpreted. An example: a lecturer sees that 80% of her students download the slides only an hour before the lecture. She therefore decides to make them available only 2 hours before the lecture in future. In doing so, however, she penalises precisely those students who prepare conscientiously and in good time.
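The slide-download example amounts to a small log analysis. The sketch below assumes a hypothetical download log of (student, timestamp) pairs; the format, values, and lecture time are all invented for illustration and are not an actual Moodle export.

```python
from datetime import datetime, timedelta

# Hypothetical download log: (student_id, download timestamp).
# All entries are invented for illustration.
downloads = [
    ("s1", datetime(2014, 3, 10, 9, 15)),  # 45 min before the lecture
    ("s2", datetime(2014, 3, 10, 9, 40)),  # 20 min before the lecture
    ("s3", datetime(2014, 3, 10, 9, 50)),  # 10 min before the lecture
    ("s4", datetime(2014, 3, 3, 18, 0)),   # a week in advance
    ("s5", datetime(2014, 3, 10, 9, 30)),  # 30 min before the lecture
]

lecture_start = datetime(2014, 3, 10, 10, 0)

def share_last_minute(log, lecture, window=timedelta(hours=1)):
    """Fraction of downloads that occurred within `window` before the lecture."""
    last_minute = [s for s, t in log if lecture - window <= t < lecture]
    return len(last_minute) / len(log)

print(share_last_minute(downloads, lecture_start))  # 4 of 5 downloads -> 0.8
```

Note what the number alone cannot tell the lecturer: whether last-minute downloaders are unprepared or simply fetching a fresh copy — exactly the interpretation trap described above.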

A further problem area which is not to be underestimated is data protection: tracking student data touches on the right to privacy. Here the legal situation differs from country to country; in some circumstances, evaluation requires the written permission of students. At the very least, transparency should be sought: the snooping scandals of the last few months have greatly increased mistrust. A good approach is described in [6].

There is still the hope of increasing knowledge significantly. However, there is also a question of how much radical new information big data really provides. Melanie Booth explains this neatly: “Learning analytics may hold great promise as a way to support learning assessment and as a higher education ‘movement.’ The potential of learning analytics to combine information from multiple and disparate sources, to foster more-effective learning conditions in real time, and to enable multiple focal points for analysis and improvement is enticing. However, even though learning analytics offers powerful tools and practices to improve the work of learning and assessment, well-considered principles and propositions for learning assessment should inform its careful adoption and use. Otherwise, learning analytics risks becoming a reductionist approach for measuring a bunch of ‘stuff’ that ultimately doesn’t matter.” [5]

What is decisive is therefore not technical feasibility (much data is easily accessible), but asking the right questions of the data. Many of these questions can already be answered with today's tools or through didactic research; a good example is the question of whether frequent repetition leads to better learning.

At ETH, analyses of teaching feed into planning in various units. Foremost among these is the comprehensive evaluation of teaching, whose results are, of course, viewable only by the persons concerned and the study programme coordinators. Faculty also evaluate learning data from individual courses: for example, MOOC teaching sequences and learning activities are analysed in Moodle [7]. A superordinate analysis (especially one combining various sources) at the institutional or course level is not currently performed, firstly on grounds of data protection and secondly because the results are generally not worth the effort. However, ETH would benefit from tools that make meaningful data easily obtainable from its platforms.

Possibilities for students to compare (for example) their study habits with those of their peers are at present restricted to individual courses. A wider offering would be desirable.

Decisions on whether and how big data should be evaluated at the institutional level must be made at the political level.

Keywords: Learning analytics, big data


[1] 1st International Conference on Learning Analytics and Knowledge, Banff, Alberta, February 27–March 1, 2011, as cited in George Siemens and Phil Long, “Penetrating the Fog: Analytics in Learning and Education,” EDUCAUSE Review, vol. 46, no. 5 (September/October 2011).

[2] Erik Duval: Learning Analytics and Educational Data Mining, Erik Duval’s Weblog, 30 January 2012, https://erikduval.wordpress.com/2012/01/30/learning-analytics-and-educational-data-mining/

[3] Surf, the collaborative organisation for ICT in Dutch higher education and research: https://www.surf.nl/en/themes/learning-and-testing/learning-analytics/index.html

[4] New Media Consortium and EDUCAUSE Learning Initiative, NMC Horizon Report: 2012 Higher Education Edition, p. 22.

[5] Melanie Booth, Learning analytics: The new black: http://www.educause.edu/ero/article/learning-analytics-new-black.

[6] Stephan Göldi: Ist Learning Analytics wirklich neu? http://esomea.goeldi.org/2012/06/08/ist-learning-analytics-wirklich-neu/

[7] http://blogs.ethz.ch/refreshteaching/dates-topics/learning-analytics/
