
Background: This paper presents learner perceptions of a pilot Massive Open Online Course (MOOC). Objective: The aim of this study was to explore data collection approaches, specifically semi-structured interviews and the Kirkpatrick evaluation model, to help inform future MOOC evaluations. Methods: 191 learners joined two course runs of a limited trial of the MOOC. Seven learners volunteered to be interviewed for the study. The study design drew on semi-structured interviews with 2 learners, transcribed and analysed using Braun and Clarke's method for thematic coding. This limited participant set was used to identify how the Kirkpatrick evaluation model could be used to evaluate further implementations of the course at scale. Results: The study identified several themes that could be used for further analysis. The themes and sub-themes include: learner background (educational, professional, topic significance), MOOC learning (learning achievement, MOOC application), and MOOC features (MOOC positives, MOOC negatives, networking). There were not sufficient data points to perform a Kirkpatrick evaluation. Conclusions: Semi-structured interviews for MOOC evaluation can provide a valuable in-depth analysis of learners' experience of the course. However, there must be sufficient data sources to complete a Kirkpatrick evaluation and allow for data triangulation. For example, data from pre-course and post-course surveys, quizzes, and/or test results could be used to improve the evaluation methodology. Trial Registration: The evaluation received ethical approval from the Imperial College London Education Ethics Review Process (EERP) (EERP1617-030).


Journal article
JMIR Medical Education
JMIR Publications