Generating actionable predictions regarding MOOC learners' engagement in peer reviews

Gomez-Sanchez, Eduardo
Er, Erkan
Bote-Lorenzo, Miguel L.
Dimitriadis, Yannis
Asensio-Perez, Juan I.
Peer review is one approach to facilitate formative feedback exchange in MOOCs; however, it is often undermined by low participation. To support the effective implementation of peer reviews in MOOCs, this research proposes several predictive models to accurately classify learners according to their expected engagement levels in an upcoming peer-review activity, which offers various pedagogical utilities (e.g. improving peer reviews and collaborative learning activities). Two approaches were used for training the models: in situ learning (in which an engagement indicator available at the time of the predictions is used as a proxy label to train a model within the same course) and transfer across courses (in which a model is trained using labels obtained from past course data). These techniques made it possible to produce predictions that are actionable by the instructor while the course is still running, which is not possible with post-hoc approaches requiring the use of true labels. According to the results, both the transfer-across-courses and in situ learning approaches produced predictions that were actionable yet as accurate as those obtained with cross-validation, suggesting that they deserve further attention to create impact in MOOCs through real-world interventions. Potential pedagogical uses of the predictions are illustrated with several examples.
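The two training set-ups described in the abstract can be contrasted with a minimal sketch. This is not the authors' code: the features, the synthetic course data, and the logistic-regression classifier are illustrative assumptions; the point is only the difference in where the training labels come from.

```python
# Hedged sketch of the two training approaches: transfer across courses
# (train on a past course's true labels) vs. in situ learning (train on a
# proxy label available in the current course before the activity starts).
# All data here is synthetic; feature semantics are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_course(n=500):
    # Three clickstream-style features (e.g. logins, video views, forum posts)
    # and a binary engagement label correlated with them.
    X = rng.normal(size=(n, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)
    return X, y

X_past, y_past = make_course()          # completed course: true labels known
X_current, y_current = make_course()    # running course: true labels unknown yet

# Transfer across courses: fit on the past course, predict for the current one.
transfer_model = LogisticRegression().fit(X_past, y_past)
transfer_acc = transfer_model.score(X_current, y_current)

# In situ learning: no true labels yet, so train on a proxy label derived
# from an engagement indicator already observable in the current course.
proxy = (X_current[:, 0] > 0).astype(int)
insitu_model = LogisticRegression().fit(X_current, proxy)
insitu_acc = insitu_model.score(X_current, y_current)

print(f"transfer accuracy: {transfer_acc:.2f}")
print(f"in situ (proxy-trained) accuracy: {insitu_acc:.2f}")
```

In both set-ups the model is fit without ever touching the current course's true labels, which is what makes the resulting predictions actionable while the course is still running; `y_current` is used above only to evaluate the sketch after the fact.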


Collaborative peer feedback and learning analytics: theory-oriented design for supporting class-wide interventions
Dimitriadis, Yannis; Er, Erkan; Gasevic, Dragan (2021-02-01)
Although dialogue can augment the impact of feedback on student learning, dialogic feedback is rarely affordable for instructors teaching large classes. In this regard, peer feedback can offer a scalable and effective solution. However, existing practices optimistically rely on students' discussion about feedback and lack a systematic design approach. In this paper, we propose a theoretical framework of collaborative peer feedback which structures feedback dialogue into three distinct phases and outlines the ...
Capturing Peer Research Mentoring Experience in Undergraduate Education: A Qualitative Case Study / Lisans Eğitiminde Akran Araştırma Danışmanlığı Deneyimi: Nitel Bir Durum Çalışması
Can, Iclal; Burakgazi, Sevinç Gelmez; Çapa Aydın, Yeşim; Coşkun, Muhammet (2022-1-01)
The purpose of this study was to capture the peer research mentoring experience of pre-service school counselors who were involved in a Peer Research Mentoring Program deployed in an undergraduate-level Research Methods course. We used a qualitative case study design and employed criterion sampling to recruit 10 pre-service school counselors who had completed the Research Methods course at an international university in Northern Cyprus as research mentors. We paired up the research mentors with...
Informing the Design of Collaborative Activities in MOOCs using Actionable Predictions
Er, Erkan; Gomez-Sanchez, Eduardo; Bote-Lorenzo, Miguel L.; Asensio-Perez, Juan I.; Dimitriadis, Yannis (2019-01-01)
With the aim of supporting instructional designers in setting up collaborative learning activities in MOOCs, this paper derives prediction models for student participation in group discussions. The salient feature of these models is that they are built using only data prior to the learning activity, and can thus provide actionable predictions, as opposed to post-hoc approaches common in the MOOC literature. Some learning design scenarios that make use of this actionable information are illustrated.
Assessment of web-based courses: a discussion and analysis of learners individual difference and teaching-learning process
Gülbahar, Yasemin; Yıldırım, İbrahim Soner; Department of Computer Education and Instructional Technology (2002)
This study examined the role of individual differences and the quality of the teaching-learning process on learning outcomes in a web-based instructional environment, and explored the implications of these variables for the design, delivery and evaluation stages of web-based instruction. The subjects of this study were the students of two web-supported traditional courses, one undergraduate and the other graduate, offered by the Computer Education and Instructional Technologies Department of METU. Fort...
End User Evaluation of the FAIR4Health Data Curation Tool
Gencturk, Mert; Teoman, Alper; Alvarez-Romero, Celia; Martinez-Garcia, Alicia; Parra-Calderon, Carlos Luis; Poblador-Plou, Beatriz; Löbe, Matthias; Sinaci, A Anil (2021-05-27)
The aim of this study is to build an evaluation framework for the user-centric testing of the Data Curation Tool. The tool was developed in the scope of the FAIR4Health project to make health data FAIR by transforming them from legacy formats into a Common Data Model based on HL7 FHIR. The end-user evaluation framework was built by following a methodology inspired by the Delphi method. We applied a series of questionnaires to a group of experts not only in different roles and skills, but also from various...
Citation Formats
E. Gomez-Sanchez, E. Er, M. L. Bote-Lorenzo, Y. Dimitriadis, and J. I. Asensio-Perez, “Generating actionable predictions regarding MOOC learners’ engagement in peer reviews,” BEHAVIOUR & INFORMATION TECHNOLOGY, pp. 1356–1373, 2020, Accessed: 00, 2021. [Online]. Available: