This article analyzes the implementation of Socrative, an online student response system, in a first-year Biology Statistics course through a mixed-methods design that combines two quasi-experimental interventions with surveys and a teacher interview. Sixty-six students were divided into experimental and control groups across the two interventions.
Quantitative effects on performance were assessed using independent-samples t-tests, while qualitative perceptions were collected through the Wash questionnaire for students and the Cheung et al. instrument for instructors, complemented by a semi-structured interview. The results show an asymmetrical pattern: significant improvements in the first intervention but not in the second, alongside consistently positive evaluations of motivation, participation, feedback quality, and classroom climate.
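As a concrete illustration of the quantitative analysis, the sketch below runs an independent-samples t-test of the kind reported in the study. The scores are hypothetical placeholders, not the study's data, and the Welch variant (equal_var=False) is an assumption rather than the authors' stated choice.

```python
# Minimal sketch of an independent-samples t-test comparing two groups.
# Scores below are hypothetical placeholders, NOT the study's data.
from scipy import stats

# Hypothetical post-test scores (0-10 scale assumed for illustration)
experimental = [7.2, 8.1, 6.5, 7.8, 8.4, 6.9, 7.5]
control = [6.1, 6.8, 5.9, 7.0, 6.4, 6.6, 5.7]

# Two-sided test; equal_var=False gives the Welch variant, which is
# robust to unequal group variances (an assumption here, not the
# authors' stated choice).
t_stat, p_value = stats.ttest_ind(experimental, control, equal_var=False)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```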
From an applied perspective, the study offers a replicable model for integrating Student Response System (SRS) tools into university courses without reconfiguring the curriculum. Short, time-limited quizzes are administered mid-unit, featuring configurable multiple-choice, true/false, and short-answer items delivered on students’ own devices, with immediate aggregated feedback to adjust instruction.
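The quiz setup described above can be summarized schematically. The sketch below is not Socrative's API; it is a hypothetical data structure whose names (QuizItem, MidUnitQuiz, time_limit_min, aggregate_feedback) are illustrative assumptions based on the design details in the text.

```python
# Hypothetical schematic of the mid-unit quiz configuration described
# above. This is NOT Socrative's API; all names are illustrative.
from dataclasses import dataclass, field

@dataclass
class QuizItem:
    kind: str          # "multiple_choice" | "true_false" | "short_answer"
    prompt: str
    options: list[str] = field(default_factory=list)  # empty for short answers

@dataclass
class MidUnitQuiz:
    time_limit_min: int       # short, time-limited quiz (e.g., 10 minutes)
    aggregate_feedback: bool  # show class-level results immediately
    items: list[QuizItem] = field(default_factory=list)

quiz = MidUnitQuiz(
    time_limit_min=10,
    aggregate_feedback=True,
    items=[
        QuizItem("multiple_choice",
                 "Which test compares the means of two independent groups?",
                 ["Chi-square", "Independent-samples t-test", "ANOVA"]),
        QuizItem("true_false",
                 "A p-value below 0.05 always implies a large effect."),
    ],
)
```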
The evidence suggests that even when summative performance gains are inconsistent, formative benefits remain robust, including more active facilitation by the instructor, greater student engagement, and a stronger perceived sense of learning. For quantitatively demanding subjects, the design details provided (timing, item formats, and contingency planning for connectivity issues) are especially actionable.
Future research should strengthen causal inference and unpack underlying mechanisms. First, extending the number of implementations with randomized or rotated section designs could mitigate self-selection and potential bias due to differences in unit difficulty. Second, it would be valuable to isolate which Socrative features drive the effects—such as immediacy of feedback, anonymity options, or question type—through factorial designs. Third, research should move beyond final grades to include process indicators like retention, transfer, response latency, and error patterns, as well as equity perspectives that identify who benefits most or least depending on prior preparation or digital literacy. Finally, comparisons between Socrative and other SRS platforms, controlling for difficulty and feedback timing, could determine whether the observed benefits are tool-specific or inherent to the broader practice genre.
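To make the factorial-design suggestion concrete, the sketch below runs a hypothetical 2×2 analysis (feedback immediacy × anonymity) with a two-way ANOVA in statsmodels. The data are simulated and the factor levels are assumptions; the point is only to show how such a design would separate the contribution of individual features from their interaction.

```python
# Hypothetical 2x2 factorial analysis (feedback immediacy x anonymity)
# of the kind suggested above. All data are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
n = 30  # students per cell (assumed)
rows = []
for fb in ("immediate", "delayed"):
    for anon in ("anonymous", "named"):
        # Simulated scores; a small bump for immediate feedback stands in
        # for a hypothetical main effect.
        base = 7.0 + (0.5 if fb == "immediate" else 0.0)
        for score in rng.normal(base, 1.0, n):
            rows.append({"feedback": fb, "anonymity": anon, "score": score})
df = pd.DataFrame(rows)

# Two-way ANOVA with interaction: does each feature, or their
# combination, affect quiz performance?
model = smf.ols("score ~ C(feedback) * C(anonymity)", data=df).fit()
print(anova_lm(model, typ=2))
```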
---
How to Cite: Juan Llamas, C., & de la Viuda Serrano, A. (2022). Using Socrative as a Tool to Improve the Teaching-Learning Process in Higher Education. RIED-Revista Iberoamericana de Educación a Distancia, 25(1), 279–297. https://doi.org/10.5944/ried.25.1.31182
