There have been hundreds of journal articles published in the last five years about the use of student response systems in higher education. Most of them present positive results about ‘engagement’ based on feedback from students, but very few provide quantitative evidence of their impact on attainment.

I was, therefore, pleased to find Clickers in the Classroom: A Review and a Replication [Keough, 2012], which provides a review of 66 other studies together with an in-depth study that validated their findings in a Management course. I recommend you read the paper, but Table 2 (Summary of Study Criteria) says it all, really:

Criterion             | Number of samples | Significant positive outcomes
Actual performance    | 34                | 22
Satisfaction          | 47                | 46
Perceived performance | 37                | 35
Attention span        | 25                | 23
Attendance            | 24                | 19 (7)
Participation         | 21                | 20
Feedback              | 15                | 15
Ease of use           | 8                 | 8

Most studies examined multiple outcomes, and the results were predominantly positive. It is worth stressing that these gains come from the active teaching strategies the technology enables: using in-class questions to prompt thinking, discussion and feedback.

More recently, A meta-analysis of the effects of audience response systems (clicker-based technologies) on cognition and affect [Hunsu, Adesope and Bayley, 2015] analysed the results from 53 papers that used an experimental or quasi-experimental design to compare outcomes for a group using an audience response system (ARS) with a control group that did not. Again, I recommend you read the paper for the detail, but the findings can be summarised as follows:

  • clickers have a small positive effect on cognitive learning outcomes. The greatest effect was seen on higher-order learning outcomes such as critical thinking and knowledge application, and there was no effect on lower-order outcomes such as retention of subject knowledge;
  • clickers have a significant positive effect on non-cognitive learning outcomes such as engagement, participation, self-efficacy, attendance and interest in the subject.

“Instructors who would aspire to glean the potential benefits that lie within using clicker-based technologies in the classrooms would need to attentively and strategically develop effective clicker questions and creatively facilitate thoughtful discussions and feedback around such questions. Research suggests that conceptual, and not factual, questions are most effective to aid learning. In order to optimize clicker effects, instructors would not only need to commit to encouraging peer discussion and providing feedback, but such feedback would also need to be constructive and timely.”

Finally, in What’s the Payoff?: Assessing the Efficacy of Student Response Systems [Baumann, Marchetti & Soltoff, 2015] the authors rigorously control for other factors that can affect attainment, such as students’ demographic and socio-economic backgrounds. They find a small but significant impact on students’ grades when the technology is used to facilitate peer learning rather than in-class quizzes:

“Using clickers to promote peer collaboration allows instructors to simultaneously assess current levels of understanding and enables students to use one another as resources to better understand material.”

To conclude, the research evidence strongly supports the significant positive impact of student response systems such as Vevox on a wide range of valuable non-cognitive learning outcomes, including engagement, participation, self-efficacy, attendance and interest in the subject. In contrast, the impact on cognitive learning outcomes and attainment is small, but it can be maximised by using conceptual questions that facilitate peer learning discussions.
