Case study: Peer feedback on student presentations

Dr Flora Cornish
Department of Methodology

Course Name: MY405 – Research Methods for Evaluation in Health, Development and Public Policy

Level: Masters and PhD level

Class size: 30 (2 x seminar groups of 15 each)

Learning objectives

  • Communication
  • Critical Thinking
  • Engagement
  • Ethics/social responsibility

Assessment type: Group & formative

Duration: Immediate

Supportive documentation

If you would like LTI to help you do what Flora did, contact lti.support@lse.ac.uk

Dr Cornish says…

I asked students to provide constructive feedback on their peers’ group presentations. During the seminar presentations, students accessed an evaluation form in real time through their own smartphones, tablets or computers. Teammates allowed us to present rating scales and free-text boxes to students, and to assign particular students to particular groups for feedback. Presenters received average scores on the rating scales, along with the written comments.

The assessment gave students the opportunity to (i) critically assess others’ presentations, becoming more aware of what makes a good presentation, and (ii) receive constructive, evaluative feedback on their own. They got feedback on both presentation style and content.

The best part of the assessment was the written feedback students gave each other. On the rating scales, they tended to be generous, so the scores were not very informative (though they were encouraging, which may be a good thing!). But the written text contained really thoughtful and helpful feedback, such as suggestions for improvement, queries and identification of unclear sections.

Next time I will construct the criteria in discussion with the students. The course is about evaluation methodologies, so it makes perfect sense to discuss the evaluation criteria. I will also be more confident about the implementation. The software is straightforward, but because it was new to all of us, and there was no time to practise with it, I was slightly anxious about it actually working in real time, and did a lot of checking with the students that it was working and that they knew what they were doing. It went completely smoothly, and I’d be more confident next time.

Students participating in this assessment found the technical aspect straightforward. The idea of peer assessment slightly upped the ante for performance anxiety, but they were a small and cohesive group. They engaged positively with the task and appreciated its value. They said that using the rating scales felt a bit superficial, but perhaps that is to be expected of critically minded evaluation students!