Peer Assessment in Senior Engineering Courses

The following post was co-authored by Dr. Andrea Bradford, Ph.D., P.Eng., Associate Professor, School of Engineering; Dr. Julie Vale, Ph.D., P.Eng., Associate Professor, School of Engineering; and Ms. Samantha Mehltretter, MASc candidate in engineering education, School of Engineering.


Enrollment in a number of senior, design-intensive (0.75 credit) engineering courses has grown to more than 80 students. At this class size, it is very challenging to provide the support required for deep learning of design competencies. In particular, large classes make it infeasible to provide individualized, rich, and robust feedback on design projects, which are often a central component of senior engineering courses. A peer assessment pedagogical model gives students a greater quantity and variety of feedback in a timely manner, and with appropriate training the quality of that feedback can also be improved. Furthermore, reviewing the work of others is an active learning approach that requires higher-order thinking (i.e., the evaluation level of Bloom’s Taxonomy), helps students assess and regulate their own learning, and gives them the opportunity to develop deeper disciplinary knowledge.

Peer assessment was implemented in ENGG*4370 (Urban Water Systems Design) initially in Fall 2016, and then again in Fall 2017 with the support of PSEER. The study aimed to determine what learning benefits were realized, gauge student receptiveness to the approach, recognize barriers to implementation, and identify best practices for using this active learning strategy. Students peer-assessed two term tests as well as three design project deliverables (two short reports and one group presentation). Research Ethics Board approval was obtained for the study, and data were collected through two surveys administered before and after the peer evaluation activities.

In the 2016 cohort, approximately 36% of the class responded to both the pre- and post-term surveys; in 2017, only 17% responded to both. These low response rates unfortunately limit the researchers’ ability to generalize to the entire class. The data obtained, however, still provided some interesting results. For example, the Revised 2-Factor Study Process Questionnaire (Biggs et al., 2001) was incorporated into both surveys to assess the students’ approaches to learning. Neither cohort exhibited a significant change in their deep learning scores over the term, but students on average already showed a stronger tendency towards a deep approach to learning than a surface approach. While the sample size is small, this finding may suggest that other courses within the engineering curriculum are fostering deep learning in students.
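
For readers interested in how such scores are typically derived, the sketch below illustrates, in Python, one way the deep and surface approach subscales of the R-SPQ-2F could be computed and a pre/post comparison run. This is not the study’s analysis code: the item groupings follow the published scoring key and should be verified against Biggs et al. (2001), and the respondent ratings and pre/post scores shown are hypothetical placeholders.

    from scipy import stats

    # Item groupings per the published R-SPQ-2F scoring key
    # (verify against Biggs et al., 2001).
    DEEP_ITEMS = (1, 2, 5, 6, 9, 10, 13, 14, 17, 18)
    SURFACE_ITEMS = (3, 4, 7, 8, 11, 12, 15, 16, 19, 20)

    def subscale_score(responses, items):
        """Sum the 1-5 Likert ratings for the listed item numbers.

        `responses` maps item number (1-20) to a student's rating.
        """
        return sum(responses[i] for i in items)

    # Hypothetical ratings for one respondent (illustrative only).
    one_student = {i: 3 for i in range(1, 21)}
    print("Deep approach score:", subscale_score(one_student, DEEP_ITEMS))
    print("Surface approach score:", subscale_score(one_student, SURFACE_ITEMS))

    # Hypothetical matched pre/post deep-approach scores for students who
    # completed both surveys; a paired t-test checks for a change over the term.
    pre_deep = [34, 29, 31, 38, 27, 33]
    post_deep = [35, 30, 30, 39, 28, 34]
    t_stat, p_value = stats.ttest_rel(pre_deep, post_deep)
    print(f"Paired t-test on deep scores: t = {t_stat:.2f}, p = {p_value:.3f}")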

The students’ perception of the peer assessment approach was also interesting, especially their views regarding learning. Figures 1 and 2 show that student perceptions of peer assessment and its effect on learning were positive. Further, the peer technical feedback was rated “very helpful” or “helpful” by 41% of participants in 2016 and 40% in 2017; similarly, the peer feedback on communication was rated “very helpful” or “helpful” by 38% of participants in 2016 and 67% in 2017. While some questions revealed a more negative view of the peer assessment method (e.g., 48% and 40% of participants in 2016 and 2017, respectively, felt that “[their] responsibilities as a peer took a lot of time”), the overall perception appeared to be positive.

Due to the apparent success of this active learning strategy in ENGG*4370, the peer assessment model is being used in the Fall 2018 offering. Some changes have been made to improve the quality of the study, including efforts to increase the response rate. Furthermore, the study is expanding to include written reflections as well as assessment of student professional skills.

Two pie charts displaying student responses to "how much would you have learned if peer assessment was not part of the course?" in 2016 and 2017. Answers include "Much less", "Less", "Same", "More", and "Much More". The majority of responses in 2016 were "Same", followed by "Less", "More", and finally "Much more". The majority of responses in 2017 were "Less", followed by "Same", "More", and "Much less".
Figure 1: 2016 & 2017 cohort responses to the question "How much would you have learned if peer assessment was not part of the course?"

Two pie charts outlining student responses in 2016 and 2017 to the statement "I found assessing the work of my peers to be a valuable experience". Answers include "Strongly Agree", "Agree", "Neutral", "Disagree", and "Strongly Disagree". The majority of responses in 2016 were "Agree", followed by "Neutral", "Disagree", "Strongly Agree", and "Strongly Disagree". The majority of responses in 2017 were "Agree", followed by "Strongly Agree" and "Neutral".
Figure 2: 2016 & 2017 cohort responses to the statement "I found assessing the work of my peers to be a valuable experience"