Feedback can be one of the most difficult parts of teaching. Giving students feedback on learning tasks and assessments is obviously paramount to explicit instruction. However, it's time-consuming, and students often don't put the same effort into reading feedback that teachers put into writing it. I have seen some teachers stay late every night, diligently marking and writing comments on every piece of work their students complete. While I have little doubt that feedback can help students, I have often wondered if this practice of marking each assignment is truly a time-efficient strategy. I cannot count the number of times I have seen a student check their mark on a test and throw out the paper before reading the comments. So it raises the questions: How valuable does our feedback tend to be? And how can we best use it?
In this article, I will review three meta-analyses to try to answer these questions:
The first meta-analysis, by Fabienne Van der Kleij et al., was published in 2015. This study examined 40 experimental and quasi-experimental studies on feedback in computer-based learning. The authors looked at three types of feedback: providing an explanation, indicating whether an answer was correct, or providing the correct answer. They measured the results using Hedges' g. Their results can be seen below:
These results suggest that feedback is most useful when it's immediate and descriptive, and that feedback that is delayed, or that only indicates whether an answer is correct, is far less helpful. They also suggest that feedback matters most for older students and in math, which seems logical: older students might have a greater sense of self-responsibility, and math can be highly procedural, meaning that small errors can cause much lower outcomes. This study did have two limitations. First, it covered only computer-based learning; second, only 14 effect sizes were for students in grade 12 or lower, which means the results could be unreliable for lower elementary teachers.
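For readers unfamiliar with the effect-size metrics these meta-analyses report, here is a minimal sketch of how a standardized mean difference is computed. The score data below is entirely made up for illustration; Hedges' g is simply Cohen's d with a correction factor for small-sample bias.

```python
import statistics

def cohens_d(treatment, control):
    """Standardized mean difference using the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    var1 = statistics.variance(treatment)
    var2 = statistics.variance(control)
    pooled_sd = (((n1 - 1) * var1 + (n2 - 1) * var2) / (n1 + n2 - 2)) ** 0.5
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

def hedges_g(treatment, control):
    """Cohen's d multiplied by a correction for small-sample upward bias."""
    df = len(treatment) + len(control) - 2
    correction = 1 - 3 / (4 * df - 1)
    return cohens_d(treatment, control) * correction

# Hypothetical test scores (out of 100) for a feedback and a no-feedback group.
feedback_group = [72, 80, 85, 78, 90, 74, 82, 88]
no_feedback_group = [65, 70, 79, 72, 81, 68, 75, 77]

print(round(cohens_d(feedback_group, no_feedback_group), 2))
print(round(hedges_g(feedback_group, no_feedback_group), 2))
```

A meta-analysis pools many such effect sizes across studies; by convention, values around .2 are often read as small effects, .5 as moderate, and .8 as large.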
In 2020, Wisniewski et al. conducted a meta-analysis of 435 studies on feedback. The authors found a mean Cohen's d effect size of .55. However, the authors also included case studies that were of low quality and did not use a control group. When studies without control groups were excluded, the mean effect size was .42. The authors also calculated several interesting moderator-variable results, which can be seen below.
These results help to reinforce that more descriptive feedback is likely to produce better results than less descriptive feedback. They also suggest that students might be more willing to listen to feedback when it comes from a peer. However, this meta-analysis also has several limitations. First, the authors included low-quality studies that inflated the results, which makes the moderator data in the above chart less reliable. Second, the authors did not code for grade or subject, which means we don't know whether these results hold in earlier grades. This is especially problematic when we consider that the Van der Kleij study showed lower results for younger students.
The third meta-analysis, by Fan et al. (2017), looked at 28 studies on homework. It showed that marked homework (r = .51) produced more than double the outcomes of homework that was not marked (r = .22). This meta-analysis also had two limitations: it included non-experimental studies and used a correlational effect size.
The above studies are not definitive, but they do suggest that feedback is important, especially for math and for older students. They also suggest that the purpose of feedback is not to evaluate students, but rather to help them learn via explanations. However, detailed written feedback can be very time-consuming, so I thought I would offer a few practical suggestions to maximize the time efficiency of feedback:
Use Detailed Success Criteria: This front-loads feedback, so students understand the expectations before they start.
Use Self-Assessment and Peer Assessment: These strategies don't just cut down on marking time; they help to draw students' attention to your expectations.
Use Conferencing: In my own practice as a teacher, I love to meet with students one-on-one to discuss areas for feedback, as I find I can talk much faster than I can write, and I can check for understanding as I go.
Use Group Feedback: If you notice a lot of students are making the same mistake, rather than writing the same comment down over and over again, discuss it as a class.
Randomly Sample Questions: If you assign work that includes a lot of practice questions, rather than checking every question, consider checking a random sample of them.
Don't Assign Homework You're Not Marking: The Fan et al. meta-analysis showed minimal benefit from homework that is not marked. So if you don't have the time to mark it, why invest the time in creating it? Instead of giving homework every night, consider giving less frequent but more meaningful homework that you can take the time to mark.
It's Not About Feedback Quantity, It's About Feedback Quality: The above research shows there is little benefit to marking students' work with just a checkmark or an X. So consider giving less total feedback but focusing on providing higher-quality feedback. (I think I need to work on this last one myself; I have definitely focused on quantity over quality in the past.)
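As a small illustration of the random-sampling suggestion above: the idea is simply to draw a handful of question numbers at random each time, so that over the course of a term every question type gets checked without marking all of them. A minimal sketch (the question count and sample size are arbitrary):

```python
import random

def pick_questions_to_mark(total_questions, sample_size, seed=None):
    """Return a sorted random sample of question numbers to check."""
    rng = random.Random(seed)  # optional seed for reproducibility
    return sorted(rng.sample(range(1, total_questions + 1), sample_size))

# e.g., mark 5 of 30 practice questions on a given assignment
print(pick_questions_to_mark(30, 5))
```

Because the sample changes each time, students cannot predict which questions will be checked, which preserves the incentive to attempt all of them.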
Written by Nathaniel Hansford
Last Edited 12/11/2022
Fan, et al. (2017). Homework and students' achievement in math and science: A 30-year meta-analysis, 1986–2015. Educational Research Review, 20, 35–54.
Van der Kleij, F., Feskens, R., & Eggen, T. (2015). Effects of feedback in a computer-based learning environment on students' learning outcomes: A meta-analysis. Review of Educational Research, 85. doi:10.3102/0034654314564881
Wisniewski, B., Zierer, K., & Hattie, J. (2020). The power of feedback revisited: A meta-analysis of educational feedback research. Frontiers in Psychology, 10, 3087. doi:10.3389/fpsyg.2019.03087