All too often, students' responses to feedback achieve very little. It is worth asking: is my feedback working?
At the end of a topic my class will typically sit some form of short test. As you mark, you quickly come to realise which questions students are performing well on and which they are not, revealing both overall trends and individual ones. If many students have struggled with a particular question, there are a few possible causes:
- My teaching of that area
- Difficulty in accessing the question (e.g. it is a poorly written question or they are not well-prepared to tackle the style of question they’ve been given)
- The concept is simply a tricky one and the question was hard as it required them to apply their understanding in a complex way
Individually, students will struggle with different things, for any of the reasons above, and perhaps for further reasons such as the way the student has studied (which can be influenced by oh so many factors). It is our job to understand why students have not achieved more highly and to support their development in these areas. It is our students' job to reflect on their achievement, to take responsibility and to do their best to get even better.
What feedback is worth giving?
I guess by “is my feedback working?” in the title of the post I mean, “do the students learn from their mistakes?” or “do the students do better next time?” I feel it is important that I help my students develop an understanding of where they have gone wrong, why, and how they could study to improve. But that’s hard. Really hard. While I am about to run through a few of my ideas, I am not claiming that they work or that I’ve performed a detailed school-wide research study to show it – I want you to decide whether these things are worth trying and whether they could be effective for you. These ideas apply across subjects, though some would need tweaking, so apologies for the sciencey slant.
*Note: everything here uses mail merge from my Excel markbook, so the time spent on each is very reasonable.
I tend to do the following:
I always do question-by-question analysis, as it makes clear to me how the group has performed overall and how individuals have done, but I also let students see their own analysis and the class average. The class average, in my view, helps contextualise the test: tests are not all magically of the same difficulty, so achieving 80% in two tests in a row might be in line with the average for one test and well above or below it for another.
Does it help students learn from their mistakes? Anecdotally, they seem to respond well to knowing whether the test was ‘easy’ or ‘hard’ and many suggest they like knowing how they have done compared to averages in specific questions as they can identify what they have slipped up on that others haven’t (and for those that have done well, they can celebrate their successes). That does help some students identify areas they need to work on compared to the group and gives an individual some areas for development. I wonder whether the positive impact outweighs any negative impact it might have on those who have performed poorly across multiple questions?
For my GCSE groups I have a collection of comments for each question, and I mail merge these onto a sheet. Essentially, if someone slips up on a question, a comment for that question appears, and if they have several questions to work on, they will have multiple ‘individual’ comments. They certainly feel as though I have put in a lot of effort to give them individual feedback (even when I tell them it is just a mail merge from a spreadsheet), and it gives them plenty of individual tips and advice on how to improve. Generally these are things they can take on board ‘for next time’.
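A minimal sketch of that mail-merge logic in Python (the comment bank, the marks and the 50% threshold are all my assumptions, not the author's actual setup): each question has one prepared comment, and a student receives the comment for every question they slipped up on.

```python
# Hypothetical comment bank: one prepared comment per question number.
comments = {
    1: "Revisit balancing equations; practise the examples from the topic booklet.",
    2: "Re-read the command words 'describe' vs 'explain' and check what each demands.",
    3: "Write one clear point per mark available - your answers need more detail.",
}

def feedback_for(scores, max_marks, threshold=0.5):
    """Return the comment for every question scoring below the threshold fraction."""
    return [
        comments[q]
        for q in comments
        if scores[q] / max_marks[q] < threshold
    ]

# One student's marks (question number -> mark awarded) and the maxima.
scores = {1: 1, 2: 3, 3: 0}
max_marks = {1: 3, 2: 3, 3: 4}

for line in feedback_for(scores, max_marks):
    print("-", line)
```

In practice the same effect is achieved with conditional fields in a Word mail merge fed from the Excel markbook; the sketch just makes the rule explicit: one canned comment per weak question, stacked into a personalised sheet.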
As with everything in this post, I have no real measure of their effectiveness in helping students make improvements. Do they actually use these comments? Does the positive impact of me showing that I care have a significant impact on how motivated they are in my subject? Perhaps they won’t look again at the specific test comments but they will revise more for future tests?
Some students have trends in where they got questions wrong – for example, they may not have stated a correct equation multiple times, they may have misread the command word in question after question, or maybe they repeatedly failed to write enough for the marking points available. Sometimes I use a Response Analysis approach where, for each of the questions, or a chosen selection of difficult questions, students have to read the question and their response and decide on a reason why they got it wrong. By doing this, they may identify trends in what they have struggled with, and it could support their preparation for future tests.
Does this work? Will students take these trends on board? You would need to spend time getting students to use this information to set their own targets, and to review those targets before your next test. This way they are reminded of what they can work on to better prepare themselves; then review again after the test to determine whether there has been any improvement.
This makes a lot of sense for those questions where the whole class has struggled. You can review the content with the whole group, ensure they have an understanding of it, then get them to re-do the question or to tackle a similar example. It would be crazy not to do this before you move on to another topic. I tend to get them to tackle a similar question as it avoids a learned response being repeated; doing something different encourages them to think harder and demonstrate their understanding rather than simply use their memory skills.
Does it work? How do you make feedback stick?
The ideal scenario is that you give students some feedback, they take it on board, they refer to it when studying in the future, and they review their performance in subsequent assessments against the areas for development the feedback identified. This generally does not happen magically, particularly in the younger years.
The real question is: how do I ensure students do this? It’s not about this ‘reacting to feedback’ thing that people say. A lot of my students react to my feedback because I tell them they have to respond to it. This doesn’t necessarily make them learn more (though I hope my feedback is written in such a way as to expect it), and it absolutely doesn’t turn them into reflective learners who will regularly consider my feedback in order to make long-term improvements to their work. That comes from giving students high-quality feedback; from students setting relevant targets, written in such a way that the same logic can be applied to future assessments; and from regularly referring to these in lessons, giving time to review them before an assessment and to reflect on them once again afterwards. If the feedback and targets don’t allow that, then, apart from a few of our students…you’ve really wasted your time.
My latest efforts
Something I have tried with my Year 10s, who I have given a lot of feedback on tests and homework, is to summarise all of the comments and assessment results into one double-sided sheet with a space to write in targets. They’ll keep this and we will refer to it regularly so as to develop the skills and knowledge they need, based on the targets they set following my feedback. It’ll be interesting to see whether it actually improves their performance in their chosen areas.
This is nothing new, but I think it is important to be careful of thinking that students responding to feedback actually means they’re doing something useful. Let me know your thoughts.
4 Replies to “Is my feedback working?”
Big fan of the idea of showing class averages for questions; students are always desperate to know how they compare to the rest of the class, and it’s invaluable for the teacher in planning follow-up activities.
How long did it take you to set most of this up? Generating spreadsheets and mail merges seems like a great way of easing the process of giving feedback, but I always prioritise other tasks for fear of not getting more ‘important’ things sorted in time.
Hope all is well!
Thanks for the comment Michael!
I think the averages are an important part of the process: they help build some resilience for those who have performed below what they had hoped, and, as you say, they help us in preparing the follow-up.
Time-wise, if it’s just question-by-question analysis it takes 10–15 minutes, and I just copy and paste the previous feedback sheet and amend the fields. If it’s comments too, that’s another 10–15 – especially as they’re copied, pasted and amended depending on the questions each student got wrong. If it’s the response analysis or the difficult-question work, that takes a variable amount of time – maybe 10 minutes to come up with some reasons or alternative difficult questions.
The summary sheet of everything was just a mail merge of all the feedback into one, so it only takes a few minutes to format. I don’t mark their books, ever, so this uses the time I’ve saved! Then I have some time left over for a coffee 😉
With the right support I think it’s a great method for encouraging improvement and getting students to take responsibility for their own learning. This is definitely a system that can help to highlight and solve problems early on, looking topic by topic at what level of understanding a student has reached. I can particularly see how it works at the upper secondary level, when students are more autonomous and must be self-motivated, but is this a system that you feel could be put in place effectively in the lower school, or in a primary school setting? And how effective is it with students who don’t have the motivation to improve based on this feedback?
Thanks for the comment Kate. I still find feedback a challenging area to think about, and I still question regularly how effective mine is! I think it all revolves around whether students are genuinely engaging with the feedback. If the feedback is great, a student response task should be enough to create that engagement (for example, getting them to complete a rephrased question with some hints in Maths, or to rewrite a sentence or paragraph with some hints in most subjects). This is essentially what you’ve pointed out by mentioning upper secondary and those who are more autonomous. It’s that self-motivation that causes them to engage, and if you have the right culture in your school/community/classroom you can certainly encourage that approach lower down the school. If students are generally uncaring towards their studies, perhaps because of an issue with whole-school engagement, then there are some bigger questions and more serious hurdles to tackle first. Still, having good feedback in place and high expectations of students is always a good thing.
Primary…well, I’m no expert, but if you want students to make specific improvements where written feedback could provide some of the support, this is one way. It works for my Year 7s, which is not too far removed from Year 6!
And with those who aren’t motivated, it at least seems to suggest to them that you care about how they do. Students are often excited about the sheet because it is personalised and they seem to appreciate that you have worked hard to feed back to them. That certainly helps, and if the feedback is targeted and accessible then students are more inclined to give it a go.