Assessing Student Feedback Literacy
by Aaron Trocki
March 11, 2025

In the last blog post, I discussed my initial plans for feedback this semester. A few insights have helped me to plan effective feedback strategies. The feedback loop (Clark and Talbert 2023) structures students’ engagement with the feedback they receive so that they make changes in their learning approaches and then reflect on the effectiveness of those changes. The feedback loop has helped me to think about feedback use in cycles, in contrast to feedback approaches that simply justify a mark or grade. Wood’s description of learning outcomes that are specific and based on operational verbs (Driscoll et al. 2021) has helped me reconsider how explicitly I convey my expectations for how students should demonstrate their learning. Finally, the construct of feedback literacy, which Carless and Boud (2018) define as “the understandings, capacities and dispositions needed to make sense of information and use it to enhance work or learning strategies” (1316), made me think about how to promote this type of literacy throughout the semester.

The Feedback Literacy Behavior Scale

At the time of this writing, we are in the third full week of the semester, and I am in the early stages of enacting my plan to enhance feedback strategies and practices for my students. As the semester began, I wanted to learn about my students’ prior experiences with feedback. I found Dawson et al.’s (2023) article, “Measuring What Learners Do in Feedback: The Feedback Literacy Behaviour Scale,” which introduces a feedback literacy behavior scale (FLBS). This self-report instrument is intended to measure students’ behaviors related to feedback rather than their perceptions or orientations. The scale is organized around five factors:

1) Seek feedback information
2) Make sense of information
3) Use feedback information
4) Provide feedback information
5) Manage affect

To use the FLBS, students rate how often they exhibit particular behaviors associated with each factor on the following scale: (1) never, (2) almost never, (3) rarely, (4) sometimes, (5) almost always, (6) always. You can learn more about how the FLBS was developed and how the authors’ work has progressed at feedbackliteracy.org. The FLBS is freely available under a Creative Commons Attribution (CC-BY) license.

Student Responses

In the first week of the semester, I invited each of my students in one undergraduate calculus course to anonymously complete the FLBS. I added the following two open-ended prompts to the end of the FLBS to give students space to explain their perceptions of feedback practices:
1. Explain what kind of feedback you have received from your professors during your college experience. Did the feedback help you learn?
2. What could your professors do to improve the feedback they give you?

My goal was to assess my students’ current levels of feedback literacy along with their perceptions of feedback. I planned to use the information gained to better meet their learning needs regarding feedback practices as the semester progressed. This research was approved by my institution’s IRB.

I have 23 students in this class, and 20 agreed to complete the FLBS. I analyzed the student responses by calculating average ratings on each subsection of the FLBS (e.g., factor 1: Seek feedback information) and the overall average rating for each student. Then, I calculated average ratings on each subsection of the FLBS and the overall average rating for the group of 20 students. Results for the FLBS, with factor labels and descriptions, are summarized in the table below.

FLBS Factor and Description | Student Average
1) Seek feedback information: eliciting feedback information from a variety of sources, including one’s own notions of quality and examples of good work | 4.63
2) Make sense of information: processing, evaluating, and interpreting feedback information | 4.84
3) Use feedback information: putting feedback information into action to improve the quality of current and/or future work | 4.54
4) Provide feedback information: considering the work of others and making comments about its quality | 4.72
5) Manage affect: persisting in feedback processes despite the emotional challenges they pose | 5.31
Overall student average | 4.81
Table model adapted from Dawson et al. 2023

I was somewhat surprised to find that the factor averages were this high, yielding an overall average of 4.81. Factor 3 (Use feedback information) had the lowest average, and factor 5 (Manage affect) had the highest. Based on these findings, I suspected that my students needed the most help with seeking (factor 1) and using (factor 3) feedback information. I interpreted this result as students needing guidance with how to start and close the feedback loop (Clark and Talbert 2023).
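For readers curious about the arithmetic behind the table, the sketch below shows one way the per-factor and group averages could be computed from anonymized responses. It is a minimal illustration only: the item-to-factor groupings, the sample ratings, and the function names are hypothetical, and this is not the FLBS authors’ scoring procedure or the data from my class.

```python
# Minimal sketch of the averaging described above (hypothetical data and groupings).
# Each rating uses the FLBS frequency scale from 1 (never) to 6 (always).

from statistics import mean

# Hypothetical responses: one dict per student, mapping each factor to that
# student's ratings on the items grouped under it.
students = [
    {
        "Seek feedback information": [5, 4, 5],
        "Make sense of information": [5, 5, 4],
        "Use feedback information": [4, 4, 5],
        "Provide feedback information": [5, 5, 5],
        "Manage affect": [6, 5, 5],
    },
    {
        "Seek feedback information": [4, 4, 4],
        "Make sense of information": [5, 4, 5],
        "Use feedback information": [4, 5, 4],
        "Provide feedback information": [4, 5, 4],
        "Manage affect": [5, 6, 5],
    },
]

def factor_averages(student):
    """Average rating on each factor for one student."""
    return {factor: mean(ratings) for factor, ratings in student.items()}

def overall_average(student):
    """Overall average rating across all items for one student."""
    return mean(r for ratings in student.values() for r in ratings)

# Step 1: per-student averages (by factor and overall).
per_student = [factor_averages(s) for s in students]

# Step 2: group-level averages across respondents.
factors = per_student[0].keys()
group_factor_averages = {f: mean(s[f] for s in per_student) for f in factors}
group_overall = mean(overall_average(s) for s in students)

for factor, avg in group_factor_averages.items():
    print(f"{factor}: {avg:.2f}")
print(f"Overall student average: {group_overall:.2f}")
```

The sketch mirrors the two-step calculation described above: averages are computed for each student first (by factor and overall), and those per-student averages are then averaged across the group of respondents.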
In addition to analyzing and summarizing data from the FLBS, I also analyzed student responses to the two open-ended questions. I read through the responses multiple times and identified themes across them. Two themes emerged for each question; the most frequently reported themes are listed first.

Open-ended question: Explain what kind of feedback you have received from your professors during your college experience. Did the feedback help you learn?
Emergent themes:
- Written feedback from instructors on writing assignments was helpful for improving re-writes or the next writing submission.
- Feedback usually helped, but students need more feedback, need details on why work was good or poor and how to improve, and need feedback earlier.

Open-ended question: What could your professors do to improve the feedback they give you?
Emergent themes:
- Clear and specific feedback is needed.
- Vague grading criteria make it difficult to use feedback.

Themes related to these open-ended questions emphasized what instructors can do to improve feedback practices. As I analyzed student responses to extract these themes, I thought about additional questions I could have asked, such as, “How have you advocated for yourself in seeking meaningful feedback in college courses?” and “Have you received feedback from your peers that has helped you learn/improve? Explain.”

Conclusion

To summarize, I found the results of the FLBS and these open-ended questions interesting and informative. On the FLBS, my students reported that they often make sense of feedback and consistently manage affect. However, there was some evidence that they need guidance in seeking feedback and putting it to good use. I was surprised that all of the subcategory averages were above 4.5. One critique of the FLBS is that it is a self-report instrument, which I suspect may lead to inflated ratings and averages. Dawson et al. (2023) recognize the potential limits on accuracy when using the FLBS and recommend mixed-method approaches to studying its use. After administering the FLBS, I also came to think there may be value in simply having students complete it, because doing so makes them aware of feedback literacy. Data on students’ perceptions of feedback were also helpful for considering their past experiences and perspectives. The insights I gained from analyzing these data were limited, and a more thorough investigation would shed more light on students’ beliefs and perceptions in this regard.

What findings would you predict if your students completed the FLBS? How might those results affect the feedback practices you foster in your courses?

The takeaways I gained from assessing students’ feedback literacy have influenced the way I plan for teaching and learning in my calculus course. In the next blog post, I will share some of the ways I have revised my feedback practices, with an eye to how these changes are affecting feedback literacy and student learning this semester.

References

Carless, David, and David Boud. 2018. “The Development of Student Feedback Literacy: Enabling Uptake of Feedback.” Assessment & Evaluation in Higher Education 43 (8): 1315-1325. https://doi.org/10.1080/02602938.2018.1463354.

Clark, David, and Robert Talbert. 2023. Grading for Growth: A Guide to Alternative Grading Practices That Promote Authentic Learning and Student Engagement in Higher Education. New York: Stylus Publishing, LLC. https://doi.org/10.4324/9781003445043.

Dawson, Phillip, Zi Yan, Anastasiya Lipnevich, Joanna Tai, David Boud, and Paige Mahoney. 2023. “Measuring What Learners Do in Feedback: The Feedback Literacy Behaviour Scale.” Assessment & Evaluation in Higher Education. https://doi.org/10.1080/02602938.2023.2240983.

Driscoll, Amy, Swarup Wood, Dan Shapiro, and Nelson Graff. 2021. Advancing Assessment for Student Success: Supporting Learning by Creating Connections Across Assessment, Teaching, Curriculum, and Cocurriculum in Collaboration with Our Colleagues and Our Students. Routledge. https://doi.org/10.4324/9781003442899.

About the Author

Aaron Trocki is an Associate Professor of Mathematics at Elon University. He is the CEL Scholar for 2023–2025 and is focusing on models of assessment and feedback outside of traditional grading assumptions and approaches.

How to Cite this Post

Trocki, Aaron. 2025. “Assessing Student Feedback Literacy.” Center for Engaged Learning (blog), Elon University. March 11, 2025. https://www.centerforengagedlearning.org/assessing-student-feedback-literacy.