Increasing Student Response Rates to SLEs (end-of-term surveys)
Why should I (as a course instructor) care about response rates for my SLEs?
In general, the higher the response rate, the more valid and reliable the information. Response rate matters when interpreting scores for several reasons. Lower rates give outlier responses more influence and can therefore bias the results more strongly. A low response rate may also represent a biased, non-probabilistic sample, so the results may not represent the class well and will carry a larger margin of error. In general, the larger the class size, the lower the response rate needed for statistical validity (Zumrawi et al., 2014); see the illustrative calculation after the list below. Having more responses from students about their experiences in your courses has many benefits. These include:
- acquiring a more accurate picture of the diverse student experiences in your course(s)
- diluting the effect of extreme responses
- helping us refine the SLE questions and determine whether this tool is right for OSU
- helping faculty develop high-quality teaching and improve student learning
- helping administrators evaluate teaching for promotion and tenure
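To see why larger classes can tolerate lower response rates, consider a standard margin-of-error calculation with a finite population correction. This is only a back-of-the-envelope illustration (not necessarily the exact model used by Zumrawi et al., 2014), where n is the number of respondents, N is the class size, p is a response proportion (worst case p = 0.5), and z is the critical value for the desired confidence level:

\[
\mathrm{MOE} \;\approx\; z\,\sqrt{\frac{p(1-p)}{n}} \times \sqrt{\frac{N-n}{N-1}}
\]

For a fixed target margin of error, the required number of respondents n grows much more slowly than N, so the required response rate n/N falls as the class gets larger. For example, at 95% confidence (z ≈ 1.96) with p = 0.5 and a ±10 percentage point margin of error, a class of 20 needs roughly 17 respondents (about 85%), while a class of 200 needs only about 65 (roughly 33%).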
Why don’t students complete the surveys?
Research has identified many reasons that students fail to complete such surveys, including:
- apathy
- technical problems
- perceived lack of anonymity
- lack of importance
- inconvenience
- inaccessibility
- lack of time for completion
(Adams & Umbach, 2012; Avery, Bryan, Mathios, Kang, & Bell, 2006; Ballantyne, 2002, 2003; Dommeyer, Baum, & Hanna, 2002; Sorenson & Reiner, 2003)
What can I (as a course instructor) do to increase response rates for my SLEs?
Suggestions to increase response rates that have worked well for faculty at OSU and other comparably large institutions include (Chapman & Joines, 2017; Young et al., 2019):
- Create a class climate that reflects mutual respect between instructor(s) and students.
  - You can do this by sharing with students that you value their perspectives, how you will use their feedback to refine the course, or that future OSU students will benefit from their feedback.
- Communicate to your students the importance of their feedback and how you will use it to modify your course(s).
  - You can do this by making announcements during synchronous meetings, sending emails to your students, or posting announcements via Canvas. It might be helpful to share how previously collected student feedback has shaped your course or teaching and to convey how you plan to use their feedback. Here is some sample language (see additional language at the bottom of the page to copy and paste):
    - "Your feedback is very important to me because I/the teaching team value your opinion and want to create the best possible course."
    - "I respect your perspective and want to hear from you about our course."
    - "I've shortened some readings based on previously shared comments."
    - "I/the teaching team will use your feedback to identify areas of the course and my teaching that I can adapt and enhance in future terms."
    - "Your feedback will be used to improve student learning in the future (you benefit from others who have provided feedback)."
- Collect and respond to student feedback earlier in the term. Collecting feedback at midterm gives you time to respond to students' input, which reinforces the value of their voice and confirms your commitment not only to listen to but also to act on their input.
  - It is extremely important to share some of the results with your students.
  - Share what you heard, as well as actions you are not able to take and why (they may be outside your control).
  - Share which actions you will take immediately versus those that will take time, and why.
  - It might be helpful to ask a colleague to collect this information or to connect with the Center for Teaching and Learning for assistance.
- Remind students how to complete SLEs:
  - SLEs are available at https://beav.es/Student-Learning-Survey from Wednesday of Week 9 through the Sunday before final exams. Post this link to your Canvas site in an announcement or on your course landing page (see language at the bottom of the page to copy and paste).
  - Remind them that their responses are anonymous and that you will not see their responses until after final grades have been submitted.
- Make time during your course for your students to complete the survey:
  - Allow 10 minutes during a synchronous class session for students to complete the survey on their electronic devices.
  - Add a no-credit "assignment" or event to the Canvas calendar to remind students to complete the SLEs, and put a link to the survey in the assignment or event description.
- Monitor response rates by logging into https://beav.es/Student-Learning-Survey and hovering over the course name to see the current response rate.
  - Provide reminders to students as response rates change.
- We do not recommend using grade-altering incentives to increase response rates. Some considerations:
  - The use of incentives for completing SLEs may be perceived by students as coercion, suggesting that they will be negatively affected if they do not complete the survey.
  - Offering course credit (or extra credit) for responding to SLEs may bias student responses.
  - Studies have shown that response rates for faculty offering incentives often do not differ from response rates for those using the methods suggested above.
  - If credit is offered for completing SLEs, it is appropriate to provide students an alternate way to earn the same credit without completing the SLE. Check with your department head before offering incentives.
What can I (as an administrator) do to increase response rates for my faculty?
There is an abundance of research on all aspects of student evaluation of teaching effectiveness. The consensus in the literature is that while student evaluations are the most common evaluation strategy, by themselves they are not sufficient to provide a complete evaluation of teaching (see Boysen, 2016, for recommendations). A collection of course materials provides an efficient way to document teaching effectiveness as part of the dossier (Bernstein et al., 2006; Richmond et al., 2022). As stated in the OSU faculty handbook and featured in the Faculty Senate-approved Quality Teaching Framework, self-reflections, peer observations, and other evidence of teaching rigor are key parts of a teaching dossier.

Students, however, are in a unique position to make observations and are an appropriate source of information when judging student-instructor relationships, the organization of the course, the instructor's professional and ethical behavior, their workload, what they have learned in the course, the fairness of grading, and the instructor's ability to communicate. They are not good sources for judging the relevance and recency of course content or the knowledge and scholarship of the instructor (see https://www.vpfa.psu.edu/files/2016/09/srte_statement-248pj9j.pdf).
Administrators interested in raising response rates should:
- Reassure faculty that SLE numbers alone will not determine employment decisions and that the unit will consider other indicators of teaching effectiveness, including:
  - Regular peer review of both in-person teaching and course materials (easily collated in a dossier or portfolio)
  - Faculty self-reflection on teaching effectiveness
- Reassure faculty that SLE numbers will be examined in a context that includes:
  - Level of course (upper-division courses typically have higher SLE scores)
  - Major vs. non-major courses (major courses typically have higher SLE scores)
  - Response rates (see above)
  - External factors such as the pandemic and personal issues
- Create and offer a paper-and-pencil option, or ensure access to technology for in-class administration of the SLE. In general, response rates for online administrations have been consistently lower (Anderson, Cain, & Bird, 2005; Benton, Webster, Gross, & Pallett, 2010; Heath, Lawyer, & Rasmussen, 2007).
- Specify the purpose(s) of the ratings, such as teaching improvement, salary, promotion, and tenure (Berk, 2006).
- Demonstrate the value and importance of teaching in the unit by:
  - Discussing issues of teaching at unit meetings
  - Acknowledging teaching accomplishments and strategies for success
  - Inviting Center for Teaching and Learning staff to share core resources
- Begin a campaign. Communicate the value and importance of students completing the SLE (Berk, 2006; Johnson, 2003; Sorenson & Reiner, 2003) by using all available communication channels:
  - Post ad campaigns on bulletin boards and in student publications (The IDEA Center, 2008).
  - Post QR codes that link to information about course evaluations and your support of them.
  - Add course evaluation information to unit web pages.
  - Provide faculty with a Canvas page that discusses the importance of SLEs, and ask them to include the page in their courses and on their syllabi. This page should also include directions for accessing and completing the survey (Dommeyer et al., 2004; Johnson, 2003; Norris & Conn, 2005).
  - Provide announcements for instructors to distribute in the classroom or by email.
  - Share the evaluation schedule and remind students when the evaluations are open.
It is important to note that no single strategy can address all of the reasons students choose not to complete end-of-term surveys, and no particular combination of strategies has emerged as a best practice. Whatever combination is adopted, it should be clearly communicated and have the commitment of all stakeholders in order to meaningfully increase student participation.
Language for instructors to share with students:
- In-person announcement: "This year, OSU is launching our new Student Learning Experience Survey (SLE) to ask about your perceptions and observations of our course. Your feedback is very important to me because [I/the teaching team value your opinion and want to create the best possible course - OR - I respect your perspective and want to hear from you about our course - OR - something similar that conveys mutual respect between instructor and students]. I/the teaching team will use your feedback to identify areas of the course and my teaching that I can adapt and enhance in future terms. The SLE is available for you to complete online now until Sunday, March 13, through MyOregonstate. I know this is a busy time of year, but your feedback is appreciated and important."
- Email or Canvas announcement: "Dear Students, This year, OSU is launching our new Student Learning Experience Survey (SLE) to ask about your perceptions and observations of our course. The new SLE questions are meant to better reflect OSU's values and Quality Teaching Framework (for more information about these changes, see here). Your feedback is very important to me because [I/the teaching team value your opinion and want to create the best possible course - OR - I respect your perspective and want to hear from you about our course - OR - something similar that conveys mutual respect between instructor and students]. I/the teaching team will use your feedback to identify areas of the course and my teaching that I can adapt and enhance in future terms. The SLE is available for you to complete online now until Sunday, March 13, through my.oregonstate.edu. I know this is a busy time of year, but your feedback is appreciated and important. To access the SLE, please use this link: https://beav.es/Student-Learning-Survey"
Citations:
Adams, M. J. D., & Umbach, P. D. (2012). Nonresponse and online student evaluations of teaching: Understanding the influence of salience, fatigue, and academic environments. Research in Higher Education, 53(5), 576–591. https://doi.org/10.1007/s11162-011-9240-5
Anderson, H. M., Cain, J., & Bird, E. (2005). Online student course evaluations: Review of literature and a pilot study. American Journal of Pharmaceutical Education, 69(1), Article 5, 34–43.
Avery, R. J., Bryan, W. K., Mathios, A., Kang, H., & Bell, D. (2006). Electronic course evaluations: Does an online delivery system influence student evaluations? Journal of Economic Education, 37(1), 21–37. https://doi.org/10.3200/JECE.37.1.21-37
Ballantyne, C. (2002, November). Why survey online? A practical look at issues in the use of the internet for surveys in higher education. Paper presented at the annual meeting of the American Evaluation Association, Honolulu.
Ballantyne, C. (2003). Online evaluations of teaching: An examination of current practice and considerations for the future. In D. L. Sorenson & T. D. Johnson (Eds.), Online student ratings of instruction (New Directions for Teaching and Learning, No. 96, pp. 103–112). San Francisco: Jossey-Bass.
Benton, S. L., Webster, R., Gross, A., & Pallett, W. (2010). An analysis of IDEA Student Ratings of Instruction using paper versus online survey methods (IDEA Technical Report No. 16). Manhattan, KS: The IDEA Center.
Berk, R. A. (2006). Thirteen strategies to measure college teaching: A consumer’s guide to rating scale construction, assessment, and decision making for faculty, administrators, and clinicians. Sterling, VA: Stylus.
Berk, R. A. (2012). Top 20 strategies to increase the online response rates of student rating scales. International Journal of Technology in Teaching and Learning, 8(2), 98-107.
Bernstein, D., Burnett, A., Goodburn, A., & Savory, P. (2006). Making Teaching and Learning Visible: Course Portfolios and the Peer Review of Teaching. Industrial and Management Systems Engineering Faculty Publications. https://digitalcommons.unl.edu/imsefacpub/64
Boysen, G. A. (2016). Using student evaluations to improve teaching: Evidence-based recommendations. Scholarship of Teaching and Learning in Psychology, 2(4), 273–284. https://doi.org/10.1037/stl0000069
Carpenter, S. K., Witherby, A. E., & Tauber, S. K. (2020). On students’ (mis)judgments of learning and teaching effectiveness. Journal of Applied Research in Memory and Cognition, 9(2), 137–151. https://doi.org/10.1016/j.jarmac.2019.12.009
Chapman, D. D., & Joines, J. A. (2017). Strategies for Increasing Response Rates for Online End-of-Course Evaluations. International Journal of Teaching & Learning in Higher Education, 29(1), 47–60.
Dommeyer, C. J., Baum, P., Hanna, R. W., & Chapman, K. S. (2004). Gathering faculty teaching evaluations by in-class and online surveys: Their effects on response rates and evaluations. Assessment & Evaluation in Higher Education, 29(5), 611–623. https://doi.org/10.1080/02602930410001689171
Johnson, T. D. (2003). Online student ratings: Will students respond? In D. L. Sorenson & T. D. Johnson (Eds.), Online student ratings of instruction (New Directions for Teaching and Learning, No. 96, pp. 49–60). San Francisco: Jossey-Bass.
Norris, J., & Conn, C. (2005). Investigating strategies for increasing student response rates to online delivered course evaluations. Quarterly Review of Distance Education, 6(1), 13–29.
Richmond, A. S., Boysen, G. A., & Gurung, R. A. R. (2022). An Evidence-Based Guide to College and University Teaching: Developing the Model Teacher (2nd ed.). Routledge. https://doi.org/10.4324/9781003119562
Sorenson, D. L., & Reiner, C. (2003). Charting the uncharted seas of online student ratings of instruction. In D. L. Sorenson & T. D. Johnson (Eds.), Online student ratings of instruction (New Directions for Teaching and Learning, No. 96, pp. 1–29). San Francisco: Jossey-Bass.
The IDEA Center (2008). Facilitating response rates in IDEA online. Manhattan, KS: The IDEA Center. Retrieved March 5, 2012, from http://www.theideacenter.org/OnlineResponseRates
Young, K., Joines, J., Standish, T., & Gallagher, V. (2019). Student evaluations of teaching: The impact of faculty procedures on response rates. Assessment & Evaluation in Higher Education, 44(1), 37–49. https://doi.org/10.1080/02602938.2018.1467878
Zumrawi, A. A., Bates, S. P., & Schroeder, M. (2014). What response rates are needed to make reliable inferences from student evaluations of teaching? Educational Research and Evaluation, 20(7–8), 557–563. https://doi.org/10.1080/13803611.2014.997915