Posted by: Meredith Seaman, Senior Advisor, Learning and Teaching, College of Design and Social Context, RMIT University.
We talk quite a lot about student feedback here on the blog, but the specific aspect I want to consider here is appropriate, fair and reliable ways to administer institutional surveys on good teaching to students. In particular, I want to examine what could be done to increase survey responses so the results are more meaningful. The challenge of getting students to participate has become more significant since surveys moved online, despite the obvious 'anywhere/anytime' benefits for students completing the survey instruments, and the associated efficiencies for the institution.
With low response rates, the exercise can be a waste of everyone's time: a certain sample size is needed for the results to be viable and useful.
Investigations from the period of transition when many surveys moved online in the United States emphasise the following:
- … “an important factor in response rates is students’ belief that rating results are used for important decisions about courses and faculty.” (Ballantyne, 1999)
- Institutions should “encourage instructors to show a personal interest in students completing the forms (e.g., instructors could mention the forms in class, let students know that they pay attention to student responses, or send personal emails to students reminding them to complete the forms)” (Johnson, 2002)
While I’d definitely think twice about sending a personal email to your students, the research is consistent that it is important for teachers to set up the context for students to complete the survey. Increasing response rates requires strategies at a range of levels, but there is consensus that teachers have a role.
Given that the challenge is bigger than an institution sending out disembodied emails or offering iPads, how might we make it a meaningful process, a better experience for our students, and get more reliable data?
Pulling together different ideas from Schools across our College, it is clear that to do this well we need to show we are listening. The in-class process might go something like this:
Sowing the seed
Before survey time, sow the seed early on about the importance of students’ ‘feedback’. You could highlight earlier in the course that such surveys inform a range of activities and decisions, from university management down to the classroom. You could also highlight a couple of specific changes you have made in response to past survey responses, Student Consultative Committee discussions and so on.
Spending time with your students unpacking the notion of ‘feedback’ more generally is another idea. You might want to emphasise the different types of feedback students get in your course: from peers, on learning activities, on assessment tasks and so on. You could also build on work you are doing with students to develop their skills in providing constructive feedback, such as giving peer review on learning activities and assessment tasks.
Summarise the Course
At survey time you could summarise the course to date. For example: in week one we…, in week 3 we found such and such a concept difficult…, in week 6 we…, and finally, reiterate where you’re heading in the remainder of the semester. Helen’s recent post on ‘Going with the flow’ provides a model for this kind of activity. You might want to highlight how you have listened to your students (current and previous) and adjusted your learning and teaching plan.
Set aside time in class (or in an online space)
Delivering the survey in class time might be tricky, but in terms of getting response rates up it is well worth the time investment. Allowing class time shows generosity, and that in itself emphasises that you take the process seriously and are listening. It has been reported that 84% of Australians now have a smartphone, and 71% of those also have a tablet (Horizon Report 2014), and the surveys can be completed online on these devices. If these stats seem inflated for your cohort, you could allow time for students to go to the library, or build the survey into an existing lab class.
Leave the Room
It’s important that, after all that, you leave the room. It highlights that the process is fair, provides thinking time, and creates a space for students’ comments to come to the fore.
At RMIT City Campus the Course Experience Survey (CES) survey is open to students from the 20th of September.
Ballantyne, C. (1999). Improving university teaching: Responding to feedback from students. In Zepke, N., Knight, M., Leach, L. and Viskovic, A. (Eds), Adult Learning Cultures: Challenges and Choices in Times of Change, WP Press, Wellington, pp. 55-165.
Johnson, T. (2002). Online student ratings: Will students respond? Paper presented at the annual conference of the American Educational Research Association, New Orleans, 2002.
Share your thoughts on strategies to increase survey response rates in the comments section!