As the semester draws to a close, faculties are most likely preparing to survey their students through Student Evaluation Surveys (SES, sometimes called Course Evaluation Surveys, CES). Interestingly for our times, this is now done online. Surveying our cohorts is completed in ritualistic fashion, and the results often serve as a proxy for decision-making on teaching and learning quality, as valuable data on student opinion and perspective, and as an evaluation of the online and offline offering.
Many educators have been sceptical of this surveying approach due to issues of validity and reliability: what attitudes are these surveys looking to capture? How are they getting a representative sample of students? Are they simply capturing polarised opinions? And, most importantly, how is this data being used to evaluate teaching performance? This may be of critical importance to sessional or adjunct staff, who are wedded to the results of these surveys for continued employment, and for whom they serve as a vague metric of performance.
Well, we’re here to remind you of two pieces of important research that challenge the assumption that these surveys are relevant and robust!
A 2019 UK study of Russell Group institutions showed that sample size matters (Holland, 2019): any SES with fewer than 20 respondents should be treated with caution (rather than relying on a “proportion of the class” measure of reliability). Larger classes are biased towards lower scores on average, as larger cohorts are less likely to agree. However, they do provide richer insight simply by virtue of their size.
It is also important not to dismiss patterns when summarising SES results, as this study suggests that the variability of responses is what matters. It is best to consider individual cohort differences, such as whether the module is an elective, students’ prior discipline of study, and gender (of both learner and teacher). These individual qualifiers, when accounted for and used as filters, reveal substantial variation in the insights we can glean from our SES datasets.
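To make these two points concrete, here is a minimal sketch (in Python, with entirely invented scores and cohort labels; only the 20-respondent threshold and the emphasis on per-group variability are drawn from Holland, 2019) of how one might flag under-sized cohorts and compare variability across qualifiers before drawing conclusions from SES data:

```python
from statistics import mean, stdev

# Hypothetical 1-5 Likert-scale SES responses, keyed by cohort qualifiers.
# All names and scores below are illustrative, not real survey data.
cohorts = {
    ("Elective", "Large"): [4, 2, 5, 3, 1, 4, 2, 5, 3, 2, 4,
                            1, 5, 3, 2, 4, 3, 5, 2, 1, 4, 3],
    ("Core", "Small"): [4, 5, 4, 5, 4, 4, 5, 4],
}

MIN_RESPONDENTS = 20  # reliability threshold suggested by Holland (2019)

for (module_type, size), scores in cohorts.items():
    n = len(scores)
    # Report the spread (standard deviation) alongside the mean, rather
    # than the mean alone, and flag any cohort below the threshold.
    print(f"{module_type}/{size}: n={n}, "
          f"mean={mean(scores):.2f}, sd={stdev(scores):.2f}, "
          f"reliable={'yes' if n >= MIN_RESPONDENTS else 'no'}")
```

The point of the sketch is simply that a large, disagreeing cohort can drag the mean down while carrying most of its information in the spread, whereas a small cohort may look glowing but fall below a defensible sample size.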
The second is more widespread than we realise: attempting to influence students’ impressions of their subject and its teaching quality through gifts (Hessler et al., 2018). Namely: cookies! As expected, the cohort given cookies at the time of completing their SES rated the course much more positively than a control group. Furthermore, evaluations of the teaching were also higher! Anecdotally, this may come as no surprise to those academics inclined to bring sweets to their final class, along with a reminder for students to complete their feedback immediately.
In psychological terms, we call this recency bias: students have a more favourable perception or memory of the class because of a more enjoyable recent experience. It is similar to the effect of a delicious final dessert in a degustation menu, ensuring the diner goes home happy and raving to friends and family.
Of course, the biggest issue with these surveys is that the outcomes and changes resulting from such feedback really only affect future cohorts. Many students feel disenfranchised by how late in the term they are being surveyed.
Educators everywhere should remind students of the value of the feedback they provide in improving learning and teaching, but should also make the effort, at the beginning of the semester, to explain how the course has evolved from past versions. We have a duty to show dynamism and responsiveness to our learners. As teachers, we do this to improve our practice transparently. As researchers, we can show that it is also informed by a reliable evidence base.
Hessler, M., Pöpping, D. M., Hollstein, H., Ohlenburg, H., Arnemann, P. H., Massoth, C., … & Wenk, M. (2018). Availability of cookies during an academic course session affects evaluation of teaching. Medical Education, 52(10), 1064-1072.
Holland, E. P. (2019). Making sense of module feedback: accounting for individual behaviours in student evaluations of teaching. Assessment & Evaluation in Higher Education, 44(6), 961-972.
Learning Experience Lead, Tom Whitford, reminds us why student evaluation survey results shouldn’t be taken at face value. This article highlights key variables which can distort student feedback findings and results, one being supplying sweet treats during the survey! Have a read to find out more.