Why Student Evaluation is Broken: Using Cookies and Science

Posted by Tom Whitford, Learning Experience Lead

As we near the end of the semester, faculties are likely preparing to survey their students through Student Evaluation Surveys (SES, sometimes called Course Evaluation Surveys, CES). Interestingly for our times, this is now done online. Surveying our cohorts is completed in ritualistic fashion, and the results are often used as a proxy for decision-making on teaching and learning quality, as valuable data on student opinion and perspective, and as an evaluation of the online and offline offering.

Many educators are sceptical of this surveying approach due to issues of validity and reliability: what attitudes are these surveys looking to capture? How do they obtain a representative sample of students? Are they merely capturing polarised opinions? And, most importantly, how is this data being used to evaluate teaching performance? These questions are especially critical for sessional and adjunct staff, whose continuing employment is often wedded to the results of these surveys, which also serve as a vague metric of their performance.

Well, we’re here to remind you of two bits of important science that truly dispel the myth that these surveys are relevant or robust!

The first: a 2019 UK study of Russell Group institutions showed that sample size matters (Holland, 2019). Any SES with fewer than 20 respondents should be reconsidered, an absolute threshold rather than the usual “proportion of the class” measure of reliability. Larger classes are biased towards lower overall scores on average, since larger cohorts are less likely to agree with one another, though they do provide richer insight simply by virtue of their size.
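
As a rough illustration only, here is a minimal Python sketch of how an analyst might screen SES summaries using an absolute respondent threshold in the spirit of this finding; the dataset, module codes, and column names are hypothetical, not drawn from the study.

import pandas as pd

# Hypothetical SES summary: one row per module offering.
ses = pd.DataFrame({
    "module": ["ECON101", "PHIL305", "BIO220"],
    "respondents": [142, 12, 48],
    "mean_score": [3.9, 4.6, 4.2],
})

# An absolute respondent threshold (after Holland, 2019), rather than
# a "proportion of the class" response-rate rule.
MIN_RESPONDENTS = 20

# Flag results too small to treat as reliable.
ses["reliable"] = ses["respondents"] >= MIN_RESPONDENTS
print(ses)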

It is also important not to dismiss patterns when summarising SES results: this study suggests that the variability of responses is what matters. It is best to consider individual cohort differences, such as whether the module is an elective, students’ prior discipline of study, and gender (of both learner and teacher). Where these individual qualifiers are recorded and can be filtered on, they reveal large variances in the insights we can glean from our SES datasets.
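
In the same hypothetical vein, a short sketch of how response-level results might be broken down by one such qualifier, making visible the variability that a single overall mean would flatten out:

import pandas as pd

# Hypothetical response-level SES data for one module, with a single
# cohort qualifier (whether the student took the module as an elective).
responses = pd.DataFrame({
    "elective": [True, True, False, False, False, True],
    "score": [5, 4, 2, 3, 3, 5],
})

# Mean, spread, and respondent count per group: the differences between
# groups are exactly what an overall average would hide.
summary = responses.groupby("elective")["score"].agg(["mean", "std", "count"])
print(summary)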

The second is more widespread than we realise: attempting to influence students’ perceptions of their subject and its teaching quality through gifts (Hessler et al., 2018). Namely: cookies! As expected, the cohort given cookies at the time of completing their SES rated the course much more positively than a control group. Furthermore, the evaluation of teaching was higher too! Anecdotally, this may be no surprise to the academics inclined to bring sweets to their final class, along with a reminder for students to complete their feedback immediately.

In psychological terms, we call this recency bias: students hold a more favourable perception or memory of the class because of an enjoyable recent experience. It is the same effect as serving a delicious final dessert in a degustation menu, ensuring the diner goes home happy and raving to friends and family.

Of course, the biggest issue with these surveys is that the outcomes and changes resulting from such feedback really only impact future cohorts. Many students feel disenfranchised by how late in the term they are being surveyed.

Educators everywhere should remind students of the value of the feedback they provide to improve learning and teaching, but should also make the effort, at the beginning of semester, to explain how the course has evolved from past versions. We have a duty to show dynamism and responsiveness to our learners. As teachers, this allows us to improve our practice transparently. As researchers, we can show that this improvement is informed by a reliable evidence base.

Works cited 

Hessler, M., Pöpping, D. M., Hollstein, H., Ohlenburg, H., Arnemann, P. H., Massoth, C., … & Wenk, M. (2018). Availability of cookies during an academic course session affects evaluation of teaching. Medical Education, 52(10), 1064-1072.

Holland, E. P. (2019). Making sense of module feedback: accounting for individual behaviours in student evaluations of teaching. Assessment & Evaluation in Higher Education, 44(6), 961-972.

