If governments and educational institutions are serious about differentiation, market segmentation and strategic marketing, they should be wary of headline-grabbing boasts driven by shallow or questionable research. The practice has become increasingly common in recent years, but it is unlikely to lead to the type of self-analysis and improvement that builds competitive advantage. There are plenty of examples, but a small sample relating to international students, drawn from the four main recruiting countries, is sufficient to show the problem.
A recent PIE article trumpeted the ‘overwhelming satisfaction’ that international students have with their experience in US higher education. The World Education Services (WES) survey, ‘Are US HEIs meeting the needs of international students?’, asserts that 91% of respondents are ‘overwhelmingly satisfied with their experience studying in the U.S.’ Yet the report itself makes the point that the survey findings ‘may not be generalizable’ to the US international student population and may suffer from ‘self-selection and sample biases’. There is certainly plenty to question about how representative a sample of 1,921 self-selecting students can be.
This outcome has similarities to the recent UUKi Graduate Outcome Survey 2019, which carried the line that over 90% of graduates who studied in the UK were ‘satisfied or very satisfied with all aspects of their lives’. As noted in a previous blog, the UUKi survey is flawed for reasons that raise equally uncomfortable questions about the ways universities engage with alumni. Only 6% of total respondents were from China even though, as a footnote confirms, “in the year 17-18, Chinese students made up 33% of the total non-EU student population…”.
Looking further afield, Canada’s 2018 CBIE Survey indicated that ‘93% of students stated that they are either satisfied…or very satisfied’ with their experience. The sample size of 14,228 is noted as 4% of total post-secondary students in Canada, so still a relatively small group. And only 45% of respondents came from south and east Asia, even though at least 70% of Canada’s international students come from those regions.
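To see why the mix of respondents matters, here is a minimal back-of-envelope sketch in Python. Only the 45% sample share and 70% population share come from the CBIE figures above; the within-group satisfaction rates are entirely hypothetical, chosen just to illustrate how reweighting a skewed sample toward the real enrolment mix can pull a headline figure down.

```python
# Illustrative sketch: how an unrepresentative sample can flatter a headline
# satisfaction figure. Only the 45% (sample) vs 70% (population) shares come
# from the CBIE example above; the within-group satisfaction rates are assumed.

sample_share = {"south_east_asia": 0.45, "other": 0.55}      # survey respondents
population_share = {"south_east_asia": 0.70, "other": 0.30}  # actual enrolments

# Hypothetical within-group satisfaction rates, for illustration only.
satisfied = {"south_east_asia": 0.88, "other": 0.97}

headline = sum(sample_share[g] * satisfied[g] for g in satisfied)
reweighted = sum(population_share[g] * satisfied[g] for g in satisfied)

print(f"Headline (sample-weighted):  {headline:.1%}")    # roughly 93%
print(f"Reweighted to enrolment mix: {reweighted:.1%}")  # roughly 91%
```

The point is not the specific numbers but the direction: if the under-sampled group is even slightly less satisfied, the published headline overstates the true picture.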
Of the big four recruiting countries, Australia’s DET 2018 International Student Survey saw an impressive 27% response rate from international students. The outcome was that 89% ‘were satisfied or very satisfied with their living and learning experience in Australia’. Regrettably, there is no public access to the underlying data to determine how representative the sample is of the international student population.
In its publicity material, however, Australia makes claims about its performance in comparison to others across the world. What is peculiar about these claims is that the margins are wafer thin, with, for example, ‘satisfaction with learning’ shown as Australia 88.5% versus other countries 87.5%. The comparison source is given as *International Student Barometer (incorporating scores from hosting countries including USA, Canada, UK and New Zealand). The obvious question is: who else does it include?
Readers who are concentrating will have noticed an interesting echo across all of the results: 91% (WES), over 90% (UUKi), 93% (CBIE) and 89% (DET). This suggests these surveys may be self-selecting by nature, with the international students who take part simply more likely to be satisfied. Those who found the experience less helpful may just want to get on with their lives after a poor experience that has left them struggling to find graduate-level employment.
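A similarly minimal sketch, again with entirely hypothetical numbers, shows how self-selection alone can produce that ~90% echo: if satisfied students are simply three times more willing to answer a survey, a true satisfaction rate of 75% shows up as 90% among respondents.

```python
# Illustrative sketch of self-selection bias; every number here is hypothetical.
# If satisfied students are simply more willing to respond, the observed rate
# can land near the 90% echo even when true satisfaction is far lower.

true_satisfied = 0.75            # assumed true share of satisfied students
response_if_satisfied = 0.30     # assumed response rate among the satisfied
response_if_unsatisfied = 0.10   # assumed response rate among everyone else

respondents_satisfied = true_satisfied * response_if_satisfied            # 0.225
respondents_unsatisfied = (1 - true_satisfied) * response_if_unsatisfied  # 0.025

observed = respondents_satisfied / (respondents_satisfied + respondents_unsatisfied)
print(f"Satisfaction among survey respondents: {observed:.0%}")  # 90%
```

No one publishing a headline figure can rule this effect out without reporting response rates and comparing respondents to the underlying population.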
It’s a reminder of the minor marketing furore in the UK over a well-known advertising slogan for the cat food Whiskas: “eight out of ten owners said their cat prefers it”. After a complaint to the Advertising Standards Authority, this was changed to “eight out of ten owners who expressed a preference said their cat prefers it”. Perhaps student surveys should come with similar, upfront cautions about their relevance, authority and comprehensiveness.
Another manifestation of the problem is the tendency to cherry-pick data, draw misleading comparisons or ignore longer-term trends in the pursuit of self-congratulatory platitudes. This can happen with student surveys or enrollment counts, but it is all part of the same bland, reassuring spin.
Nicola Dandridge, chief executive of the Office for Students, welcomed the UK’s 2019 National Student Survey with the words: “It is good news that overall satisfaction with higher education courses remains high this year.” The full statement recorded that satisfaction had risen to 84% from 83% the year before. There was no mention of the fact that in 2014 and 2015 the satisfaction rate was 86%, so on that measure fewer students are satisfied than five years ago.
Some might argue that when the survey first came out in 2005 the overall satisfaction rate was only 81.3%, so there has been an improvement over the longer timescale. A reasonable response is that universities are full of academics who are good at passing tests and that the survey has been ‘gamed’, making improvement inevitable. Institutions quickly worked out how to optimize response rates and manage academic behavior in ways that improved their rankings.
For those interested in more reading on the NSS, The Economics Network has produced a thorough analysis of results across a number of dimensions, subjects and sector groups. As an example, the Russell Group of universities has, since 2015, seen a precipitous fall in positive responses to the statement, ‘Assessment arrangements and marking have been fair’. It fares no better on overall satisfaction, with scores falling from 86.5% in 2010 to just over 81% in 2019.
Finally, as an example of how enrollment data can be interpreted in ways that obscure more worrying trends, there is the official reaction to the latest Open Doors press release. As mentioned in several blogs, most recently in December, it’s difficult to greet the headline that ‘Number of International Students in the United States Hits All-Time High’ with anything more than a sigh. Including OPT students, who are doing post-study work and not directly contributing to universities either financially or academically, seems an almost deliberate attempt to draw attention away from two years of decline in enrollments.
It is reasonable to believe that the surveys mentioned and the individuals quoted are well intentioned, but the best organizations are obsessed with using research to find out what can be improved, and they understand the difference between PR ‘puff’ and game-changing insights. Higher education decision makers need to be more demanding of student surveys and focus their thinking on students who are unhappy or who are not trying the product at all. They might also care to look harder at whether graduates find their degree has genuinely opened up better options, or whether they are benefiting from the ‘aftercare’ service implicit in alumni relations promises.
Colleagues elsewhere in the sector have also commented extensively on the ‘survey fatigue’ that is reducing response rates and undermining credibility. But there should be equal concern about the self-interest of organizations that accept the status quo even when it is manifestly inadequate. Far better to follow the line that helped make Bill Gates one of the richest people in the world: “Your most unhappy customers are your greatest source of learning.”
Image by Florian Bollmann from Pixabay