Showing posts with label student satisfaction.

Thursday, 9 June 2016

Counting the cost

HEPI published today the outcomes of the annual Student and Academic Experience Survey, which they conduct jointly with the Higher Education Academy. It’s a really interesting survey, which allows comparison with previous years while also adding content to address current policy issues. You can read the survey here.

One issue which features prominently in the survey is student perceptions of value for money. Most students, it seems, don't think they're getting value. But my experience is that the value of a university education stays with you for a long time – it isn't a good which you consume, it's an investment which endures. We'd get more relevant data if we asked these students the same question in 40 years.

It takes a lot of these to make a building

All well and good, or not. But there's another aspect of this question which I find fascinating. The survey asked students what universities should spend less money on in order to be able to reduce the cost. There were two clear favourites: spending less on buildings (49% of respondents) and spending less on sport and social facilities (46%).

This is a genuinely hard challenge for universities to address, for three reasons.

Firstly, let’s look at the money. Buildings last a long time, and money spent on buildings is regarded as being spent over the lifetime of the building. It’s not unusual to see the value of a building spread over 50 years. So the value in the accounts in any one year of a £50m new build is £1m. (If you use straight-line depreciation. Ask an accountant.)

Another way of looking at it is the cost of borrowing the money. Universities get good interest rates at the moment – 4% would be at the higher end. So the loan for the £50m building would cost about £3.8m per year, if we assume repayment over twenty years.
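The arithmetic behind both figures is straightforward. A quick sketch in Python, using the assumptions above (50-year life, 4% interest, 20-year repayment) and a standard level-repayment annuity formula:

```python
# Annualised cost of a £50m building, two ways.

cost = 50.0  # £m

# 1. Straight-line depreciation over the building's 50-year life:
# the same slice of value appears in the accounts each year.
depreciation = cost / 50  # £1m per year

# 2. Borrowing the £50m at 4% and repaying over 20 years,
# using the standard annuity (level-repayment) formula.
rate, years = 0.04, 20
annual_payment = cost * rate / (1 - (1 + rate) ** -years)

print(f"Depreciation: £{depreciation:.1f}m per year")
print(f"Loan repayment: £{annual_payment:.2f}m per year")
```

On these assumptions the annuity comes out a shade under £3.7m a year; the £3.8m above presumably reflects a slightly different repayment structure, but the order of magnitude is the same.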

Now neither of these numbers – £1m per year, or even £3.8m per year – is going to make a big dent in a university's cost structure. Even if a university were minded to reduce fees (and there are lots of reputational reasons why it shouldn't, and market data from students showing that there's no point), the saving wouldn't make much impact in a university big enough to need a £50m building.

Secondly – but, you say, perhaps universities don't need the shining edifices of glass and metal which spring up all over campuses. What if they had plainer buildings? What indeed.

Few university buildings are truly high specification, in terms of the extravagance of the fixtures and fittings. But for the sake of argument let's assume that the £50m building would cost £40m if it wasn't so luxuriously specified, or if space was better used (now that's more likely to be an issue, I will concede). This means that the university spends less money but – for exactly the same reasons as I set out above – not as much as would be needed to make a dent in fees. A saving of £200k to £700k per year, from a 20% reduction in the cost of the building, wouldn't mean a university could do much about fees.
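The saving from the cheaper building follows the same arithmetic as before. A sketch, on the same assumptions (50-year life for depreciation, or a 4% loan repaid over 20 years):

```python
# Annual saving from building for £40m instead of £50m.
saved_capital = 10.0  # £m

# Depreciation view: the saving spread over a 50-year life.
dep_saving = saved_capital / 50  # £0.2m, i.e. £200k per year

# Borrowing view: the annual repayment avoided on £10m
# at 4% over 20 years (standard annuity formula).
rate, years = 0.04, 20
loan_saving = saved_capital * rate / (1 - (1 + rate) ** -years)

print(f"Saving: £{dep_saving * 1000:.0f}k to £{loan_saving * 1000:.0f}k per year")
```

That lands in the £200k–£700k range quoted above; the top end rounds a little differently depending on the exact repayment assumptions.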

And thirdly, the buildings are needed. The reason university campuses are building sites today is that many university buildings had become worn out and no longer functional. Without investment to renew and replace, campuses would become progressively less welcoming, and the impact on students and staff would be serious.

But none of this adds up to a good reason to ignore what students are saying: regardless of the reality of the costs of running a university, students don't share the management's perceptions about how to spend (their!) money. I think the survey suggests another approach.

The survey shows that only 18% of respondents think that they definitely or maybe are given enough information about how fees are spent. But interestingly, 21% of first years (who will have seen universities' response to CMA strictures about information) feel that they have had enough information. If you know what the money's spent on, it's easier to come to a reasonable conclusion about the proportionality of the cost.

It’s a baby step, but it’s a significant one (significant at 99% confidence, according to the report footnote). If universities want to address student perceptions of value, one thing they could do much more of is tell students where the money goes.

Thursday, 20 August 2015

More fun with the NSS

The NSS data - about which I posted earlier this week - allows you to compare specific universities. I've analysed the data to show how the different university mission groups compare over the last six years.

For each mission group I've calculated the mean satisfaction score for each year across the mission group's members, unweighted by the number of students at each institution. The 'satisfaction score' is the proportion of students who agreed with the statement 'Overall, I am satisfied with the quality of my course' (Question 22 of the NSS). The mission group membership for each year is that at August 2015 – that is, I haven't taken account of historic changes, such as the addition of four members to the Russell Group in 2012.
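'Unweighted' here just means each institution counts once, however big it is. A minimal sketch of the calculation – the group names are real but the scores are invented for illustration, not the actual NSS data:

```python
# Unweighted mean satisfaction by mission group: each institution
# contributes equally, regardless of how many students it has.
scores = {
    "Russell Group": [88, 90, 87],  # illustrative scores, not real data
    "million+": [85, 86, 84],
}

for group, vals in scores.items():
    mean = sum(vals) / len(vals)  # simple average across member institutions
    print(f"{group}: {mean:.1f}%")
```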

Although there are clearly differences between the mission groups, with sustained differences over time, it's important to recognise that the data show a large amount of satisfaction whatever the mission group – scores in 2015 range from 85% to 89% satisfaction. Although every Vice-Chancellor will say that there's room for improvement, it's a good solid performance.

I've also been a little mischievous - the 1994 Group folded a few years ago but I've resurrected it for this comparison. It was formed of research-intensive universities, like the Russell Group, but they tended to be smaller and, as their informal strap-line had it, 'elite but not elitist'.  Most of the former 1994 Group members are no longer in a mission group, but the pattern of performance shows that mission group alone should not be taken as a guide to a university's performance.

Do the data mean that teaching and the student experience are better at the Russell Group? Not necessarily. Remember – the data show student satisfaction, which might be higher at Russell Group universities for other reasons. Maybe Russell Group universities are better at managing expectations; maybe their students have more realistic expectations of university because their families have a history of going to university already – they have the social capital to know what to expect and to make the most of it. But of course it is also possible that students at Russell Group universities just do have a better experience ...

Monday, 17 August 2015

If you're happy and you know it ...

The results of the 2015 National Student Survey (NSS) were published last week. The NSS asks all final year undergraduates in the UK to rate their satisfaction with various elements of their study. It’s a survey which has had plenty of criticism over time in relation to its usefulness, but it is here to stay.

The survey includes a question asking students to rate their overall satisfaction, and it is this that often generates the headlines. I’ve compiled a data set going back to 2010 for all UK universities, showing their performance on this question. The data shows the proportion of respondents who definitely or mostly agree with the statement ‘Overall, I am satisfied with the quality of my course’.

The overall trend is upwards: in 2010 the institutional average (mean) score was 81.8%; in 2015 it was 85.9%, and it has risen in every year. This might be because students are generally more satisfied; in my view it also reflects greater effort by universities to manage student expectations, to address issues, and to encourage higher rates of survey completion. (A good general rule is that the more cross someone is, the more likely they are to give feedback – so the more responses you get, the better on average the responses will be.)

You can see some interesting patterns when you look at the four nations in the UK. England dominates the UK sector in numbers, so it's unsurprising that the England pattern is like the UK average. But the three other nations – Scotland, Wales, and Northern Ireland – show distinct patterns. Northern Ireland in particular performs well, and although this covers only a small number of universities, its scores are consistently higher.

Headlines often focus, understandably, on who has the most satisfied students, but there's another possible measure. A benchmark is calculated for each university, reflecting sector average satisfaction levels but adjusted for the mix of students at the institution. Where performance above (or below) benchmark is statistically significant, this is flagged. So we can see which universities consistently score better than the data say they ought – places which, if the survey is to be believed, understand how to ensure that their students are satisfied.
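The published flags come from the funding bodies' own benchmarking methodology, but the underlying idea – is the gap between observed satisfaction and the benchmark bigger than chance alone would produce? – can be sketched with a simple one-proportion z-test. The figures here are invented for illustration:

```python
from math import sqrt

# Is an institution's satisfaction rate significantly above its benchmark?
# A simple z-test on a proportion -- an illustration of the idea,
# not the funding bodies' actual methodology.
satisfied, respondents = 880, 1000  # invented institutional figures
benchmark = 0.85                    # sector rate adjusted for student mix

p = satisfied / respondents
se = sqrt(benchmark * (1 - benchmark) / respondents)  # standard error
z = (p - benchmark) / se

print(f"observed {p:.0%} vs benchmark {benchmark:.0%}, z = {z:.2f}")
print("significantly above" if z > 2.576 else "not significant at 99%")
```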

I’ve identified those universities which – in every year from 2010 to 2015 – perform significantly better than their benchmark for overall student satisfaction. Here they are – in alphabetical order. This would be my go-to list if I wanted to understand how to make NSS results better:

University of Buckingham
Conservatoire for Dance and Drama
University of East Anglia
University of Essex
University of Keele
Loughborough University
University of Newcastle upon Tyne

Well done those universities.

Wednesday, 24 June 2015

Value for money?

Earlier this week the BBC headlined a survey conducted by ComRes on their behalf which asked final year undergraduates three questions about their university experiences.  The headline figures were:

1 Which of the following comes closest to your opinion about your university education?
It has been value for money: 521 (52%)
It has not been value for money: 398 (40%)
Don't know: 82 (8%)

2 To what extent do you feel that university has prepared you for the future?
A great deal: 256 (26%)
Somewhat: 576 (58%)
Not really: 136 (14%)
Not at all: 29 (3%)
Don't know: 4 (0%)

3 If you could start university again, which of the following do you think you would do?
I would take the same course at the same university: 458 (46%)
I would take the same course at a different university: 181 (18%)
I would take a different course at the same university: 166 (17%)
I would take a different course at a different university: 117 (12%)
I would not go to university at all: 34 (3%)
Don't know: 46 (5%)


The headlines were clear – 40% of final year students didn’t think that their university education had been value for money. And with this being the first cohort to have paid £9k per year fees, that’s quite a story.

The sector response – if that is what a quote from the Chief Executive of Universities UK amounts to – was to defend universities’ record, drawing on the National Student Survey results.  Nicola Dandridge was quoted by the BBC as saying “The last national student survey reported that 86% of students were satisfied overall with their course. It shows that universities across the UK are responding to student feedback and working hard to improve the academic experience.”

The 2015 NSS results are set to be published by HEFCE on 12 August, and it’s a fair bet that many universities will be watching carefully not only because of their significance for league tables, but to see what effect the £9k fees have on student responses.

It seems very likely that there will be an effect: the ComRes survey broke down answers by subject of study and by region of university, and in Scotland, which does not charge fees to Scottish students, fully 79% said that their university education had been value for money, compared with 52% across all responses.

But there’s other interesting data too.  The third question is revealing: if they had their time again, only 3% of respondents would not go to university at all.  63% would go to the same university, with about a quarter of these opting for a different subject. 64% would choose the same subject, mostly at the same university.

                    Same university    Different university
Same course              46%                  18%
Different course         17%                  12%
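The marginal totals fall straight out of the table – a quick check, using the percentages as reported by ComRes:

```python
# 2x2 table of 'what would you do again?' responses, in percentages.
# Keys are (course choice, university choice).
table = {
    ("same course", "same university"): 46,
    ("same course", "different university"): 18,
    ("different course", "same university"): 17,
    ("different course", "different university"): 12,
}

same_university = sum(v for (c, u), v in table.items() if u == "same university")
same_course = sum(v for (c, u), v in table.items() if c == "same course")

print(f"Same university: {same_university}%")  # 46 + 17 = 63%
print(f"Same course: {same_course}%")          # 46 + 18 = 64%
```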

This tells me quite a different story – fewer than half of students made the right university/course choice, but most got at least one of the variables right. So there's much to be done on advice and guidance at application, but less of a panic, it seems to me, about the perceived value of higher education.

Maybe the need to get this right will push post-qualification application back up the agenda.  If universities focused on the needs of students and learners rather than their own staff convenience, students might make better university and course choices, and so be happier. Just a thought …