Tuesday, 27 October 2015

It's not all about the TEF, you know

There’s been much speculation about the details of the Teaching Excellence Framework, but the forthcoming green paper on Higher Education will contain much more than that. A discussion yesterday at the AUA Partnerships network conference prompts me to take a closer look at one aspect of this.

Jo Johnson’s speech to UUK on 9 September contained the following:
"The green paper will cast a critical eye over the processes for awarding access to student support funding, Degree Awarding Powers and University Title.
We have already made a start by providing a new route for trusted new and smaller providers to grow their student numbers. We are also beginning to link student number controls to the quality of the provider, through a “performance pool” which will operate for 2016 to 2017.
But the green paper will consult on options to go further. Success in higher education should be based on merit, not on incumbency. I want to fulfil our aim of a level playing field for all providers of higher education.
Many of you validate degree courses at alternative providers. Many choose not to do so. I know some validation relationships work well, but the requirement for new providers to seek out a suitable validating body from amongst the pool of incumbents is quite frankly anti-competitive. It’s akin to Byron Burger having to ask permission of McDonald’s to open up a new restaurant.
It stifles competition, innovation and student choice, which is why we will consult on alternative options for new providers if they do not want to go down the current validation route."
The basic point here is that without the power to award degrees, it’s tricky to be a provider of higher education. The practice of validation – a university agreeing that a curriculum and education designed and delivered by another organisation can lead to the award of a degree from the university – is a solution to this. (Note that validation is different from franchising – under franchising, the university owns the curriculum but outsources the teaching.)

There are 75 UK universities which are members of the Council of Validating Universities (CVU), which gives an idea of the scale of the practice. There isn’t (yet) a single definitive register of validated providers, but research conducted for BIS in 2013 counted 674 privately funded HE providers, most of which will have a validation relationship with one or more universities. So the minister’s claim that “Many of you [remember, he was talking to Vice-Chancellors] validate degree courses at alternative providers” looks true. And it can be a decent business for universities, helping a faculty to balance its books.

The underlying cause of the minister’s ire – the need to find a way to empower new colleges in times of expanding higher education – isn’t new. The University of London fulfilled this function, via its external degree, in the nineteenth and first half of the twentieth century: many UK and overseas universities can trace their origins back to colleges offering tuition for the London external degree. The Council for National Academic Awards (CNAA) fulfilled this function for the polytechnic sector until it was wound up in 1993, after the polytechnics had been made into universities.

Can we tell what approach might be in the green paper? If we take the CNAA model, then a new body could be established to do this. But the creation of a new quango at a time when quangos are set to be culled seems counter to the spirit of the times (O tempora! O mores!). The University of London model is the other alternative from history: perhaps designating an existing university as having a 'duty to validate', or creating a new university which is only a validator?

There’s a moral hazard here, just as the minister perceives an anti-competitive hazard in current arrangements. The key to validation is the maintenance of academic standards, and if you’ve a duty to validate, then an important element of the validation relationship – that of judging the capacity of another institution to meet the right standards – is put at risk. Some of the outcomes of the QAA’s review of alternative providers show the problems here: some alternative providers are good, but there are also some very shoddy ones.

Is the minister’s argument rooted in specific concerns? It would be interesting to know which alternative providers have complained, and which universities have refused to validate. What were the specifics? Was the refusal justified? Or was the validation fee simply seen as too high? Unless we can see that this is a market which is broken and can’t be fixed by regulation, the creation of a new entity seems premature.

I’ll be interested to see what the green paper has to propose.

Friday, 9 October 2015

Governance in Scotland

Interesting times in Scottish educational governance, with the news that the Education Secretary in the Scottish Government, Angela Constance, has used powers granted to her by the Post-16 Education (Scotland) Act 2013 to dismiss the Chair and Board of Governors at Glasgow Clyde College.

[Image: a Scottish chair]
The action was taken, according to reports, on the grounds that the College's governing body had failed in a number of ways – its "relationship with students had broken down" and it had "breached rules on spending public money". There's a back story which includes the suspension of the Principal: the waters look pretty murky to me.

According to the Scottish Government's informal guidance on the 2013 Act,
"The grounds for removal [of a Board] are where it appears to Ministers that a board: 
i) has committed or is committing a serious one-off breach of any term or condition of grant made to it by, in the case of a regional college, the Scottish Further and Higher Education Funding Council (SFC) [or by, in the case of an assigned college which is an incorporated college, its regional strategic body]; 
ii) has committed or is committing repeated breaches of such terms or conditions; 
iii) has failed or is failing to provide or secure the provision of education of such standard as the Ministers consider to be appropriate; 
iv) has failed or is failing to discharge any of their duties properly; or 
v) has mismanaged or is mismanaging its financial or other affairs."
which, given the press reports, seems to imply that maintaining a good relationship with the students is either part of educational standards (iii above) or a duty of a board of governors (iv above). The alleged breaches of financial procedures and proper governance of meetings would have been grounds under (i) or (v) for dismissing the board, so it does seem that there's a bigger point being made here.

What's interesting from a higher education perspective is the example of how the Scottish government chooses to use its powers of intervention. Currently under discussion in Scotland is legislation which would change the way in which university governance works, including provisions for staff and student representation, and the election of chairs. And this is not without controversy.

So Scottish Universities now have an example before them on which they can reflect.

Tuesday, 6 October 2015

Making the Grade

[Image: Rensis Likert, inventor of the Likert scale]
I’ve been looking at the National Student Survey (NSS) rankings of Students’ Unions recently (more on that another time), but one thing which can’t wait is the question of how you actually rank the outcomes.

The NSS includes a question asking respondents to rate their response to the statement “I am satisfied with the Students' Union at my institution”, where a 1 corresponds to Strongly Disagree, a 5 to Strongly Agree, and 2-4 are the points in between, with 3 being neutral. It’s a five-point Likert scale of the sort very commonly used in surveys.

The NSS data is used in the Key Information Set (KIS), with the score for a Students’ Union being calculated as the sum of the Agree (4) and Strongly Agree (5) percentages. For example, a students’ union with the following outcomes in NSS Q24:


                         1     2     3     4     5
Poppleton University     3%   15%   10%   45%   27%

would be ranked as 72% on KIS (45% + 27% = 72%).
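If it helps to see the arithmetic spelled out, here’s a minimal sketch in Python (the function name and the list-of-percentages representation are mine, not anything from the KIS methodology):

```python
# Minimal sketch: a KIS-style satisfaction score from a Likert
# distribution, given as percentages for ratings 1..5.
def kis_score(dist):
    """dist: [pct_1, pct_2, pct_3, pct_4, pct_5] for ratings
    1 (Strongly Disagree) to 5 (Strongly Agree)."""
    return dist[3] + dist[4]  # sum of the Agree and Strongly Agree percentages

print(kis_score([3, 15, 10, 45, 27]))  # 72 - Poppleton University
```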

But there are different combinations of results which would sum to 72% positive. To take a more extreme case:


                                     1     2     3     4     5
Poppleton Metropolitan University   23%    5%    0%   55%   17%

This also has a KIS rating of 72% (55% + 17%), but it doesn’t look like the same sort of response at all: almost a quarter of respondents are really dissatisfied with Poppleton Metropolitan SU, as opposed to the handful at Poppleton SU.

A way round this problem is to use a Grade-Point Average (GPA) measure. This combines in one number the proportion of respondents at each level – so, for instance, Poppleton University SU’s GPA would be 3% of 1 plus 15% of 2 plus 10% of 3 plus 45% of 4 plus 27% of 5, which works out at a GPA of 3.78. And this GPA of 3.78 means that the average respondent fell between ‘Neither agree nor disagree’ (3) and ‘Agree’ (4), and was nearer to agree than to the neutral position.

Compare this with Poppleton Metropolitan SU, which has a GPA of 3.38. Again, this falls between ‘Neither agree nor disagree’ and ‘Agree’, but is closer to neutral than to a positive endorsement. So both SUs have a positive GPA, but it’s clearer that there’s also a difference in how students perceive each of them.
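Again as a rough sketch (the helper name is mine), the GPA is just the rating-weighted mean of the distribution:

```python
# Minimal sketch: GPA as the mean rating implied by a percentage
# distribution over the 1..5 Likert points.
def gpa_score(dist):
    """dist: [pct_1, ..., pct_5], percentages for ratings 1..5."""
    return sum(rating * pct for rating, pct in enumerate(dist, start=1)) / 100

print(gpa_score([3, 15, 10, 45, 27]))  # 3.78 - Poppleton University
print(gpa_score([23, 5, 0, 55, 17]))   # 3.38 - Poppleton Metropolitan
```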

“OK Hugh,” I hear the more polite ones amongst you say, “thanks for the statistics lesson. But so what?” Always a good question. Let’s have a look at the difference it makes in practice.

Here’s the top ten Students’ Unions, in NSS 2015, using the KIS measure and the GPA:


      KIS Measure                               GPA Measure
  1   The University of Sheffield               The University of Sheffield
  2   St Mary's University College              Loughborough University
  3   Loughborough University                   St Mary's University College
  4   The University of Leeds                   The University of Leeds
  5   University of Dundee                      University of Dundee
  6   Royal Northern College of Music           Cardiff University
  7   Cardiff University                        Royal Welsh College of Music and Drama
  8   Teesside University                       Royal Northern College of Music
  9   Royal Welsh College of Music and Drama    The University of Keele
 10   The University of Keele                   Teesside University

These are the same ten universities, but in a different order, as Eric Morecambe might have said. So perhaps it isn’t the end of the world which measure we use. What about the bottom ten?


       KIS Measure                                  GPA Measure
 150   The University of Westminster                University for the Creative Arts
 151   Oxford Brookes University                    Ravensbourne
 152   Queen Margaret University, Edinburgh         Oxford Brookes University
 153   Liverpool Institute for Performing Arts      Queen Margaret University, Edinburgh
 154   University of the Highlands and Islands      Liverpool Institute for Performing Arts
 155   Royal Conservatoire of Scotland              Royal Conservatoire of Scotland
 156   University of Durham                         University of Durham
 157   University of Bristol                        University of Oxford
 158   University of Oxford                         University of Cambridge
 159   University of Cambridge                      University of Bristol

It isn’t the same ten universities on both sides here: the measure makes a difference for the University of the Highlands and Islands, Ravensbourne, the University of Westminster, and the University for the Creative Arts.

And there are some very stark differences if you look within the rankings. Let’s have a look at the top ten Students’ Unions for differential GPA-versus-KIS-ranking-performance (take a deep breath and say it slowly - it’s only a matter of time before this becomes a standard measure, you know …):


                                        GPA Rank   KIS Rank   Difference
The University of Law                       66        111         45
The Open University                         69        112         43
Heythrop College                           119         86         33
Hull and York Medical School                38         66         28
Birkbeck College                           113        137         24
University of Sussex                        71         49         22
Bath Spa University                        110         89         21
University of the West of Scotland          98        118         20
Royal Holloway, University of London       111         91         20
The University of Liverpool                115         95         20

In five of these cases the KIS ranking is better, in five the GPA ranking.  Let’s have a look at the numbers in detail:


                                         1     2     3     4     5    GPA   KIS
The University of Law                    4%    4%   29%   20%   42%   3.89   62%
The Open University                      2%    2%   35%   32%   30%   3.89   62%
Heythrop College                         8%   13%   13%   44%   23%   3.64   67%
Hull and York Medical School             1%    2%   27%   33%   37%   4.03   70%
Birkbeck College                         5%    4%   34%   31%   26%   3.69   57%
University of Sussex                     4%    6%   16%   41%   32%   3.88   73%
Bath Spa University                      6%    9%   18%   42%   25%   3.71   67%
University of the West of Scotland       5%    8%   27%   30%   31%   3.77   61%
Royal Holloway, University of London     6%    9%   20%   43%   23%   3.71   66%
The University of Liverpool              5%    8%   21%   42%   23%   3.67   65%

Broadly speaking, if the GPA rank is better than the KIS rank (Law, Open, Hull-York, Birkbeck, West of Scotland), it points to a more even distribution of scores across the five categories – the neutral 3s count for something in the GPA but for nothing in KIS. If the KIS rank is better than the GPA rank (Heythrop, Sussex, Bath Spa, Royal Holloway, Liverpool), it points to more extreme scores in 1 and 2, which KIS ignores but which drag the GPA down. Broadly speaking.

And that’s the key to it. The KIS ranking answers a simple question: how many like it.  The GPA ranking is more nuanced, but also requires more interpretation. 
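To make the comparison concrete, here’s a sketch that ranks a handful of distributions both ways and shows the movement between the two orderings; the third distribution (and its name) is invented for illustration, not an NSS figure:

```python
# Minimal sketch: rank institutions by KIS and by GPA, then report
# where each one lands in the two orderings.
surveys = {
    "Poppleton": [3, 15, 10, 45, 27],
    "Poppleton Metropolitan": [23, 5, 0, 55, 17],
    "Poppleton Trinity": [1, 9, 30, 40, 20],  # invented third case
}

def kis(dist):  # percentage agreeing or strongly agreeing
    return dist[3] + dist[4]

def gpa(dist):  # mean rating implied by the distribution
    return sum(r * p for r, p in enumerate(dist, start=1)) / 100

def rank_by(score):
    ordered = sorted(surveys, key=lambda name: score(surveys[name]), reverse=True)
    return {name: position for position, name in enumerate(ordered, start=1)}

kis_rank, gpa_rank = rank_by(kis), rank_by(gpa)
for name in surveys:
    print(f"{name}: KIS rank {kis_rank[name]}, GPA rank {gpa_rank[name]}")
# Poppleton Metropolitan ties Poppleton on KIS (both 72%) but falls
# to last place on GPA (3.38), illustrating the point above.
```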

That gets to the heart of the ranking and league table problem. Rankings and league tables are a way of simplifying what is complex. There’s often a direct trade-off between simple and realistic. And that’s why it matters what methodology a league table uses, but why it is also inevitable that there’s game playing within the rankings. 

So be careful always to read a ranking or a league table having taken a suitably sized pinch of salt.