
Friday, 3 July 2015

Employability

The release of the Employment Performance Indicators by HESA this week made interesting reading. Universities up and down the land will be celebrating or holding post mortems, more so as employability seems set to increase in prominence in the government's thinking about higher education.

It’s a large data-set, which makes it hard to spot patterns. I thought it might be useful to look at the aggregate outcomes by type of university – not by mission group, but by grouping institutions in their historical context:

  • The ancient universities – not just Oxbridge but also the Scottish ancient foundations
  • The redbrick universities – the civic foundations of the 19th and first half of 20th centuries
  • The CATs – the Colleges of Advanced Technology given university status in the 1950s and the 1960s
  • The plate-glass universities – the new creations of the 1960s
  • The post-1992s – the former polytechnics which became universities en masse in 1992
  • Newer universities – those created after the 1992 transition, often from colleges of higher education or former teacher training institutions
  • Specialist institutions – arts, drama, agriculture, medical and so on

The HESA methodology changed in 2011-12, meaning that there are only three years of comparable Employment PI data. This is how those seven categories aggregate:

Employment PI (%)   2011-12   2012-13   2013-14
Ancient                93.0      94.0      94.5
Redbrick               92.0      92.9      93.8
CAT                    90.8      91.4      92.0
Plate-glass            92.3      93.2      94.4
Post-1992              89.0      91.0      92.4
Newer                  91.4      92.5      93.4
Specialist             92.4      92.3      94.7

It’s all very close, but note one interesting feature: the categories with the lowest graduate employment rates are the post-1992 universities and the former CATs. Both of these types of university aimed, historically, to focus on programmes and skills which met the needs of business and industry.

Within the data, of course, are highs and lows in each category. And the outcome for any institution will be determined by lots of factors – its region and the local economy, its subject spread, or even very local issues about how the DLHE survey was conducted.

Nevertheless, it’s sometimes good to take a step back from the data and see what might be going on.
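
For anyone wanting to reproduce the kind of grouping behind the table above, here is a minimal sketch in Python. The file name, the column names and the category mapping are my own illustrative assumptions, not HESA's actual field names.

```python
# Sketch of grouping institution-level Employment PIs into historical categories.
# File name, column names and the mapping below are assumptions for illustration.
import pandas as pd

CATEGORY = {
    "University of Oxford": "Ancient",
    "University of Birmingham": "Redbrick",
    "Loughborough University": "CAT",
    # ...and so on, one entry per institution in the PI tables
}

pis = pd.read_csv("employment_pi.csv")   # assumed columns: institution, year, employment_pi
pis["category"] = pis["institution"].map(CATEGORY)

# Simple (unweighted) mean Employment PI per category and year. A cohort-weighted
# average would be closer to a true aggregate if leaver numbers were included.
summary = (
    pis.groupby(["category", "year"])["employment_pi"]
       .mean()
       .unstack("year")
       .round(1)
)
print(summary)
```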


Friday, 7 November 2014

Higher Education KPIs #2 – EBITDA

I posted a couple of weeks ago about staff costs as a proportion of income. Another performance indicator which has become common in universities over the past few years is EBITDA.



No, not an obscure Scrabble word (although BAITED for 9 points is do-able with the same letters), but an acronym for a financial indicator, which helps you understand the underlying financial strength of a university.

Earnings
Before
Interest,
Tax,
Depreciation and
Amortization

There’s some jargon in there. Let’s unpack it a little.

Earnings is just what it sounds like. Total income minus total expenditure.

But accounting isn’t as easy as looking at what’s in your wallet at the end of the day. It’s about keeping track of the real costs and values of assets and income, and recognising that there’s a difference between what you use once and once only and what keeps being usable. (So, I’ve just had a cup of coffee. The coffee granules are gone and used; the mug I can wash and use again.) And that’s where some of the other terms come into play.

Interest means interest that is payable on debts owed. It’s a cost, for sure, but it’s a cost that can vary because of decisions taken by the borrower (how long do you borrow for? Fixed or variable interest rate? Secured or unsecured? Wonga or the Co-op?). So interest payments don’t necessarily tell you much about financial strength – but they might tell you a lot about the financial acumen of the management team.

Tax, sad to say, is similar. Not for universities – there isn’t often a university tax scandal (but see here for an exception!) – but remember that EBITDA is from the commercial world. And then think of Starbucks and Amazon and realise that tax, if you’re rich and powerful enough, is optional. So tax costs measure the skill of your accountant.

Depreciation and Amortization are both accounting concepts.

Depreciation is a way of measuring the value that’s left in an asset which is reusable. Imagine a whiteboard in a classroom. The lecturer can write on the whiteboard, get value out of it for teaching, and then wipe it clean at the end. It’s got value, but it hasn’t been used up by the class. But over time, it becomes less clean: people use the wrong marker pens; the shiny surface dulls with being cleaned too often. After a few years you need a new one.

This creates a problem if you want to measure how much the whiteboard costs for a particular class. If you put all of the cost against the first class in which it is used, that class seems expensive, and the continuing value of the whiteboard isn’t recognised. If you only charge the cost to the class after which it finally needs replacing, then again it isn’t right: the wear and tear occurred over years, not in one hour. And so you take the cost of the whiteboard and allocate it, a bit at a time, to the activities for which it is used.

Now in accounting this is done over time, so if the whiteboard cost £50 and is estimated to have a lifetime of five years, then you charge £10 per year, for five years, as the accounting cost. (You still need to pay up front, of course – as I said, accounting isn’t about what’s in your wallet but about true costs and values.) Critically, there is judgment involved in how depreciation works. And every university will have accounting policies which set out the rules of thumb applied.
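
As a toy illustration of that straight-line approach (the £50 cost and the five-year life are the figures from the paragraph above; the function name is just mine):

```python
def straight_line_depreciation(cost, useful_life_years):
    """Annual charge under a simple straight-line depreciation policy."""
    return cost / useful_life_years

# The whiteboard example from the text: £50 spread over an estimated five-year life
annual_charge = straight_line_depreciation(cost=50, useful_life_years=5)
print(f"Depreciation charge: £{annual_charge:.2f} per year")   # £10.00 per year
```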

Amortization is a similar concept, but refers to the cost of intangible assets. (A whiteboard is tangible – you can see it, and if it drops off the wall onto your foot it will probably hurt. A university’s reputation is intangible: it can fall without hurting anyone straightaway; the pain takes a while to appear.) Universities do have intangible assets – intellectual property, for instance. Again, there’s judgment involved in identifying how much specifically to allow for amortization.

(As a cheery footnote, the mort in amortization is the same as the mort in mortgage, and both come from the medieval French mort=death.)

So EBITDA is a measure of earnings which is free of financial and accounting wizardry. It’s a very Gradgrindian measure, for fans of Dickens. Or for fans of twentieth-century politics, it’s like Sir Alec Douglas-Home and his matchsticks.

For universities it has become a favourite of funders and regulators. HEFCE use it to decide whether a university needs to get permission to borrow (take a look at Annex C): if a university’s total financial commitments are more than five times its average EBITDA then written permission from HEFCE is required for the borrowing. And so it has a practical consequence, which helps to explain why university governing bodies are interested in it. EBITDA is defined for HEFCE’s purposes by BUFDG (link downloads a Word document). Quite a technical subject!

EBITDA can be a cash amount (useful if you want to see if you’re over the HEFCE ratio) or it can be expressed as a percentage of income. An EBITDA of £1m might be brilliant if your turnover is £5m, but if your turnover is £100m then perhaps you’re running a little close to the wire.
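
To make the arithmetic concrete, here’s a rough sketch of how EBITDA, the percentage-of-income version and the HEFCE ratio might be worked out. The figures and function names are invented for illustration; the real BUFDG definition has rather more detail than this.

```python
def ebitda(surplus, interest, tax, depreciation, amortisation):
    """Add the four excluded items back onto the surplus (the 'earnings')."""
    return surplus + interest + tax + depreciation + amortisation

# Illustrative figures in £m - not any real university's accounts
e = ebitda(surplus=2.0, interest=1.5, tax=0.0, depreciation=5.0, amortisation=0.5)

income = 100.0
print(f"EBITDA: £{e:.1f}m, or {e / income:.1%} of income")   # £9.0m, or 9.0% of income

# The HEFCE test described above: written consent is needed for borrowing if
# total financial commitments exceed five times average EBITDA.
total_commitments = 40.0
print("Consent needed:", total_commitments > 5 * e)   # 40 > 45 is False
```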


Disclaimer: I'm not an accountant. If you want to understand EBITDA then the above might help. If you want to pass a CIPFA exam read a CIPFA textbook!

Friday, 24 October 2014

Key Performance Indicators

Look on the webpages of almost every university, tucked away sometimes in the ‘About Us’ pages, and you’ll find a section on KPIs – Key Performance Indicators. I thought it might be useful to write a bit about what they are, and why they matter. In this post I’ll also go into a bit of detail about one common KPI – staff costs as a percentage of income.

A performance indicator simply means a measure of how you’re doing. So, if I’m typing, words per minute might be a performance measure. But so might typos per sentence. Or proportion of screen-breaks properly taken. Performance indicators are quite easy to come up with (although, as a game, making up performance indicators does get pretty tedious after a while).

Now think about all the things that go on in a university (or any other organisation, for that matter), and that can be measured, and you’ll see that a full list of university performance indicators would be huge. And so Key Performance Indicators are those chosen for particular monitoring.

Chosen is an important point here. There’s no absolute list of what matters to an organisation: it is a judgment about strategy and circumstances. There are KPIs which are measured by university regulators (funding councils, research councils etc), and they do tend to get measured by universities, but it is a choice that universities make. Sometimes a pretty obvious choice, but a choice nonetheless.

The choice is made by governing bodies, on the advice of the VC and colleagues in the senior management team. And the choice tells you about what’s important to the university. If a university is trying to improve its student experience, it might have staff:student ratio as a KPI. Or proportion of space of the highest standard. Or NSS results. But if it has a financial focus in its strategy (for investment, or to ensure sustainability) it might choose income per student (or cost per student).

The other jargon term is metric. That’s the actual value associated with a Key Performance Indicator. So for my words-per-minute typing KPI, the metric at the moment is about twenty. And the typos-per-sentence is about two. Could do better! And that’s the point – there’s often a target (or a range) associated with a KPI and metric. I’d like to type with fewer errors and more words per minute. So using a KPI I can measure my progress.
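
If it helps to see that jargon laid out as a structure, here’s a small illustration. It’s my own, not any standard definition, and the target of 40 words per minute is invented.

```python
from dataclasses import dataclass

@dataclass
class KPI:
    """A chosen indicator, its current metric, and the target it is judged against."""
    name: str
    metric: float   # the value measured right now
    target: float   # where we would like it to be

# The typing example from the text; the target is made up for illustration
typing_speed = KPI(name="Words per minute", metric=20, target=40)
print(typing_speed.metric >= typing_speed.target)   # False: could do better!
```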

So that’s the theory. Let’s take a common university KPI – proportion of income spent on staff – and look at that in more detail.

The definition is straightforward. Take a university’s total income and its total spend on staff, both of which are in the annual financial statement, and divide the latter by the former. So if my university’s income is £100m per year and staff costs are £55m per year, then the proportion of income spent on staff KPI has a value of 55/100, or 55%.
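
In code form, using the made-up £100m and £55m figures from the paragraph above (the function name is mine):

```python
def staff_cost_ratio(staff_costs, total_income):
    """Proportion of income spent on staff, taken from the financial statements."""
    return staff_costs / total_income

# The worked example from the text: £55m of staff costs on £100m of income
ratio = staff_cost_ratio(staff_costs=55_000_000, total_income=100_000_000)
print(f"Staff costs as a proportion of income: {ratio:.0%}")   # 55%
```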

Here are two charts which show the KPI. They’re both drawn from 2012-13 HESA data. The first is a scatter plot, where each university is represented by a single dot.


This shows that for most universities, staff costs are in the range of forty-something to sixty-something percent of income, with an average around fifty-something percent. It also shows there are some outliers – some very large institutions, with turnovers of over one billion pounds (and I do hope that the Vice-Chancellors report that to their governing bodies in the style of Dr Evil); and some smaller institutions with a much smaller spend on staff (3% and 14% of income). These latter are instructive: one is a small consortium institution, and doesn’t directly employ many staff at all; the other is a university with a huge geographical spread and lots of costs associated with teaching over this area. It shows why the choice of KPIs depends on the specific circumstances of the institution. This probably isn’t a very useful KPI for those particular institutions.

Now let’s see a different chart, using the same data. This is a bar chart where the institutions are arranged from left to right in ascending order of turnover. That is, the bar furthest to the left is the institution with the smallest turnover, the bar furthest to the right is the institution with the largest turnover. The height of each bar shows the proportion of income spent on staff.


This is more like a range of mountains – a few really high peaks and a few deep valleys, but most are around the 45%-55% range. What’s interesting here, I think, is that there’s no obvious economy of scale – larger institutions don’t seem to have a significantly lower proportionate spend on staff. This is counterintuitive.

So why does it matter? Staff costs are unlike other costs for a university. Firstly, they are long term. Once you’ve employed someone then you need to keep paying them, month after month, and to stop paying them involves effort and more expense. Secondly, they increase on a regular basis. Pay increases incrementally for many, simply by the passage of time as you go up the pay scale; there are pay increases every year through negotiation with trades unions; and on-costs – pension contributions and National Insurance contributions – go up not down, particularly at the moment. A pound spent on employing someone this year becomes more like £1.05 next year, and so on. To employ someone is to make a commitment.

And if you’re managing an institution, you need to know about this. If you have to find money each year for a rising staff bill, then you have fewer opportunities for other things – equipment, buildings, training costs, IT, library books, student bursaries and scholarships. So an institution which can manage its staff costs has more opportunities. And when there’s a large increase – a hike in pension contributions, or a big pay award (they did use to happen!) – then institutions with a large staff cost as a proportion of income get hit hardest.

There can’t be an absolute target, of course. Some subjects cost more to teach than others – for instance clinical subjects. Some subjects need a lot of equipment, which keeps the proportion of spend on staff low because lots of money is spent on kit. And institutions make choices about how they operate which make an impact – for instance, having small tutorial groups.

There’s no moral to this tale: if you have a high proportion of staff spend, and know why, and it fits with your strategy, maybe that’s fine. And too low a proportionate spend and maybe your institution isn’t doing things as well as it might – universities are people businesses. But the one number does help you understand quite a lot about an institution’s plans and how sustainable it is (that is, how resilient to shocks, or how much scope for investment it has). And if you’re accountable for the institution, or deciding who to give more resource to, those are good things to know.