The 2008 university Research Assessment Exercise (RAE), whose results have been announced with a mixture of fear, loathing and exhaustion, is a classic example of the self-defeating performance-management drive that is overwhelming the public sector.
RAE results determine the research funding allocated to institutions by the Higher Education Funding Council, according to a formula that changes each time. The official line is that the assessment - 2008's is the sixth since 1986 - is a success. It is "important and valuable", to quote one vice-chancellor, in providing an accepted quality yardstick and a means of promoting UK universities abroad. Others argue that it helps to ensure accountability for £8bn of public funding, the largest single chunk of university income. That sounds plausible, but as usual it conveniently airbrushes out other costs and consequences.
The first and most obvious of these is colossal bureaucracy. Government blithely assumes that management is weightless; but the direct cost of writing detailed specifications and special software, and assembling 1,100 panellists to scrutinise submissions from 50,000 individuals in 2,500 submissions, high as it already is, is dwarfed by the indirect ones - in particular, the huge and ongoing management overheads in the universities themselves. As with any target exercise, the RAE has developed into a costly arms race between the participants, who quickly figure out how to work the rules to their advantage, and regulators trying to plug the loopholes by adjusting and elaborating them.

The result is an RAE rulebook of staggering complexity on one side and, on the other, the generation of an army of university managers, consultants and PR spinners whose de facto purpose is not to teach, nor make intellectual discoveries, but to manage RAE scores.
Simon Caulkin in the Observer, the rest here.