The Higher Education Statistics Agency (HESA) decided in October not to publish data showing what proportion of academics who were eligible to be included in the RAE were actually entered into the exercise by each institution. This has left a gaping hole in the information needed to produce accurate league tables.
Be that as it may - the league tables are duly being produced. The two UK institutions with which I'm affiliated are ranked joint 14th (the University of Sheffield and the University of Leeds, along with Durham, St. Andrews, Southampton and Bristol). Of the two departments I'm associated with, Information Studies at Sheffield scores 2.850 (the highest score in the Library and Information Management group) and the Business School at Leeds scores the same, which makes it joint 11th in the Business and Management Studies group. Postlethwaite has a point, however: download the data, create a column multiplying the average point score by the number of staff returned, and you'll get a rather different ranking :-)
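The alternative "research power" ranking can be sketched in a few lines. The departments and figures below are purely illustrative, not the published RAE 2008 data:

```python
# Hypothetical (average point score, staff returned) figures for a few
# submissions -- illustrative only, not the actual RAE 2008 results.
submissions = {
    "Department A": (2.850, 25),
    "Department B": (2.700, 60),
    "Department C": (2.900, 10),
}

# "Research power": average point score multiplied by staff returned.
power = {dept: score * staff for dept, (score, staff) in submissions.items()}

# Ranking by average score alone and by research power differ markedly:
# a small department with a high average can outrank a large one on
# quality but fall well behind on power.
by_score = sorted(submissions, key=lambda d: submissions[d][0], reverse=True)
by_power = sorted(power, key=power.get, reverse=True)

print(by_score)  # ['Department C', 'Department A', 'Department B']
print(by_power)  # ['Department B', 'Department A', 'Department C']
```

With these invented numbers, the smallest department tops the quality ranking but comes last on power - which is exactly why the two tables diverge.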
There are some oddities in the category, Library and Information Management, since only eleven of the twenty-one institutions submitting staff to this panel have what we might call 'traditional' library and information management departments. Some have clearly adopted a 'strategic' approach by submitting information systems departments to this panel -
Brunel, Salford and Sheffield Hallam, for example - whereas others submit such units to computer science. Some might call this sharp practice, but then, 'all is fair...' in university financing :-) King's College London submits a research unit on 'digital humanities' where, again, most of the research appears to be in aspects of computer science rather than information management. Separately, assessing staff who have both teaching and research responsibilities against staff with only research responsibilities clearly disadvantages the former: some weighting ought to be applied, perhaps by counting each research-only member of staff as half a full-time teaching-and-research member.
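The weighting suggested above could be applied like this - a minimal sketch, with the function name, weight and staff numbers all my own assumptions rather than anything in the RAE rules:

```python
# Hypothetical weighting scheme: research-only staff count as a fraction
# (default 0.5) of a full teaching-and-research member when computing
# the effective staff figure for a submission.
def weighted_staff(teaching_and_research, research_only, weight=0.5):
    """Effective staff count with research-only staff down-weighted."""
    return teaching_and_research + weight * research_only

# A 20-person unit that is half research-only counts as 15 effective staff,
# so its "research power" (score x staff) shrinks accordingly.
print(weighted_staff(10, 10))          # 15.0
print(2.850 * weighted_staff(10, 10))  # 42.75
```

Under such a scheme, a research-institute-style submission would no longer get full credit per head against departments whose staff also carry teaching loads.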
In other words, everyone is playing games with the RAE, as they have done since the Higher Education Funding Councils adopted the process. What difference it makes to the quality of research is anyone's guess, but some have argued that the intensification of the 'publish or perish' syndrome results in a decline in quality.
It is expected that the next RAE will adopt, at least in part, a 'metrics-driven' approach, i.e., bibliometrics. What the result may be for some can be pursued in a couple of papers in the journal - check the subject index under "Research Assessment Exercise".