Many of us who study public policy academically often discuss just what impact our work has – do we influence anything?
With the Research Excellence Framework (REF) exercise fast approaching, academics across Britain are busily putting together ‘impact’ statements to show just how much impact they have had. And one crucial area of ‘impact’ is on public policy. Everyone has been thrashing around for metrics.
So it was with great interest that I started playing around with the ‘Who’s Lobbying’ database (after my colleague at Manchester, Alex Waddington, spotted it).
We’ve been doing some work at Manchester on trying to encourage engagement specifically with Parliament. (Not least because as a serial offender I think it’s a good thing to do).
I was therefore fascinated to do the following little analysis of (some) Universities’ engagement with both Westminster (Parliament) and Whitehall (Government).
I picked what I would think of as the top ten ‘policy’ Universities in the UK, based purely on my personal knowledge. So this may be a biased sample, and I’m now planning to do a much more thorough research project. But these data, from this limited sample, seem interesting enough to discuss.
The first and most obvious thing is that there appears to be a big imbalance between how open Government is to academics and how open Parliament is. There were nearly twice as many appearances before Select Committees as ‘meetings with Government’ (which are mainly meetings with Ministers).
Of course there are caveats: we don’t know the quality of either kind of interaction; the data probably misses lots of ‘under the radar’ meetings with Government; etc.
But even with those taken into account, it still seems that Parliament is more open to academic input. And to some extent that’s precisely what you’d expect. Parliament, and especially Select Committees, tries to scrutinise the work of Government, with very few resources. So they are more likely to want to hear from (unpaid) experts who may have interesting and critical things to say. Conversely, Government is still, by its own admission in the recent Civil Service Reform Plan, too much of a ‘closed shop’ when it comes to policy-making.
The second major point these data seem to confirm is that the ‘golden triangle’ of London, Oxford and Cambridge holds a near monopoly on engagements with both Whitehall and Westminster. The five Triangle universities (LSE, Oxford, UCL, Cambridge and Kings) account for 77% of Parliamentary appearances and 83% of meetings with Government in this sample. “The Rest”, the other five, are lagging way behind.
Again, there is no great surprise here: the dominance of graduates from the Triangle universities in the political elite and civil service (including, I suspect, the parliamentary service) makes this result very likely.
University | Oral evidence sessions* 2010-13 | Government Meetings
LSE | 53 | 32
Oxford | 44 | 22
UCL | 29 | 23
Cambridge | 24 | 14
Kings | 18 | 8
Manchester | 12 (15**) | 4
Birmingham | 10 | 2
Cardiff | 10 | 11
Edinburgh | 7 | 3
York | 7 | 1
TOTAL | 214 (217) | 120
Source: http://whoslobbying.com, accessed 12 July 2013
* Appears to cover only House of Commons Select Committees
** Figure in brackets includes missing data from Manchester.
NB: There are clearly some issues with the reliability of the data, because when I checked my own entry two of my appearances were missing. One, I think, because I was described as ‘Manchester Business School’ rather than University of Manchester (which also affected one other appearance by another colleague), and the other was before a House of Lords Select Committee, which wasn’t picked up.
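For anyone who wants to check the ‘golden triangle’ shares quoted above, here is a minimal sketch (in Python, my own choice of tool rather than anything provided by Who’s Lobbying) that recomputes them from the table. I have assumed the bracketed Manchester figure (15) feeds into the 217 oral-evidence total; with 12 instead, the Parliamentary share comes out nearer 78%.

```python
# A quick sketch recomputing the 'golden triangle' shares from the table above.
# Assumption: the bracketed Manchester figure (15) is used for the oral-evidence column.

oral = {  # oral evidence sessions, 2010-13
    "LSE": 53, "Oxford": 44, "UCL": 29, "Cambridge": 24, "Kings": 18,
    "Manchester": 15, "Birmingham": 10, "Cardiff": 10, "Edinburgh": 7, "York": 7,
}
meetings = {  # meetings with Government
    "LSE": 32, "Oxford": 22, "UCL": 23, "Cambridge": 14, "Kings": 8,
    "Manchester": 4, "Birmingham": 2, "Cardiff": 11, "Edinburgh": 3, "York": 1,
}

triangle = {"LSE", "Oxford", "UCL", "Cambridge", "Kings"}

def triangle_share(counts):
    """Percentage of the column total accounted for by the Triangle five."""
    total = sum(counts.values())
    tri = sum(v for name, v in counts.items() if name in triangle)
    return 100.0 * tri / total

print(f"Select Committee appearances: {triangle_share(oral):.1f}%")   # 77.4% (the '77%' above)
print(f"Meetings with Government: {triangle_share(meetings):.1f}%")   # 82.5% (the '83%' above)
```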
For an interesting, and very generous, attempt to help bridge the academic-practice divide from my friend and colleague Prof. Beryl Radin, see here: http://www.apsanet.org/content_85431.cfm
This is a fascinating piece of quick analysis, making the most of what is clearly very sparse data – but I think there may be a risk of drawing a stronger conclusion than its robustness supports. It may well be true that Parliament is more open to academic input than government, but I am not convinced that this data is reliable enough to tell one way or the other. As you say, “The data probably misses lots of ‘under the radar’ meetings with Government”.
1. It seems intuitively unlikely that the total number of meetings with a small number of ministers is larger than those with a much larger number of civil servants. Intuition isn’t evidence, of course, but my personal experience is very much in line with that. And to take an example at random, York is shown with a single meeting – but I find it hard to believe that, say, the team at SPRU can only manage a single direct contact with government between them.
2. A count of meetings is clearly not the same as a measure of influence. Just last week I was at an event at which two academics (from two different universities, neither in the top ten list) were presenting their findings to a room full of at least fifty people drawn from a large number of departments and NDPBs. If it were to be counted anywhere, it would count as one, but its potential influence was enormous.
3. Anecdote again, but I would estimate that more than half of the professional contacts I have had with academics over the years have been with people from institutions not on your top ten list. That doesn’t mean that the list is wrong, but it may mean that influence is more dispersed than it first appears.
All good points – I think I put all these caveats into the report, and this was a ‘quick and dirty’ look to see if it was useful. Obviously it is limited by itself, but it is useful enough for us to do further work on their data and also to think about ways of expanding and triangulating.
Best, Colin