26.06.14
Finding out what works
Source: Public Sector Executive June/July 2014
Andrew Carter, deputy director at the What Works Centre for Local Economic Growth, discusses the public sector’s role in boosting business and aiding growth.
The What Works Centre for Local Economic Growth recently published its second evidence review, on business advice and mentoring policies. We were interested in whether programmes like Business Link, and export support schemes like those run by UKTI, help firms raise their productivity, employment or sales. Our findings are of particular interest in light of the ongoing Strategic Economic Plan (SEP) and Growth Deals processes.
The review considered almost 700 policy evaluations and evidence reviews from the UK and other OECD countries. Of these, we found 23 impact evaluations that met the Centre’s tough minimum standards. This is a smaller evidence base than for our first review on employment training, and a very small one compared with some other areas of public policy, such as education, health and some aspects of international development. This small base reflects both the difficulty of doing robust impact evaluation in the economic development field and some challenges related to policy design.
Overall, of the 23 evaluations reviewed, 17 found positive impacts on at least one business outcome, such as productivity, employment or sales. Four evaluations found that business advice didn’t work – with no statistically significant effects on any outcome – and two evaluations found that business advice might in fact be harmful. Given the scepticism that is sometimes levelled at business support programmes, this overall success rate should be quite encouraging for policymakers in Whitehall, LEPs and local government.
Underneath that headline finding, the review identified a number of key messages for policy-makers relating to policy objectives, design and delivery. First, business advice programmes show consistently better results for productivity and output (GVA) than they do for employment. This may be because it is easier to help firms raise their productivity, or because productivity increases come first, with employment gains seen in the longer run.
Second, programmes that use a hands-on, ‘managed brokerage’ approach may perform better than those using a ‘light touch’ approach – although this conclusion is based on only one direct comparison study. Taken at face value, this suggests that a strong relationship between advisor and client may be important to achieving positive programme outcomes. It is not clear, however, which of these two approaches is more cost-effective.
Third, in many cases, assessing programme success or failure was not straightforward. Understandably, many business advice schemes have multiple objectives – but too often these are not clearly expressed, and the outcomes actually evaluated do not always bear much relation to the objectives. Clearer policy design, and tightly-commissioned evaluations, would help improve this in future.
Finally, it is unclear which type of support is most cost-effective. For example, the costs of ‘light touch’ versus more intensive business advice vary dramatically, yet we found only one evaluation that directly compared the effectiveness of these two types of support. Similarly, only five of the 23 shortlisted studies included a cost-benefit analysis, and not all of these used measures that are comparable across studies. Given current public spending constraints, this will be disappointing for decision-makers looking to establish value for money on programme spend.
These findings raise several issues that have implications for some of the policy debates currently taking place between national and local policy-makers as part of the Strategic Economic Plan (SEP) and Growth Deals process.
The first issue relates to the design and delivery of business advice policy. Surprisingly, the review did not find any conclusive evidence on whether business advice programmes are more effective when designed and delivered by local or by national organisations. This is significant given the ongoing debates on this subject within City Deals and Growth Deals, which have focused too heavily on who should design the policy rather than on what the characteristics of the policy should be and how it will be delivered.
The second issue relates to the size of impact we can expect from policy interventions. When the 39 SEPs and Local Growth Fund deals are announced, they are likely to be accompanied by some very optimistic statements about their potential impact on people, places and businesses.
However, the business support evidence review shows that while the overall effect of such interventions is positive, the size of the effect is relatively small for each individual firm.
The third crucial policy issue to emerge from our review is that there is a clear and urgent need for better-quality evaluations, especially in areas of business support policy where it is still unclear what works best: the level at which programmes are delivered, the respective roles of the public and private sectors and, in particular, their cost-effectiveness.
So where do we go from here? Although many local authorities are really feeling the pinch, local flexibility that allows for greater experimentation does provide an opportunity to undertake such evaluations. We are starting conversations with a number of LEPs and local authorities on how the What Works Centre can assist.
If you are interested in helping us experiment in this area and improve our understanding of what works, please don’t hesitate to get in touch with the Centre.