Community Driven Development: Howard White and Radhika Menon respond to Scott Guggenheim

July 11, 2018

By Duncan Green

Howard White and Radhika Menon respond to Scott Guggenheim’s recent post on Community Driven Development

Evaluations have two functions: lesson learning and accountability. We believe that our report on community-driven development offers useful lessons for programme managers, practitioners and researchers. Although we have already posted a blog response to earlier comments, the critical backlash continues. This is disappointing – especially as our ‘critics’ use the very arguments we make in the report against us. We agree with virtually every point in Scott Guggenheim’s blog, though he writes as if we are at loggerheads. So we would like to highlight the important lessons, which are mostly areas of agreement, and set the record straight on perceived shortcomings in our report.

Our review is not a critique of CDD. And we do not say “CDD does not work”. We find that CDD has been enormously successful in delivering small scale infrastructure. We report figures on the thousands of facilities built which serve thousands of people in many countries. We do say that there is insufficient evidence on cost effectiveness. So, this is an area for further research. We do not make a recommendation to switch to local government procurement.

The positive effect on infrastructure has not, however, always translated into improvements in higher-order social outcomes. Meta-analysis of impact evaluation results shows that CDD programmes have a weak effect on health outcomes and mostly insignificant effects on education and other welfare outcomes. While these are the overall findings, the report identifies and unpacks instances where there are positive effects. For example, girls’ enrolment increased in Afghanistan despite education being a very small share of overall investments. We attribute the increase to changing gender norms supported by the National Solidarity Programme (NSP). Our explanations chime with what Scott has to say, but somehow his blog makes it seem that we disagree with him.

On gender norms, the impact evaluation of the second phase of NSP in Afghanistan did indeed find ‘mixed effects’. Men’s acceptance of women in leadership at local and national levels had increased, as had women’s participation in local governance. However, NSP did not lead to any change in men’s attitudes towards women’s economic and social participation or girls’ education. Scott’s blog reports only the first finding, not the second, in arguing that we are wrong to call the evidence mixed.

In any case, our statements in the report are based on reviewing the evidence not from just one case but from over twenty CDD programmes, where success in promoting inclusiveness is certainly mixed. We argue that programme designers can learn from where there has been more success, as in Indonesia.

Our examination of variations in impact on education outcomes is an example of how we analyse heterogeneity. Heterogeneity is the friend of meta-analysis, not its enemy: meta-analysis allows us to explore the variations in design and context that explain programme performance. But we have been criticised for including a range of programmes, especially social funds. Such criticism fails to recognise how social funds evolved – those in Malawi and Zambia came to be run using a very CDD approach. An external agency may be involved in vetting community decisions or choosing between competing proposals; that is one of the CDD design variations. But all programmes – not just social funds – imposed some limitations on the use of the funds.
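For readers less familiar with the mechanics, the sketch below illustrates what exploring heterogeneity in a meta-analysis involves: pooling study effects, measuring how much they vary beyond sampling error (Cochran’s Q and I²), and then splitting studies by a design feature to see whether it explains the variation. The effect sizes and the ‘single-shot’ versus ‘multi-year’ grouping are purely hypothetical, for illustration only – they are not figures from our report.

```python
import numpy as np

# Hypothetical standardised effect sizes and standard errors (illustration only)
effects = np.array([0.05, 0.12, -0.02, 0.20, 0.01, 0.15])
ses = np.array([0.04, 0.06, 0.05, 0.08, 0.03, 0.07])
design = np.array(["single-shot", "multi-year", "single-shot",
                   "multi-year", "single-shot", "multi-year"])

def pooled_effect(e, se):
    # Inverse-variance (fixed-effect) pooling: a precision-weighted mean.
    w = 1.0 / se**2
    est = np.sum(w * e) / np.sum(w)
    return est, np.sqrt(1.0 / np.sum(w))

overall, overall_se = pooled_effect(effects, ses)
print(f"Overall effect: {overall:.3f} (SE {overall_se:.3f})")

# Cochran's Q and I^2: how much do effects vary beyond sampling error?
w = 1.0 / ses**2
Q = float(np.sum(w * (effects - overall) ** 2))
df = len(effects) - 1
I2 = max(0.0, (Q - df) / Q) * 100.0
print(f"Q = {Q:.2f} on {df} df, I^2 = {I2:.0f}%")

# Subgroup analysis: pool within each design group to see whether the
# design feature accounts for the heterogeneity.
for grp in np.unique(design):
    sub, sub_se = pooled_effect(effects[design == grp], ses[design == grp])
    print(f"{grp}: {sub:.3f} (SE {sub_se:.3f})")
```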

[Figure: estimated effects on social cohesion, by country]

Our review finds that CDD has no impact on social cohesion (see the figure above). There is no heterogeneity there: the lack of effect on social cohesion is consistent across contexts. It is in building social cohesion that CDD has not worked. This is where meta-analysis is so useful, as it clearly illustrates the consistency of this finding. Scott’s blog concedes this point, so we are on the same page as far as these findings are concerned – which is not the impression you get from reading his post.

As we say in the report, the lack of impact on social cohesion is not a new finding. Indeed, one of us was a co-author of the 2002 OED review of social funds – including the CDD-like Malawi and Zambia funds – which reported no impact on social capital, as did the OED CDD report three years later. Our review confirms this finding, now with additional evidence from high-quality impact evaluations.

One issue for further research that we did not flag, and should have, is the long-run effect on governance of the longer-running programmes. We do distinguish between programmes which make a multi-year, multi-project investment in communities and those with ‘single-shot’ designs. It is plausible that the former, like the Kecamatan Development Programme in Indonesia and Kalahi-CIDSS in the Philippines, would have a larger impact. But there is no evidence of this. We can say only that there may be a longer-run impact, which further research could assess.

Reviews aim to be comprehensive, but they have inclusion criteria, so some studies that people think should be included get excluded. What matters is that we use relevant evidence to analyse whether programmes have worked, and delve deeper into questions of what they have worked for and why they have worked or not. People who have worked on specific CDD programmes have a richer viewpoint, but also a more restricted one. Scott, too, presents research findings from contexts he is associated with. This does not mean that we discount what he has to say, but it has to be put into the bigger picture.

Reviews do have their methodological limitations and evaluations can have different findings across contexts. The answers are not straightforward; they are often nuanced. Our plea to the development community would be to resist getting into ‘My study is right’ and ‘Your study is wrong’ debates and spend more time in constructive conversations about using evidence to inform programmes that can improve lives. Yes, we say CDD doesn’t build social cohesion. But we don’t say CDD doesn’t work – the answer depends on the outcome you are looking at. And there have been substantial variations in CDD’s effectiveness, so let’s learn from the variations to design better programmes.

For those interested, we also recommend reading the full report (rather than the brief, which provides just a summary that some have reacted to).
