James Georgalakis, Director of Communications and Impact at the Institute of Development Studies, introduces a new collection of pieces on knowledge for development
If knowledge for development is a social process, why do we continue to expect technical approaches alone, such as research methods, websites and policy briefs, to get evidence into action?
While it has been easy to share significant successes in getting research into action through impact awards and case studies, it has proved much harder to institutionalise any learning from these. Put simply, the development sector has continued to struggle to turn research into action. This challenge inspired my work with Ben Ramalingam (IDS), Nasreen Jessani (Johns Hopkins School of Public Health) and Rose Oronje (African Institute for Development Policy) on a collection of peer-reviewed articles about research impact, which is published today. We wanted to address the gap between generic impact toolkits and conceptual frameworks on the one hand, and academic studies of research impact on the other. So we commissioned evidenced case studies from those at the research-to-policy front line to provide practical learning for anyone seeking to make better use of evidence. What we learnt has implications for how you fund research, manage knowledge institutionally and promote its use in development policy.
A social process not a technical one
We spoke to NGOs seeking to inform programmes with learning, researchers attempting to influence policy with their studies, and donors with requirements around assessing the impact of their grants. A common thread running through the resulting articles is the importance of social relationships in leveraging evidence into action. In short, our authors say that networks, individual relationships, cultural norms and politics hold the key to research impact. This is striking given that efforts to maximise research impact often treat evidence-informed policy as a largely technical and technocratic issue.
The barriers to the use of evidence in policy and practice are frequently described in terms of poor capability in key technical areas. For example, if researchers could just communicate more clearly (write policy briefs, or take advantage of snazzy digital knowledge platforms), everything would get better. On the demand side, if policy actors were more proficient in the assessment and use of evidence, then research uptake would increase. These ideas have led to a sort of fetishisation of research communications products like the now ubiquitous policy brief. Get your research into the perfect brief (or press release, or data visualisation) and in front of the right policy maker on the right day, or so the narrative goes, and impact will follow. Given that my background is in policy communications, you might think this is all music to my ears. A career beckons in teaching academics to write in plain English and use Twitter. Job done.
However, the use of research in the sphere of public policy is an extraordinarily complex phenomenon: research is only one part of a complicated process that also draws on experience, political insight, pressure, social technologies and judgment. In international development, as in other spheres of public policy, decisions are likely to be pragmatic and shaped by their political and institutional circumstances rather than rational and determined exclusively by research. I admit there is nothing terribly original in pointing this out. The contextualisation of knowledge is well covered by a dizzying array of toolkits and impact guides for researchers. What is most striking about the treatment of this subject in our collection is that in almost all of the case studies, contextualising knowledge hinges on the navigation of power dynamics through personal contacts and informal networks.
It’s people not policy briefs that make the difference
In describing his project’s success in influencing national policy in Sierra Leone to support vulnerable children, ESRC-DFID Joint Fund for Poverty Alleviation Research Programme grant-holder Mike Wessells argues that action research methodology would have been inadequate without the pivotal role played by UNICEF. What he describes is a process that incorporates both a networked approach to social relations and the very individualised dimension of a key personal relationship. It was the research team’s close working relationship with one particular UNICEF staff member that enabled them to navigate the tricky domestic political territory. This is contextualisation built on personal relationships and not on generic stakeholder mapping exercises conducted in workshops. Or as Mike puts it: “researchers who want to have a significant impact on policy should identify and cultivate a positive relationship with a well-positioned person who can serve as both a power broker and a trusted advisor.”
Meanwhile, MSF shared its challenges in bridging its medical research and academic work with local innovation. There was no obvious means of channelling or brokering new knowledge between these groups, and vital new understandings, such as the correct storage of insulin in the field, simply did not get translated into new practice on the ground. In the end brokerage was institutionalised through new ‘scientific days’ that brought researchers and innovators together in a safe social space for mutual learning. These are the organisational cultural contexts and social norms which shape knowledge systems.
If there is one key message that you take away from this collection, I hope it will be that research-to-policy processes are largely social. Technical capacities matter of course (I do enjoy helping researchers use social media and write for policy audiences) but not nearly as much as the social factors. What we realised as we examined the case studies and think pieces we had commissioned is that there is a deeper set of layers to the social realities of knowledge for development. These social factors are: (i) the capacity of individuals and organisations, in terms of knowledge and skills, to engage in policy processes; (ii) individual relationships that facilitate influence and knowledge brokerage; (iii) networked relationships and group dynamics that connect the supply of knowledge with the demand for it; and (iv) social and political context, culture and norms.
How can we create a more enabling environment for social and interactive knowledge exchange?
Despite this social reality, we do not organise or fund our institutions, whether university faculties, NGOs or consultancies, to nurture this social use of science. Academics often move on, taking their contacts with them. INGOs flip-flop between policy and programme priorities, and donors struggle to fund cross-sector collaborations. This is in huge contrast to the private sector: lobbying firms send a junior staffer to every meeting with a key client to ensure continuity; the hedge fund invests heavily in developing key relationships; and the supermarket buyer carefully establishes close personal relationships with suppliers. These examples may sound incongruous with the development sector, but in the health sector at least there are examples of strategies for utilising relationships to leverage evidence into policy.
An understanding of knowledge systems as fundamentally social has profound implications for the current predominance of technical approaches to evidence-informed development. Unless we are more cognisant of these social realities when designing and implementing programmes, we will never escape the frustration shared by donors, researchers and practitioners at how hard it is to turn evidence into action.
These issues are the subject of a high-level panel debate, broadcast online today at 1600 GMT, co-hosted by IDS, ODI and IIED.
The Social Realities of Knowledge for Development, published by the ESRC-DFID Impact Initiative for International Development Research, can be downloaded for free.