8th July 2014
LSE’s recent JSRP conference on politics and evidence in international development highlighted problems inherent in the ways ‘evidence’ tends to be defined and used in mainstream development discourse and practice. How can Evidence for Development’s individual household method (IHM) and current involvement in the ‘Assessing Rural Transformations’ project inform the discussion and help to oil the relationship between evidence and policy?
Last Tuesday’s LSE Justice and Security Research Programme (JSRP) conference ‘Can Politics and Evidence Work Together in International Development?’ featured many insights that are highly relevant to our work at Evidence for Development (EfD).
Critiques of the ways that ‘evidence’ tends to be defined and utilised in mainstream development discourse and practice ran through the conference. The tension between wisdom derived from ground-level work and constraints imposed by donors was highlighted – for example, the need to pre-define programmes and deliver neat results, despite the fact that effective development often requires levels of ‘messiness and hybridity’. Another interesting point raised was the difficulty of applying scientific methods to practice in social settings. For example, the use of randomised controlled trials in development practice was widely criticised because no two social settings are perfectly alike. Nevertheless, as was generally recognised, and explicitly pointed out by Steven Rood from The Asia Foundation, evidence is crucial in motivating political action and in ensuring that development interventions achieve the desired results, or at the very least don’t make things worse.
So we should not turn our backs on evidence, but rather find ways of improving the quality of information collected and the integrity of the methods used to assemble and analyse it. Here, it seems, organisations explicitly concerned with improving data-collection methods, such as Evidence for Development, can play a key role.
The day after the JSRP conference, EfD hosted one of the regular Assessing Rural Transformations (ART) consortium meetings, with project partners from the University of Bath, Self Help Africa (SHA) and Farm Africa. This ESRC/DFID-funded project is piloting new approaches to assessing the impact of development projects on rural communities in Ethiopia and Malawi. Experiences from the ART project are relevant to key themes discussed at the conference, particularly the ways evidence is conceptualised and subsequently used.
For example, criticism of the meaning given to the word ‘evidence’ came up throughout the conference. Rosalind Eyben (Institute of Development Studies) pointed out that ‘evidence’ is frequently used without a reflexive understanding of the contextual factors which shape the way it is understood and utilised. A consequence of this, which Duncan Green (Oxfam) termed the ‘delusion of data’, is a tendency to prioritise hard, numerical data over other, ‘softer’ forms of evidence that may be less easy to quantify but are essential to understanding the social realm. Evidence for Development’s individual household method (IHM) helps to break down this implicit hierarchy by gathering both quantitative and qualitative data from households and providing easy-to-use analytical tools. The ART project builds on this by combining IHM studies with data gathered through qualitative impact protocol (QUIP) interviews with non-primed but self-reporting beneficiaries, to facilitate ‘good enough’ assessment of project impacts. Such an approach allows quantitative and qualitative data to complement each other and inform a richer, more useful analysis.
One case which clearly illustrates the benefit of having both quantitative and qualitative data concerned time-frame issues in SHA’s goat rotation programme in Masumbankhunda, Malawi. This is a fairly typical NGO project, in which families are given goats and then required to pass on offspring to other families in the village. The full impacts of programmes of this kind are not realised within the 2-4 year time-frame of most impact studies (including that of the ART project), so quantitative impact data will inevitably be lacking on some aspects of the project. Nevertheless, the QUIPs generated rich information about the goat rotation programme. Factors such as confidence in future improvements and feelings of community solidarity fostered by the scheme can be drawn from this information and used to enrich conclusions about impact.
Another important issue raised at the conference was the difficulty of generating genuine ‘evidence-based policy’ rather than ‘policy-based evidence’, whereby organisational politics influence initial decisions about the evidence required and thus pre-determine the analysis of a given situation. The IHM addresses this issue by providing detailed, holistic data on the entire economy of individual households, including connections with other households through gifts and transfers, rather than focusing on pre-selected factors which reflect the interested organisation’s theory of change (although specific areas of interest can be explored in greater depth through supplementary questions). The QUIPs used in the ART project also self-consciously address this issue by requiring that interviews are conducted by independent researchers who have not been informed of specific impact hypotheses or the theory of change.
The ART project faces the same logistical constraints – limited time-frames and funding, for example – that featured throughout the JSRP conference and are common to most development (and non-development) initiatives. Nevertheless, the ART meeting highlighted the need to give absolute priority to the quality of evidence collected. This is not a desirable extra; it is a precondition for the design of effective programmes and an essential means of monitoring impact. Organisations such as EfD, and projects such as ART, which take the integrity of data as a primary focus therefore seem an essential part of oiling the relationship between policy and evidence in international development.
Categories: Data, IHM, Malawi, Monitoring and evaluation, Policy and programme design