Unintended effects of international cooperation at a glance
In the field of international cooperation, whether programs focus on governance, food security, livelihoods, peacebuilding or gender, they share a common deficiency: capturing unintended effects. Yet almost all programs inevitably generate some. Indeed, fewer than 30 reports on international organisations over the last ten years examined unintended effects, and only 15% of USAID and 26% of NORAD evaluations identified any.
How can we account for unintended effects and, more importantly, avoid negative ones and maximize positive ones? When the Dutch Ministry of Foreign Affairs, in partnership with Radboud University, launched a call for papers in September 2016 for a conference on the unintended effects of international cooperation, we (Search for Common Ground) decided it was a unique opportunity to look back at our evaluations and examine the unintended consequences of our peacebuilding programs!
As a result, we conducted a meta-evaluation of our programs from 2013 to 2016 and presented our findings at The Hague on January 16th and 17th (see the infographic below, and watch this space for the final paper, to be published soon) to an enthusiastic crowd of policy-makers, academics, practitioners, evaluation experts, and journalists.
The first panel discussed the absence of unintended effects from evaluations and what could be done about it. One might argue that this is the result of the political nature of evaluation, and that we should first ask who commissions the evaluation or who funds the program. However, it was also agreed that underdeveloped methodologies, coupled with limited interest in, and underestimation of the relevance of, capturing what could hinder the successful achievement of a project, might be the root of the problem. Others attributed it to the absence of human-oriented approaches in development and recommended emphasising the human factor both within the scope of an intervention and outside it. It is true that common practice invites evaluators and funders to concentrate on the outputs and outcomes defined in the project's Theory of Change and Logical Framework and, ultimately, to weigh them against the OECD-DAC criteria of relevance, effectiveness, efficiency, sustainability and impact. However, by focusing on the expected, we often miss the unexpected.
Our premise was that the peacebuilding sector is no exception. Given the nature of the field and the fast-evolving contexts in which we operate, we have no choice but to respond to challenges quickly and early while ensuring conflict sensitivity. In practical terms, this means developing innovative monitoring and evaluation techniques and approaches to understand the relationship between impact and complex contexts.
The second panel debated whose unintended effects we should be looking for. In other words, beyond the target group, how far should we (or can we) go in identifying unintended effects on secondary beneficiaries or intermediaries? Unsurprisingly, most evaluations were found to focus exclusively on primary beneficiaries. However, a participant from the private sector described how ethnographic immersion of their staff supported the monitoring of their Corporate Social Responsibility programs and helped minimize the effects of what they called "the black box of development" — in other words, the unplanned. Additionally, very few studies paid attention to intermediaries — all the actors along the aid chain, from peacekeepers and local aid workers to diplomats — and how interventions affect them. Overall, qualitative methodologies such as outcome mapping/harvesting, network and stakeholder analysis, the critical incident technique, and use of the Theory of Change as a flexible, evolving evaluation tool were favored. Finally, robust monitoring will more readily capture these unintended effects so that organisations can either capitalize on them or deal with them.
Finally, the third panel looked at the impact of aid and trade agendas on the unintended effects debate. When governments advance 'mutual interest' development policies, to what extent do these agreements really benefit recipients, and how do governments account for their policies' unintended effects? It was agreed that, judging from previous experience, priorities primarily reflected the funder's political, economic or other strategic interests, which almost de facto generate unintended effects — often negative, and sometimes greater than the predicted added value of the original program goal.
However, conclusions about the perceived level of flexibility between funders, practitioners, and aid recipients (or participants) in programming were far from unanimous. Some conference attendees still experience a strong disconnect between unintended effects and programming, and argued that policy-makers and donors lack interest in capturing them. On the other hand, the mere existence of this conference, and of other platforms such as the USAID Learning Lab that encourage budgeting for monitoring and reflection, illustrates a growing interest from "the top" in aligning with what works for the "bottom". While one's view of unintended effects will depend on one's place in the system (funder, project manager or evaluator), demonstrating what works or what doesn't, and why, backed by facts and evidence, tends to make room for flexibility and understanding.
To conclude, it seems that the importance of capturing unintended effects has already won the hearts and minds of conference attendees; however, there is still work to do to gain buy-in from, and raise awareness among, the different actors of international cooperation, as well as to shift organizational cultures. Closing remarks from the academic advisor to the Ministry of Foreign Affairs suggested isolating the topics of unintended effects deemed most relevant and beginning research on them. I would go even further and propose an online platform, following the model of the British Cabinet Office's What Works Network, that could reference data on unintended effects across the field of international cooperation.
Melanie Pinet is Design, Monitoring & Evaluation Associate for East and Southern Africa at Search for Common Ground. Melanie tweets at @Melanie_Pinet.