The following is a cross-post from a discussion on the importance of asking the "right" evaluation questions on BetterEvaluation.org. Please let me know your thoughts!
52 weeks of BetterEvaluation: Week 28: Framing an evaluation: the importance of asking the right questions
BetterEvaluation recently published a paper presenting some of the confusion that can result when commissioners and evaluators don’t spend enough time establishing basic principles and a shared understanding before beginning an evaluation. This blog, from Mathias Kjaer of Social Impact (SI), uses a recent evaluation experience in the Philippines to present some tips on how to choose the right questions to frame an evaluation.
Framing an evaluation: The importance of asking the right questions
This blog post deals with a challenge common to both commissioners and implementers of evaluations: deciding which and how many evaluation questions to ask. Both parties have an interest in ensuring that the evaluation provides sufficient breadth and depth of information. However, both parties are also constrained by finite resources and time. The process of refining and targeting the evaluation questions inevitably involves a delicate balance between collecting as much information as possible and being realistic about which questions can be answered definitively and completely. Ultimately, the questions chosen will dictate the design and direction of the evaluation.
Here is an example from our recent experience evaluating the “Growth with Equity in Mindanao III” (GEM-3) program for USAID/Philippines. As the largest and most diverse program carried out in Mindanao, accounting for over 60% of total mission funding directed towards Mindanao, the evaluation was of interest to a wide range of stakeholders inside and outside of the United States and Philippine Governments. The program included five distinct programming components (infrastructure development; workforce preparation; business growth; governance improvement; and former combatant reintegration) and two cross-cutting components (communications and public relations; and support services). This meant that the commissioner of the evaluation, USAID/Philippines, needed to consult with a variety of internal stakeholders from different technical offices while drafting the evaluation scope of work. What resulted was a list of 54 evaluation questions that the evaluation team was asked to answer during six weeks of field work (we’ve included the list here if you are interested)!
While drafting our evaluation design for the original proposal, our team noticed that some of the evaluation questions were really just permutations of larger questions, and we were therefore able to relatively quickly reduce the list to 21 major evaluation questions - still five to six times larger than what we would ideally have hoped for. Fearing that further refinement might be interpreted as unresponsiveness to the Request for Proposals (RFP), we suggested that our evaluation team work with relevant USAID/Philippines staff to further reduce the number of questions following award and the team’s initial document review. Unfortunately, due to an administrative error, our technical approach was never attached to the final contract; only the original SOW from the RFP was included. As a result, the team was told during its in-brief and subsequent discussions that it would need to answer all 54 original questions in order to be compliant with the signed contract.

In a particularly innovative move, SI negotiated with USAID to develop a video component of the evaluation to be used for future USAID evaluation trainings. SI partnered with Quimera TV to produce a video that helps convey some of the challenges the team faced in trying to answer this extensive list of evaluation questions, as well as some other data collection challenges common to such evaluations. We hope that the video will serve as a reminder to commissioners and implementers to take the time needed at the beginning of an evaluation to make sure that the evaluation questions are focused and prioritized. This will undoubtedly lead to a better and more informative final evaluation report. Watch the video below or on YouTube here.
Eight tips for good evaluation questions:
USAID Checklist for Defining Evaluations Questions. Available at http://usaidlearninglab.org/library/checklist-defining-evaluation-questions.
SI is an international development consulting firm based near Washington, D.C., helping international agencies, civil society, and governments become more effective agents of positive social and economic change. We offer a suite of services ranging from Program Strategy and Design to Capacity Building and Facilitation and Gender and Social Analysis, but we are probably best known for our Monitoring and Evaluation work. We currently provide the main M&E training for USAID and the Department of State and hold 13 performance-management-related IQCs as a prime with USAID, the Department of State, USDA, MCC, DFID, and others.
For additional information on our GEM-3 evaluation or SI in general, please feel free to contact: Mathias Kjaer, Program Manager, at email@example.com.
Great tips - thanks for sharing! Would also be interested to hear how the process of answering all 54 questions worked out in practice.
Thanks for the eight tips provided in the article. As a new practitioner, I think the tips clarified many issues I have had (so far only theoretically and academically) with the interview design process, and I will be referring to them when designing future evaluation interviews. They also have the potential to be a quick and effective resource to share with evaluation teams when constructing interviews. The tips will no doubt help when issues of clarity, question count, and question complexity arise.
I found this article extremely useful and appreciated the breakdown of eight helpful tips for achieving well-rounded evaluation questions. I’m currently a graduate student in a Reflective Practice and Evaluation course, and I just completed an assignment on data collection and analysis via interviewing. I used Raymond Gorden’s Basic Interviewing Skills text to guide my assignment, which consisted of reflection on the development and execution of an interview process, and I found several parallels with the processes described in this article. While conducting my recent interview, I found the process more manageable when utilizing Rubin and Rubin’s tree-to-branch approach (The Art of Hearing Data), a way of structuring questions and sub-questions to provide depth of analysis. In the tree-to-branch approach, the researcher divides the topic into roughly equal parts and plans to cover each part with a main question (a branch). Each main question is prepared around significant identified issues and followed up to obtain the same degree of depth. This approach could help practitioners limit the number of main questions.
In order to link questions clearly to the evaluation purpose, Gorden would argue that you must first clearly define goals and objectives prior to writing the evaluation questions. This will aid in narrowing your scope and purpose, as well as in transitioning smoothly from question to question.
Thank you for your insightful post. I've also recently completed an interviewing assignment for a graduate course and can see how your eight tips for creating relevant questions would be useful, along with Gorden's Basic Interviewing Skills book. I've found that taking the time to develop precise goals and objectives for an interview, which is what you seem to be referencing in your second tip, is essential to generating useful responses and instrumental in keeping the interview on track. There were several occasions during my interview when I was given a response I was not anticipating, creating the need to think of additional or alternative questions on the spot. Listening to the transcript, I realized that some of the impromptu questions were phrased in a way that wouldn't produce a relevant answer, most likely because I did not take the time to consider my objectives when I asked them. Your sixth tip also resonates with me, as I am in the process of designing an evaluation for a project proposal. I plan on including participant interviews in the evaluation design and need to be sure that I am designing questions that generate information that stakeholders would find valuable.