SOLUTIONS TO EVALUATION CHALLENGES
In Part One of this article series we discussed six key challenges that can arise when conducting evaluations in the research world. Evaluation work is tough: mistakes can be made, complications can arise, and the challenges can feel overwhelming. Here we discuss solutions to the challenges we highlighted, which we hope will guide you towards more effective evaluations.
Challenge 1: Poor Planning
Solution: Always have an Action Plan.
The evaluation plan should be flexible, so that adjustments can be made at any time within the context of the work plan to account for issues that arise during the evaluation process. It is essential to understand the rationale behind developing an evaluation plan, the key elements it should include, and the steps required to develop it. Below, we break these elements down into steps for clarity.
• Identifying evaluation purpose and expectations. This involves a meeting with the client to build consensus on the survey's purpose, the main expectations, the links with various stakeholders, the literature review documents, and the mode of operation.
• Identifying programme/project goals, objectives, and inputs. In the planning stage, it is essential to understand the project's goals, objectives, and relevant activities. Below are definitions of each.
o Goals: the final impacts on the lives of the beneficiaries or the environment that the project intends to achieve.
o Objectives: the longer-term changes in the environment or the behaviour of project beneficiaries that are needed to achieve the overall goal.
o Activities/Inputs: the direct interventions and processes of the project.
• Identifying key players. This involves identifying the key internal and external stakeholders involved in the project. These include the project team, donors, stakeholders in the wider community (community groups, networks, residents, etc.), partner organisations, local and national policy makers, other government bodies/ministries, and the project beneficiaries.
• Developing key evaluation questions. These questions are developed by the evaluation experts with input from the client and partners. Answering them determines the data the Consultants will collect to draw conclusions and make recommendations. By identifying these questions early in the process, the Consultants can design the tools, instruments, and methodologies required to gather the needed information. The questions may, however, require periodic revision depending on the status of the project.
• Cost estimates for the evaluation activities. It is essential to review the budget already submitted in case it needs revision. Ensure all elements are accounted for, such as personnel, capacity development, infrastructure, logistics, taxes, etc.
• Understanding the overall context. The Consultants should understand the political and administrative structures of the project areas, along with the roles and influences of existing policies that may affect the evaluation process.
• Possible limitations and risk management. The Consultants will need to brainstorm the potential risks and unexpected circumstances that might arise during the assignment, for example, inadequate feedback from key beneficiaries, COVID-19 restrictions, or difficulty in reaching key stakeholders.
• Workplan. An evaluation work plan involves the development of clear timeframes, deliverables, and milestones. It should state who is accountable for different phases and activities of the evaluation and include risk management strategies and flexibility to deal with unforeseen events without compromising the timeframe or methodology.
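As a rough sketch of the workplan step above, the timeframes, deliverables, and accountable owners it describes can be captured in a simple structure. All names, dates, and milestones below are hypothetical, chosen purely for illustration:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Milestone:
    name: str
    owner: str          # who is accountable for this phase/activity
    deadline: date
    done: bool = False

@dataclass
class EvaluationWorkplan:
    title: str
    milestones: list = field(default_factory=list)

    def overdue(self, today: date):
        """Milestones past their deadline and not yet delivered."""
        return [m for m in self.milestones if not m.done and m.deadline < today]

# Hypothetical example plan
plan = EvaluationWorkplan("Mid-term evaluation")
plan.milestones = [
    Milestone("Inception report", "Lead Consultant", date(2023, 2, 1), done=True),
    Milestone("Data collection", "Field team", date(2023, 3, 15)),
    Milestone("Draft report", "Lead Consultant", date(2023, 4, 30)),
]
print([m.name for m in plan.overdue(date(2023, 4, 1))])  # → ['Data collection']
```

Tracking accountability and deadlines explicitly like this makes it easy to spot slippage early, which is exactly the flexibility the workplan is meant to provide.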
Challenge 2: Ineffective Approaches
Solution: Choose the Right Approach
Evaluation approaches address specific evaluation questions, processes, and challenges. Before starting an evaluation, research which approach is the right one for it. Evaluations are not one-size-fits-all; different approaches work for different processes. Some of the approaches used in evaluations are listed below.
• Lesson Learnt Approach (LLA). LLA is a strengths-based approach designed to support ongoing learning and adaptation by identifying and investigating outlier examples of good practice and ways of increasing their frequency to achieve desired results.
• Beneficiary Assessment. An approach that focuses on assessing the value of an intervention as perceived by the (intended) beneficiaries, thereby aiming to give voice to their priorities and concerns.
• Case study. A research design that focuses on understanding a unit (person, site, or project) in its context, which can use a combination of qualitative and quantitative data.
• Human Rights Based Approach (HRBA). HRBA is directed towards promoting and protecting human rights, based on international human rights standards. It puts human rights and corresponding state obligations at the heart of the evaluation.
• Most Significant Change (MSC). MSC approach is primarily intended to clarify differences in values among stakeholders by collecting and collectively analysing personal accounts of change.
• Outcome Harvesting (OH). OH is an impact evaluation approach suitable for retrospectively identifying emergent impacts by collecting evidence of what has changed and then, working backwards, determining whether and how an intervention contributed to these changes.
• Outcome Mapping. An impact evaluation approach which unpacks an initiative’s theory of change, provides a framework to collect data on immediate, basic changes that lead to longer, more transformative change, and allows for the assessment of the initiative’s contribution to results.
• Participatory Approach. A range of approaches that engage stakeholders (especially intended beneficiaries) in conducting the evaluation and/or making decisions about the evaluation.
Challenge 3: Key Evaluation Questions (KEQs)
Solution: Involve Key Stakeholders in Developing the KEQs
Key Evaluation Questions (KEQs) are the high-level questions that an evaluation is designed to answer. Having an agreed set of KEQs makes it easier to decide what data to collect, how to analyze it, and how to report it. KEQs need to be developed and agreed on at the beginning of evaluation planning. The questions should also address context, reasons for adaptation and emergence of activities and outcomes, and the different perspectives and inter-relationships that affect project success, sustainability, and transferability. A maximum of 5-7 main questions is sufficient. It is also useful to have some more specific questions under the KEQs. KEQs should be developed by considering the type of evaluation being done, its intended users, its intended uses (purposes), and the evaluative criteria being used. For example, the OECD-DAC criteria of relevance, efficiency, effectiveness, and impact of the programme provide a good starting framework for a range of initiatives in development areas (health, natural resource management, community resilience, etc.).
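To make the guidance above concrete, the agreed KEQs can be recorded alongside the evaluative criterion each one addresses, and the 5-7 question limit checked mechanically. The questions below are invented examples, not KEQs from any real evaluation:

```python
# The four OECD-DAC criteria named in the text.
OECD_DAC = {"relevance", "efficiency", "effectiveness", "impact"}

# Hypothetical KEQs, each mapped to the criterion it addresses.
keqs = {
    "To what extent did the programme meet beneficiaries' needs?": "relevance",
    "Were resources used economically to deliver the outputs?": "efficiency",
    "Were the intended outcomes achieved?": "effectiveness",
    "What lasting changes did the programme produce?": "impact",
}

# Guideline from the text: a maximum of 5-7 main questions is sufficient.
assert len(keqs) <= 7, "Too many KEQs; narrow the focus"
# Every question should map to an agreed criterion.
assert set(keqs.values()) <= OECD_DAC, "KEQ mapped to an unknown criterion"
print(f"{len(keqs)} KEQs covering {sorted(set(keqs.values()))}")
```

Keeping the question-to-criterion mapping explicit helps stakeholders see at a glance whether any agreed criterion is left unaddressed.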
Challenge 4: Data Collected
Solution: Ensuring Accurate and Appropriate Data Collection
If the aim of data collection is to measure something precisely or to gain large-scale statistical insights, collect quantitative data. Depending on the KEQs formulated, the Consultants will collect quantitative data, qualitative data, or both. Quantitative data is expressed in numbers and graphs and is analyzed through statistical methods, while qualitative data is expressed in words and analyzed through interpretation and categorization. It is also important to ensure the integrity of the data collection process and of the data collected. Quality assurance and quality control are two approaches that can preserve data integrity and ensure the scientific validity of the evaluation results.
• Quality assurance. An important component of quality assurance is developing a rigorous and detailed data collector’s recruitment and training plan. Implicit in training is the need to effectively communicate the value of accurate data collection to trainees. The training aspect is particularly important to address the potential problem of staff who may unintentionally deviate from the original protocol.
• Quality control. Quality control identifies the responses or actions necessary to correct faulty data collection practices and minimize future occurrences. The Consultants should communicate the data collection procedures clearly, along with the steps necessary to minimize the recurrence of faulty data.
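The quantitative/qualitative distinction described above can be illustrated with a minimal sketch using only the Python standard library. The survey values and theme codes below are invented for demonstration:

```python
import statistics
from collections import Counter

# Quantitative data: numbers analysed with statistical methods.
# Hypothetical numeric survey responses (e.g. household income scores).
incomes = [120, 95, 140, 110, 160, 105]
print(statistics.mean(incomes))   # central tendency
print(statistics.stdev(incomes))  # spread

# Qualitative data: words analysed through interpretation and categorisation.
# Here, interview excerpts have already been coded into themes by an analyst.
coded_responses = ["access", "cost", "access", "quality", "cost", "access"]
print(Counter(coded_responses).most_common())
# → [('access', 3), ('cost', 2), ('quality', 1)]
```

The point of the sketch is the contrast: the numeric series supports summary statistics directly, while the qualitative material only becomes countable after a human coding step, which is where the quality assurance and training concerns above matter most.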
Challenge 5: Participation of Key Stakeholders
Solution: Understand and Engage Stakeholders
Engaging stakeholders is important for understanding and considering their priorities and concerns. This informs evaluation planning and communication strategies during and after the evaluation, and supports the utilisation of evaluation findings. It is important to understand different perspectives on what will be considered credible evidence of outcomes and impacts. Involving stakeholders during evaluation implementation can add value by:
• Providing perspectives on what will be considered a credible, high quality and useful evaluation.
• Contributing to the program logic and framing of key evaluation questions.
• Facilitating quality data collection.
• Helping to make sense of the data that has been collected.
• Increasing the utilization of the evaluation’s findings by building knowledge about and support for the evaluation.
Challenge 6: COVID-19 Pandemic
Solution: United Nations Guidance Note Towards Evaluations During the COVID-19 Pandemic
The United Nations Guidance Note on evaluations during the COVID-19 pandemic contains a decision matrix of specific actions, based on the stage of the evaluation, that shows the Consulting team how to adjust ongoing and planned evaluations to COVID-19. The following points summarize the guidance note:
• Consultants will ensure a do no harm approach.
• Consultants will adhere to Humanitarian Ethical Guidelines for Evaluations.
• Consultants will safeguard the quality standards of evaluative work including representation of all stakeholder groups, leaving no one behind.
• The evaluation will only be undertaken when there is a clear plan for utility.
SDS has identified some options for remote data collection and analysis methods, as well as their limitations and specific requirements. These options include:
• Surveys via mobile phone, email, or online tools
• Online discussion platforms
• Video calls
• Remote observation via online feed or video footage
• Web scraping
• Web search data analysis
• Crowdsourcing