Welcome to your Evaluation Resource Library! Whether you're in the midst of evaluation work or gearing up for it, this curated collection is designed just for you. We've crafted it in the form of an evaluation pathway, offering a treasure trove of resources to support you every step of the way.

Explore our pathway to discover resources covering everything from managing evaluations to mastering data collection, engaging with service users, and effectively communicating and acting on findings.

Feeling stuck? Unsure where to start? Simply click on the tab that resonates with your current needs, and unlock a wealth of resources tailored to address your challenges.

And if you still find yourself in need of personalized assistance, please check our webinars or book a drop-in clinic session with Professor Andy Jones, a public health academic with many years of experience undertaking public health evaluations.

A Guide to Undertaking an Evaluability Assessment

If you're looking to understand how to undertake an evaluability assessment, this guide is for you. An evaluability assessment helps determine if an evaluation is appropriate and, if so, how in-depth it should be.


About the Guide

This guide was created by the Methods Lab, an action-learning collaboration between the Overseas Development Institute (ODI), Better Evaluation (BE), and the Australian Department of Foreign Affairs and Trade (DFAT). It provides an overview of evaluability assessment and its use in impact evaluation. You will find:

Planning for an Evaluability Assessment: Insight into the planning process.

Checklists and Decision Support Tools: Useful tools to guide you through the assessment.

Post-Assessment Actions: What to do once the assessment is completed.

Lessons Learned: Insights from practical experiences with evaluability assessments.

You can download the guide.


Additional Resources

UK Department for International Development Checklist

The UK Department for International Development has produced a simple tabular Evaluability Checklist. The checklist can be used as a standalone tool or incorporated into a broader evaluability assessment, with its results summarized in a written report. Download the checklist. It is taken from pages 19-23 of the report:

Davies, R. (2013). Planning Evaluability Assessments: A Synthesis of the Literature with Recommendations. Report commissioned by the Department for International Development.

UNODC Evaluability Assessment Template

The United Nations Office on Drugs and Crime (UNODC) has created an Evaluability Assessment template. This template guides users through a simple set of steps to decide whether to evaluate an intervention. While it does not cover detailed evaluation considerations, it is a useful tool for quick decision support.

These resources will help you effectively determine the appropriateness and depth of your evaluation, ensuring a comprehensive and informed approach to your projects.



Understanding Evaluation Frameworks: Logic Model and Theory of Change

What is an Evaluation Framework?

An evaluation framework is a structured and systematic approach used to assess and analyze various aspects of a project, program, policy, product, or process. It provides a foundation for designing an evaluation plan by specifying inputs, activities, and expected outcomes. Two commonly used frameworks are the Logic Model and the Theory of Change.

The Logic Model

What is a Logic Model?

A logic model is a visual representation and systematic tool used to illustrate the relationships between various components of a program, project, or intervention. It serves as a foundation for designing an evaluation plan, helping to define the data to collect and the indicators to measure, thereby facilitating the assessment of program effectiveness.

Key Components of a Logic Model

  • Inputs: Resources invested in the program.
  • Activities: Actions taken to achieve the program's goals.
  • Outputs: Direct products of program activities.
  • Outcomes: Changes or benefits resulting from the program.

Resources for Developing a Logic Model

The Theory of Change

What is a Theory of Change?

A theory of change outlines the causal mechanisms showing why each intervention component is expected to result in the intended outcomes. Unlike logic models, theories of change are not always linear; they can include feedback loops in which outcomes influence earlier steps in the causal chain.

Key Differences Between Logic Model and Theory of Change

  • Structure: Logic models are typically linear, while theories of change can be more complex and non-linear.
  • Detail: Theories of change often include more detailed explanations of the causal mechanisms.

Resources for Developing a Theory of Change


Key Evaluation Planning Questions

High-Level Key Questions

When planning an evaluation, begin by agreeing on a small number of high-level key questions that the evaluation must answer; these will guide your decisions about design, data collection, and analysis.

Resources for Developing Evaluation Questions

By utilizing these resources and frameworks, you can effectively plan, implement, and evaluate your programs to achieve desired outcomes and ensure continuous improvement.

The Soft Side of Evaluation: Management and Relationships

Effective evaluation involves more than just analyzing data; it also requires careful consideration of who will manage the evaluation and how relationships will be developed and maintained. Here's a guide to help you navigate these aspects:

Key Questions to Consider

  • Who should lead the evaluation?
  • How do I develop and maintain relationships with key stakeholders?
  • Should I commission specialist skills?

Governance and Management

It's crucial to address governance early in the evaluation process. Considerations include who will lead the evaluation and who should be involved to ensure its success. The following resources can help:

Evaluation Toolkit by the Pell Institute and Pathways to College Network

This toolkit includes a module to help you select an appropriate evaluation lead for any given context. It also provides resources for engaging key stakeholders.

Better Evaluation Guidance

Better Evaluation offers detailed guidance on deciding who will conduct the evaluation. This resource considers various actors, including external contractors, internal staff, service deliverers, peers, the community, or a combined group. This guidance is part of the Rainbow Framework, which organizes evaluation methods and processes with color coding for ease of use.

Better Evaluation also provides guidance on understanding and engaging stakeholders. Stakeholders, including primary intended users and others with an interest in the evaluation, form a critical group in any evaluation process.

Robert Wood Johnson Foundation Guide

The Robert Wood Johnson Foundation offers an excellent, detailed guide on early stakeholder engagement. "A Practical Guide for Engaging Stakeholders in Developing Evaluation Questions" aims to support evaluators in identifying and engaging their stakeholders throughout the evaluation process.

These resources will help you effectively manage your evaluation and build strong, productive relationships with all stakeholders involved.


Looking After Evaluation Participants: Ethical and Legal Considerations

Ethical and legal considerations are essential before starting any evaluation. Addressing these issues ensures that everyone involved is treated fairly and appropriately, preventing unnecessary problems once the evaluation begins. Here are some key resources to guide you:

Health Research Authority (HRA) and NIHR INVOLVE Review Statement

The Health Research Authority (HRA) and the NIHR INVOLVE team have produced a concise review statement to guide researchers on planning and delivering public involvement in research. This document outlines key considerations in an easy-to-understand manner.

Newcastle University Guidance

Newcastle University offers comprehensive guidance on planning and conducting research involving human participants. While targeted at Newcastle University staff, this resource is valuable for anyone involving people in their evaluation.

Better Evaluation Guidance on Ethical and Quality Standards

Better Evaluation has developed a guide to help researchers understand and define ethical and quality standards in evaluations. This guide includes resources to help research teams meet the requirements of ethically sound and high-quality evaluations.

Better Evaluation Ethical Guidelines

Better Evaluation also provides a set of ethical guidelines covering the conduct of evaluations, including the responsibilities of those undertaking and managing evaluations. These guidelines link to additional resources for embedding ethical considerations at the earliest stage of evaluation planning.

These resources will help you ensure that your evaluation is conducted ethically and legally, safeguarding the rights and well-being of all participants.


Using What’s Out There: Leveraging Existing Data and Previous Evaluations

When planning an evaluation project, it's crucial to explore existing data and determine if similar evaluations have been conducted before. This can seem daunting, but there are resources available to guide you through the process of searching and utilizing relevant research.

Cochrane Network

Cochrane is dedicated to producing and disseminating high-quality evidence through systematic reviews and meta-analyses of healthcare interventions. Their Cochrane Training arm offers a handbook that includes valuable guidance on how to search and select studies, specifically in Chapter 4. Although the focus is on Cochrane reviews, the principles can be applied broadly to various types of research.

UK National Health Service (NHS) Guidance

The NHS has created easy-to-use guidance on literature searching, targeted at NHS researchers but useful for anyone. This guide helps you develop effective search strategies to find relevant studies.

University of Minnesota Guide

The University of Minnesota offers a clear guide using the PICO (Population, Intervention, Comparison, Outcome) framework. This resource is excellent for those new to literature searching and provides a step-by-step approach to using library resources.

PRISMA Expanded Checklist

For a more formal approach to reporting your literature review, the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) expanded checklist can be incredibly useful. PRISMA provides a structured framework for reporting systematic reviews and meta-analyses, ensuring transparency and completeness.

Why Use Existing Data?

Using existing data can save time and resources, provide a broader context for your evaluation, and enhance the reliability of your findings. Reviewing previous evaluations can also help you avoid duplicating efforts and identify best practices or common pitfalls.

Steps to Take

  1. Define Your Search Strategy: Use resources like Cochrane and NHS guides to develop a comprehensive search strategy.
  2. Identify Relevant Studies: Look for systematic reviews, meta-analyses, and other high-quality studies that align with your evaluation objectives.
  3. Utilize Frameworks: Apply frameworks such as PICO to organize your search and ensure you cover all necessary aspects.
  4. Report Systematically: Use the PRISMA checklist to ensure your review is thorough and well-documented.

By leveraging these resources, you can effectively use existing data and previous evaluations to inform and enhance your project.


Approaches to Gathering and Analyzing Evidence: How to Collect and Analyze Data

Planning for data collection and analysis in an evaluation can be challenging as the approach depends on the specific tasks and available data. Fortunately, the "Better Evaluation" initiative offers a range of resources to help with this planning process.

Describing Activities, Outcomes, and Impact

Better Evaluation provides guidance to help evaluators describe activities, outcomes, impacts, and the context in which an intervention is delivered. This resource supports tasks such as data sampling, using outcome measures, data collection, data management, and analysis.

Understanding Causes of Outcomes and Impacts

Understanding the causes of outcomes and impacts is critical in evaluations. Better Evaluation offers guidance on the various methods available for this purpose, including tools to help you interpret findings effectively.

Synthesizing Data

To provide a comprehensive overview, Better Evaluation has created a resource on synthesizing data from multiple evaluations. This helps to create a bigger-picture understanding by combining findings from various sources.


Key Steps in Data Collection and Analysis

  1. Define Your Objectives: Clearly define what you need to measure and why.
  2. Select Appropriate Methods: Choose data collection methods that align with your objectives and available resources.
  3. Data Sampling: Plan your sampling strategy to ensure representative and unbiased data collection.
  4. Use Outcome Measures: Identify and use appropriate outcome measures for your evaluation.
  5. Data Management: Implement robust data management practices to ensure data quality and integrity.
  6. Analyze Data: Use suitable techniques and software to analyze both quantitative and qualitative data.
  7. Synthesize Findings: Combine data from various sources to provide a comprehensive understanding of your evaluation.

By utilizing these resources and following these steps, you can effectively plan and execute the data collection and analysis for your evaluation project.


Communicating Findings: Strategies and Best Practices

Effectively communicating research findings is crucial to ensure that your evaluation has a meaningful impact. The resources in this section provide guidance on how to be an effective communicator, helping you reach and engage your intended audience.

Who Do I Need to Communicate With?

Identify your primary intended users and other stakeholders who will benefit from your evaluation findings. These may include:

  • Project Sponsors and Funders: Individuals or organizations that funded or supported the project.
  • Policy Makers and Decision Makers: Government officials and leaders who can implement changes based on your findings.
  • Community Members and Beneficiaries: Those directly impacted by the evaluation outcomes.
  • Researchers and Academics: Professionals who can use the findings for further research.
  • Media and Public: General public and media outlets that can help disseminate the information.

Best Ways to Reach Your Audience

Tailor your communication methods to suit the preferences and needs of each audience group. Consider the following strategies:

  • Reports and Executive Summaries: Detailed reports for funders and decision makers.
  • Presentations and Workshops: Interactive sessions for community members and stakeholders.
  • Academic Publications and Conferences: Disseminate findings through journals and academic events.
  • Press Releases and Social Media: Engage the public and media with accessible summaries and updates.

Types of Information to Share

Different audiences respond to different types of information:

  • Quantitative Data: Charts, graphs, and statistics for policy makers and researchers.
  • Narrative Stories and Case Studies: Real-life examples and testimonials for community members and media.
  • Actionable Recommendations: Clear, concise suggestions for decision makers and project sponsors.

Resources for Effective Communication

Better Evaluation: Reporting & Supporting the Use of Findings

Better Evaluation offers guidance on how to share evaluation findings effectively with primary intended users and other stakeholders. The guide emphasizes tailoring the reporting process to meet the needs and learning gaps of the audience.

Health Foundation: Communicating Research Findings Toolkit

The Health Foundation provides an interactive web toolkit on communicating research findings. This resource is user-friendly and covers various aspects of effective communication.


Acting on Findings: Implementing Evaluation Results for Success

Implementing findings from an evaluation is a critical step that can be challenging due to factors such as inadequate curation of findings and staff turnover. The following resources can help you navigate these challenges and ensure successful implementation.

Document Management Processes and Agreements

Better Evaluation offers guidance on creating document management processes and agreements. These tools help stakeholders document decisions about managing evaluative activities and monitor compliance with ethical and quality standards, ensuring the sustainability and success of the findings.

Effective Implementation of Learnings

The Bruner Foundation's Effectiveness Initiative provides a concise yet comprehensive document on using evaluation findings to enable effective implementation. This resource is packed with valuable information and strategies to help ensure that lessons learned from evaluations are not forgotten and are put into practice.