Welcome to the Evaluation Resource Library! Whether you are currently involved in evaluation work or preparing to begin, this curated collection is designed to meet your needs. Organized as an evaluation pathway, it offers a comprehensive range of resources to guide and support you at every stage.
Explore the pathway to find resources covering topics such as managing evaluations, mastering data collection, engaging with service users, and effectively communicating and implementing findings.
If you're unsure where to begin, simply select the tab that aligns with your current needs to access resources tailored to your specific challenges.
And if you still find yourself in need of personalized assistance, please check our webinars or book a drop-in clinic session with Professor Andy Jones, a public health academic with many years of experience undertaking public health evaluations.
Other Evaluation Services Offered by Regional Experts:
Don't forget these additional evaluation services offered by professional teams across the region:
1. Health Innovation East: Supporting Innovators in Measuring Impact through Real-world Evaluation
2. NICHE Anchor Institute, University of East Anglia: Evaluating the Anchor Institute in the East of England
3. Public Health Intervention Responsive Studies Team (PHIRST) - East of England
4. University of Essex: Service Evaluation Support to health and care providers
If you're looking to understand how to undertake an evaluability assessment, this guide is for you. An evaluability assessment helps determine if an evaluation is appropriate and, if so, how in-depth it should be.
About the Guide
This guide was created by the Methods Lab, an action-learning collaboration between the Overseas Development Institute (ODI), Better Evaluation (BE), and the Australian Department of Foreign Affairs and Trade (DFAT). It provides an overview of evaluability assessment and its use in impact evaluation. You will find:
Planning for an Evaluability Assessment: Insight into the planning process.
Checklists and Decision Support Tools: Useful tools to guide you through the assessment.
Post-Assessment Actions: What to do once the assessment is completed.
Lessons Learned: Insights from practical experiences with evaluability assessments.
You can download the guide.
Additional Resources
UK Department for International Development Checklist
The UK Department for International Development has produced a simple tabular Evaluability Checklist. It can be used as a standalone tool or incorporated into a fuller evaluability assessment, with the results summarized in a written report. Download the checklist. The checklist appears on pages 19-23 of the report:
Davies, R. (2013). Planning Evaluability Assessments: A Synthesis of the Literature with Recommendations. Report commissioned by the Department for International Development.
UNODC Evaluability Assessment Template
The United Nations Office on Drugs and Crime (UNODC) has created an Evaluability Assessment template. This template guides users through a simple set of steps to decide whether to evaluate an intervention. While it does not cover detailed evaluation considerations, it is a useful quick decision support tool.
These resources will help you effectively determine the appropriateness and depth of your evaluation, ensuring a comprehensive and informed approach to your projects.
What is an Evaluation Framework?
An evaluation framework is a structured and systematic approach used to assess and analyze various aspects of a project, program, policy, product, or process. It provides a foundation for designing an evaluation plan by specifying inputs, activities, and expected outcomes. Two commonly used frameworks are the Logic Model and the Theory of Change.
The Logic Model
What is a Logic Model?
A logic model is a visual representation and systematic tool used to illustrate the relationships between various components of a program, project, or intervention. It serves as a foundation for designing an evaluation plan, helping to define the data to collect and the indicators to measure, thereby facilitating the assessment of program effectiveness.
Key Components of a Logic Model
- Inputs: Resources invested in the program.
- Activities: Actions taken to achieve the program's goals.
- Outputs: Direct products of program activities.
- Outcomes: Changes or benefits resulting from the program.
Resources for Developing a Logic Model
- W.K. Kellogg Foundation Logic Model Guide: Provides a clear introduction to the principles and language of the program logic model, including blank templates for developing your own logic models.
- Early Intervention Foundation Logic Model Template: An editable template available as part of the “Evaluation Hub” microsite.
The Theory of Change
What is a Theory of Change?
A theory of change outlines the causal mechanisms showing why each intervention component is expected to produce the intended outcomes. Unlike logic models, theories of change are not always linear: they can include feedback loops in which outcomes influence earlier components of the intervention.
Key Differences Between Logic Model and Theory of Change
- Structure: Logic models are typically linear, while theories of change can be more complex and non-linear.
- Detail: Theories of change often include more detailed explanations of the causal mechanisms.
Resources for Developing a Theory of Change
- UNICEF Theory of Change Guide by Patricia Rogers: Offers an excellent introduction to the components and development methodology of a theory of change. It also highlights its use for identifying data to collect in an evaluation.
- Early Intervention Foundation Theory of Change Template: Similar to their logic modeling template, this can be completed online. Visit the Evaluation Hub.
Key Evaluation Planning Questions
High-Level Key Questions
When planning an evaluation, focus on answering a few high-level key questions to guide your process.
Resources for Developing Evaluation Questions
- Early Intervention Foundation’s “10 Steps for Evaluation Success”: Describes how evaluation evidence can turn a good idea into an effective intervention and help develop quality assurance systems.
- Good Evaluation Questions Guide by CDC: A checklist to aid in creating and reviewing evaluation questions.
- Evaluation Questions Checklist for Program Evaluation: Developed by the Evaluation Checklist Project to aid in developing and assessing evaluation questions.
- Pell Institute and Pathways to College Network Evaluation Toolkit: Helps in identifying evaluation goals and selecting appropriate questions.
By utilizing these resources and frameworks, you can effectively plan, implement, and evaluate your programs to achieve desired outcomes and ensure continuous improvement.
Effective evaluation involves more than just analyzing data; it also requires careful consideration of who will manage the evaluation and how relationships will be developed and maintained. Here's a guide to help you navigate these aspects:
Key Questions to Consider
- Who should lead the evaluation?
- How do I develop and maintain relationships with key stakeholders?
- Should I commission specialist skills?
Governance and Management
It's crucial to address governance early in the evaluation process. Considerations include who will lead the evaluation and who should be involved to ensure its success. The following resources can help:
Evaluation Toolkit by the Pell Institute and Pathways to College Network
This toolkit includes a module to help you select an appropriate evaluation lead for any given context. It also provides resources for engaging key stakeholders.
Better Evaluation Guidance
Better Evaluation offers detailed guidance on deciding who will conduct the evaluation. This resource considers various actors, including external contractors, internal staff, service deliverers, peers, the community, or a combined group. This guidance is part of the Rainbow Framework, which organizes evaluation methods and processes with color coding for ease of use.
Better Evaluation also provides guidance on understanding and engaging stakeholders. Stakeholders, including primary intended users and others with an interest in the evaluation, form a critical group in any evaluation process.
Robert Wood Johnson Foundation Guide
The Robert Wood Johnson Foundation offers an excellent, detailed guide on early stakeholder engagement. "A Practical Guide for Engaging Stakeholders in Developing Evaluation Questions" aims to support evaluators in identifying and engaging their stakeholders throughout the evaluation process.
These resources will help you effectively manage your evaluation and build strong, productive relationships with all stakeholders involved.
Ethical and legal considerations are essential before starting any evaluation. Addressing these issues ensures that everyone involved is treated fairly and appropriately, preventing unnecessary problems once the evaluation begins. Here are some key resources to guide you:
Health Research Authority (HRA) and NIHR INVOLVE Review Statement
The Health Research Authority (HRA) and the NIHR INVOLVE team have produced a concise review statement to guide researchers on planning and delivering public involvement in research. This document outlines key considerations in an easy-to-understand manner.
Newcastle University Guidance
Newcastle University offers comprehensive guidance on planning and conducting research involving human participants. While targeted at Newcastle University staff, this resource is valuable for anyone involving people in their evaluation.
Better Evaluation Guidance on Ethical and Quality Standards
Better Evaluation has developed a guide to help researchers understand and define ethical and quality standards in evaluations. This guide includes resources to help research teams meet the requirements of ethically sound and high-quality evaluations.
Better Evaluation Ethical Guidelines
Better Evaluation also provides a set of ethical guidelines covering the conduct of evaluations, including the responsibilities of those undertaking and managing evaluations. These guidelines link to additional resources for embedding ethical considerations at the earliest stage of evaluation planning.
These resources will help you ensure that your evaluation is conducted ethically and legally, safeguarding the rights and well-being of all participants.
When planning an evaluation project, it's crucial to explore existing data and determine if similar evaluations have been conducted before. This can seem daunting, but there are resources available to guide you through the process of searching and utilizing relevant research.
Cochrane Network
Cochrane is dedicated to producing and disseminating high-quality evidence through systematic reviews and meta-analyses of healthcare interventions. Their Cochrane Training arm offers a handbook that includes valuable guidance on how to search and select studies, specifically in Chapter 4. Although the focus is on Cochrane reviews, the principles can be applied broadly to various types of research.
UK National Health Service (NHS) Guidance
The NHS has created easy-to-use guidance on literature searching, targeted at NHS researchers but useful for anyone. This guide helps you develop effective search strategies to find relevant studies.
University of Minnesota Guide
The University of Minnesota offers a clear guide using the PICO (Population, Intervention, Comparison, Outcome) framework. This resource is excellent for those new to literature searching and provides a step-by-step approach to using library resources.
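The PICO structure also lends itself to assembling boolean search strings for bibliographic databases. As a minimal sketch (the helper function and all search terms below are hypothetical illustrations, not part of the University of Minnesota guide), synonyms within each PICO element are joined with OR, and the elements are then combined with AND:

```python
def build_pico_query(population, intervention, comparison, outcome):
    """Join synonyms for each PICO element with OR, then combine the
    elements with AND, as many bibliographic databases expect."""
    groups = []
    for terms in (population, intervention, comparison, outcome):
        if terms:  # an element (often Comparison) may be left empty
            groups.append("(" + " OR ".join(f'"{t}"' for t in terms) + ")")
    return " AND ".join(groups)

# Hypothetical example terms for a falls-prevention question
query = build_pico_query(
    population=["older adults", "elderly"],
    intervention=["exercise programme", "physical activity"],
    comparison=[],  # no comparator specified
    outcome=["falls", "fall prevention"],
)
print(query)
# → ("older adults" OR "elderly") AND ("exercise programme" OR "physical activity") AND ("falls" OR "fall prevention")
```

Real databases differ in their field tags and truncation syntax, so a generated string like this is only a starting point to adapt per database.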
PRISMA Expanded Checklist
For a more formal approach to reporting your literature review, the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) expanded checklist can be incredibly useful. PRISMA provides a structured framework for reporting systematic reviews and meta-analyses, ensuring transparency and completeness.
Why Use Existing Data?
Using existing data can save time and resources, provide a broader context for your evaluation, and enhance the reliability of your findings. Reviewing previous evaluations can also help you avoid duplicating efforts and identify best practices or common pitfalls.
Steps to Take
- Define Your Search Strategy: Use resources like Cochrane and NHS guides to develop a comprehensive search strategy.
- Identify Relevant Studies: Look for systematic reviews, meta-analyses, and other high-quality studies that align with your evaluation objectives.
- Utilize Frameworks: Apply frameworks such as PICO to organize your search and ensure you cover all necessary aspects.
- Report Systematically: Use the PRISMA checklist to ensure your review is thorough and well-documented.
By leveraging these resources, you can effectively use existing data and previous evaluations to inform and enhance your project.
Planning for data collection and analysis in an evaluation can be challenging as the approach depends on the specific tasks and available data. Fortunately, the Better Evaluation initiative offers a range of resources to help with this planning process.
Describing Activities, Outcomes, and Impact
Better Evaluation provides guidance to help evaluators describe activities, outcomes, impacts, and the context in which an intervention is delivered. This resource supports tasks such as data sampling, using outcome measures, data collection, data management, and analysis.
Understanding Causes of Outcomes and Impacts
Understanding the causes of outcomes and impacts is critical in evaluations. Better Evaluation offers guidance on the various methods available for this purpose, including tools to help you interpret findings effectively.
Synthesizing Data
To provide a comprehensive overview, Better Evaluation has created a resource on synthesizing data from multiple evaluations. This helps to create a bigger-picture understanding by combining findings from various sources.
Key Steps in Data Collection and Analysis
- Define Your Objectives: Clearly define what you need to measure and why.
- Select Appropriate Methods: Choose data collection methods that align with your objectives and available resources.
- Data Sampling: Plan your sampling strategy to ensure representative and unbiased data collection.
- Use Outcome Measures: Identify and use appropriate outcome measures for your evaluation.
- Data Management: Implement robust data management practices to ensure data quality and integrity.
- Analyze Data: Use suitable techniques and software to analyze both quantitative and qualitative data.
- Synthesize Findings: Combine data from various sources to provide a comprehensive understanding of your evaluation.
By utilizing these resources and following these steps, you can effectively plan and execute the data collection and analysis for your evaluation project.
Effectively communicating research findings is crucial to ensure that your evaluation has a meaningful impact. The resources in this section provide guidance on how to be an effective communicator, helping you reach and engage your intended audience.
Who Do I Need to Communicate With?
Identify your primary intended users and other stakeholders who will benefit from your evaluation findings. These may include:
- Project Sponsors and Funders: Individuals or organizations that funded or supported the project.
- Policy Makers and Decision Makers: Government officials and leaders who can implement changes based on your findings.
- Community Members and Beneficiaries: Those directly impacted by the evaluation outcomes.
- Researchers and Academics: Professionals who can use the findings for further research.
- Media and Public: General public and media outlets that can help disseminate the information.
Best Ways to Reach Your Audience
Tailor your communication methods to suit the preferences and needs of each audience group. Consider the following strategies:
- Reports and Executive Summaries: Detailed reports for funders and decision makers.
- Presentations and Workshops: Interactive sessions for community members and stakeholders.
- Academic Publications and Conferences: Disseminate findings through journals and academic events.
- Press Releases and Social Media: Engage the public and media with accessible summaries and updates.
Types of Information to Share
Different audiences respond to different types of information:
- Quantitative Data: Charts, graphs, and statistics for policy makers and researchers.
- Narrative Stories and Case Studies: Real-life examples and testimonials for community members and media.
- Actionable Recommendations: Clear, concise suggestions for decision makers and project sponsors.
Resources for Effective Communication
Better Evaluation: Reporting & Supporting the Use of Findings
Better Evaluation offers guidance on how to share evaluation findings effectively with primary intended users and other stakeholders. The guide emphasizes tailoring the reporting process to meet the needs and learning gaps of the audience.
Health Foundation: Communicating Research Findings Toolkit
The Health Foundation provides an interactive web toolkit on communicating research findings. This resource is user-friendly and covers various aspects of effective communication.
Implementing findings from an evaluation is a critical step that can be challenging due to factors such as the need to curate findings properly and staff turnover. The following resources can help you navigate these challenges and ensure successful implementation.
Document Management Processes and Agreements
Better Evaluation offers guidance on creating document management processes and agreements. These tools help stakeholders document decisions about managing evaluative activities and monitor compliance with ethical and quality standards, ensuring the sustainability and success of the findings.
Effective Implementation of Learnings
The Bruner Foundation's Effectiveness Initiative provides a concise yet comprehensive document on using evaluation findings to enable effective implementation. This resource is packed with valuable information and strategies to help ensure that lessons learned from evaluations are not forgotten and are put into practice.