Utilization-Focused Evaluation (UFE), developed by Michael Quinn Patton, is an approach based on the principle that an evaluation should be judged on its usefulness to its intended users.
Evaluations should therefore be planned and conducted in ways that enhance the likely use of both the findings and the process itself to inform decisions and improve performance.
UFE has two essential elements. Firstly, the primary intended users of the evaluation must be clearly identified and personally engaged at the beginning of the evaluation process to ensure that their primary intended uses can be identified. Secondly, evaluators must ensure that these intended uses of the evaluation by the primary intended users guide all other decisions that are made about the evaluation process.
Rather than focusing on general and abstract users and uses, UFE focuses on real and specific users and uses. The evaluator's job is not to make decisions independently of the intended users, but rather to facilitate decision making amongst the people who will use the findings of the evaluation.
Patton argues that research on evaluation demonstrates that: “Intended users are more likely to use evaluations if they understand and feel ownership of the evaluation process and findings [and that] [t]hey are more likely to understand and feel ownership if they've been actively involved. By actively involving primary intended users, the evaluator is preparing the groundwork for use.” (Patton, 2008, Chapter 3)
UFE can be used for different types of evaluation (formative, summative, process, impact) and it can use different research designs and types of data.
The UFE framework can be used in a variety of ways depending on the context and the needs of the situation. Patton's original framework consisted of a 5-step process, which can be seen in the example below. However, there is also a 12-step framework (see the Utilization-Focused Evaluation (U-FE) Checklist in the resources below), and the latest update, comprising 17 steps, is outlined below.
In late 2006, the International Network for Bamboo and Rattan (INBAR) engaged one of the authors (Horton) to evaluate its programmes. Headquartered in Beijing, INBAR's mission is to improve the wellbeing of bamboo and rattan producers and users while ensuring the sustainability of the bamboo and rattan resource base. The Dutch Government had initially requested and funded the evaluation as an end-of-grant requirement.
The first task was to ascertain the 'real' purposes and potential users of the evaluation. This process began with a face-to-face meeting with INBAR's Director and a call to a desk officer at the Dutch Ministry of Foreign Affairs, which revealed that the intent of both parties was for the evaluation to contribute to strengthening INBAR's programmes and management. During an initial visit to INBAR's headquarters, additional stakeholders were identified, including INBAR board members and local partners.
From the outset, it was clear that key stakeholders were committed to using the evaluation to improve INBAR's work, so the main task was to identify key issues for INBAR's organizational development. Three approaches were used: (1) a day-long participatory staff workshop to review INBAR's recent work and identify main strengths, weaknesses and areas for improvement; (2) interviews with managers and staff members; and (3) a proposed framework for the evaluation that covered the broad areas of strategy, management systems, programmes and results.
After early interactions with the Dutch Ministry of Foreign Affairs on the evaluation Terms of Reference (ToR), most interactions were with INBAR managers, staff members and partners at field sites. It was jointly decided that INBAR would prepare a consolidated report on its recent activities (following an outline proposed by the evaluator) and organize a self-evaluation workshop at headquarters. The evaluator would participate in this workshop and make field visits in China, Ghana, Ethiopia and India. INBAR regional coordinators proposed schedules for the field visits, which were then negotiated with the evaluator.
At the end of each field visit, a debriefing session was held with local INBAR staff members. At the end of the field visits, a half-day debriefing session and discussion was held at INBAR headquarters; this was open to all staff. After this meeting, the evaluator met with individual staff members who expressed a desire to have a more personal input into the evaluation process. Later on, INBAR managers and staff members were invited to comment on and correct a draft evaluation report.
The evaluator met personally with representatives of three of INBAR's donors to discuss the evaluation's findings, and the final report was made available to INBAR's donors, staff members and the Board of Trustees. A summary of the report was posted on the INBAR website.
The evaluation process helped to bring a number of issues to the surface and explore options for strengthening INBAR's programmes. For example, one conclusion of the evaluation was that INBAR should seek to intensify its work in Africa and decentralize responsibilities for project management to the region. There has been a gradual movement in this direction as new projects have been developed. INBAR has recently opened a regional office for East Africa in Addis Ababa and is putting more emphasis on collaboration with regional and national partners.
(Patton, M.Q., & Horton, D., 2009)
This book, authored by Ricardo Ramírez and Dal Brodhead, is designed to support evaluators and program managers in implementing Utilization-Focused Evaluation (UFE).
Useful for practitioners and students alike, this book is both theoretical and practical. Features include follow-up exercises at the end of each chapter and a utilization-focused evaluation checklist.
Drawing on Michael Quinn Patton's best-selling Utilization-Focused Evaluation, this book provides an overall framework and essential checklist steps for designing and conducting evaluations that actually get used.
Drawn from Michael Quinn Patton's book Utilization-Focused Evaluation, this paper introduces the approach, outlines key steps in the evaluation process, and identifies some of the main benefits of Utilization-Focused Evaluation (UFE).
Composed by Michael Quinn Patton in 2002 and updated in 2013, this is a comprehensive checklist for undertaking a utilization-focused evaluation.
This presentation from the Strengthening ICTD Research Capacity in Asia (SIRCA) programme provides an overview of how UFE was used in that programme. It was presented at the Evaluation Conclave 2010 in New Delhi, India.
This case study describes how the Information Society Innovation Fund (ISIF) used Utilization-Focused Evaluation in a project conducted with the Developing Evaluation Capacity in ICTD (DECI) programme.
Patton, M.Q. (2008). Utilization-focused evaluation, 4th edition. Thousand Oaks, CA: Sage.
Patton, M.Q., & Horton, D. (2009). Utilization-Focused Evaluation for Agricultural Innovation. Institutional Learning and Change (ILAC) Brief No. 22. ILAC, Bioversity, Rome.
Last updated: 06 November 2021