Making Outcomes Matter

OVERVIEW AND EVIDENCE BASE

What do we mean by this process?

Making Outcomes Matter is a process of using incentives and disincentives to promote sustained activities that lead to change and improvement. This process integrates regular updates on project accomplishments and needed adjustments into assessments of progress, accountability, and co-learning. When groups engage in Making Outcomes Matter, they agree to (a) organize and review interim evidence of progress and impact (e.g., community and system change; community-level indicators of health and development), and (b) assess findings with partners to determine next steps for continuous improvement. Procedures to put this process into action can range from informal practices (e.g., marketing information and data to increase their use in decision making) to formal contingencies on behavior (e.g., grant installments based on achievement of a minimum number of programs or changes delivered each quarter).

Making Outcomes Matter is a key process to help communities create and change conditions for behavior change and population-level improvements.

How it works

Making Outcomes Matter is a process that uses feedback on progress and differential rewards (i.e., incentives and disincentives) for change and improvement. It is not simply about "accountability," a thumbs-up or thumbs-down final assessment of the merit of the effort. Rather, the process of Making Outcomes Matter occurs over the lifespan of an initiative and aims to use information about progress to prompt action and make adjustments. Feedback on levels of change and improvement, together with incentives for progress, is used to enhance levels of organizational capacity, implementation, and community change and population-level improvements (Fawcett, Francisco, Paine-Andrews, & Schultz, 2000; Paine-Andrews, Francisco, & Fawcett, 1994). This process works best when linked to other key processes that help to (a) outline the conceptual roadmap and indicators for change (Developing a Framework or Model of Change) and (b) measure and understand what an initiative is doing (Documenting Progress and Using Feedback).

Although the exact mechanisms for Making Outcomes Matter are uncertain, participation in this process may help groups to:

  • Clarify and enhance the strength of their community initiative by examining and learning from the extent of actual implementation (Walker & Grossman, 1999).
  • Identify and evaluate a set of operational benchmarks by which other initiatives can assess progress (Walker & Grossman, 1999). For example, it may turn out that building capacity (e.g., staff and board development) is a critical activity worth defining and measuring (and one that comes before examining the particular impact of the initiative on environmental change and population-level outcomes).
  • Reward and inspire consistent patterns of community change related to the mission (e.g., award renewal contingent on evidence of progress, bonus grants for outstanding accomplishments, "outcome dividends" calculated on cost-benefit estimates associated with improvements) (e.g., Fawcett, Francisco, Paine-Andrews, & Schultz, 2000; Gerry, Fawcett, & Richter, 1996; Roussos & Fawcett, 2000).
  • Increase rates of community change by focusing and basing decisions on clear benchmarks for action and evidence of progress (Fawcett et al., 1997; Roussos & Fawcett, 2000).
  • Increase funding support and other investments by regularly communicating evidence of progress (e.g., Roussos & Fawcett, 2000).

Empirical and Experiential Evidence

Linking resources to progress in implementation, levels of change, and improvement on outcomes can have powerful effects. For example, with a substance use coalition known as Project Freedom, the rate of community change (i.e., new programs, policies, and practices) facilitated by the effort was rather modest. Following an announcement by a grantmaker that annual renewal of the group's funding would be based, in part, on the changes brought about by the effort, the rate of community change increased dramatically (Fawcett et al., 1997). Similarly, the Ewing Marion Kauffman Foundation incorporated reporting on community outcomes into its funding arrangement with communities. As a result, and after visioning and leadership training, subsequent community planning, implementation, and documentation efforts focused on accomplishing change (Kreuter et al., 2000). Sites used regular evaluation data on community changes to reveal progress and to target areas not yet showing desired changes. In addition, they were able to use evaluation data to seek more funding and sustain ongoing programming. The process of linking funding to outcomes was successful after both parties understood the aim and procedures for documenting evidence of work toward achieving outcomes.

Similarly, United Way of America shifted from a fundraising organization to a community impact organization focused on improving community outcomes (Essential Attributes, 2003). Outcome-focused planning ("name and frame impact") helps initiatives align data about various issues with current activities to address challenges that may impede change. Then, by focusing meetings, action teams, and other implementation and sense-making steps on indicators and results, groups concentrate their efforts on achieving meaningful impact on community-identified concerns. For example, drop-out rates have been cut in half through a multi-sector partnership that changed the way community systems deal with truancy and student alienation. In Wichita, Kansas, almost 4,000 uninsured people have enrolled to receive coordinated, low-cost health care through Project Access.

In the state of Vermont, when a state agency and community members infused data on alcohol-related motor vehicle deaths into multiple levels of community decision making, they saw dramatic improvement over time (Hogan & Murphey, 2002). Vermont went from ranking worst (1996 and 1997) to second best (1999) in the nation in the proportion of teen crash deaths that involved alcohol. When media coverage, community members' prevention activities, and government leaders (the governor and the departments of safety and public safety) instigated a widespread conversation about this indicator, they helped make outcomes matter. Data featured in media profiles of alcohol-related teen deaths and other public conversations led to additional material resources, community efforts to break up teen drinking parties, improved enforcement, more court referrals, and stiffer sanctions for alcohol possession. In summary, the presence of contingencies such as material rewards and social consequences can affect the array of activities associated with change and improvement.

Implications for Research and Practice

At present, much of the available research on Making Outcomes Matter does not explicitly manipulate or test this process and its effects on community change and improvement. Although this process has been identified as a key ingredient for advancing change, there is a need for more systematic evaluations of its implementation and effects. Such research would provide a better understanding of the factors that enable communities to come together and address shared problems and goals.

Some key research questions include: (a) What criteria are used to decide whether an effort merits continued or enhanced support? (b) How does the evaluation contribute to the sustainability and institutionalization of the effort and its core components? (c) What combination of material rewards (e.g., bonus grants contingent on progress) and social consequences (e.g., media promotions of success stories) are associated with increased community change and improvement?

Overall Recommendation for Practice

Based on research and experience, we recommend (with qualifications) Making Outcomes Matter as a key process to advance community change and improvement. The process is suggested by many researchers and practitioners as an important tool for designing, strengthening, and evaluating community change and improvement interventions. Yet the empirical and experiential evidence regarding its effectiveness is at an early stage compared to what is known about other Best Processes.