Learn how providing feedback can help track activities and accomplishments that may lead to longer-term outcomes, usually referred to as monitoring.


This section is based on an article in the Work Group Evaluation Handbook: Evaluating and Supporting Community Initiatives for Health and Development by Stephen B. Fawcett, Adrienne Paine-Andrews, Vincent T. Francisco, Jerry Schultz, Kimber P. Richter, R.K. Lewis, E.L. Williams, K.J. Harris, Jannette Berkley, Jacqueline L. Fisher, and Christine M. Lopez of the Work Group on Health Promotion and Community Development, University of Kansas, Lawrence, Kansas.

  • What is providing feedback?

  • Why provide feedback?

  • When should you provide feedback?

  • How should you provide feedback?

What is providing feedback?

When we talk about providing feedback from your evaluation, we mean presenting the data on accomplishments that you're getting from your evaluation to those involved in the initiative – line staff and volunteers, as well as administrators and board members. Depending on the methods of the evaluation and the type of data involved, some of it may be presented in the form of tables and graphs, some as narrative, and some perhaps as portfolios or even audio or videotape.

Feedback is a two-way street. The folks engaged in the day-to-day operation of the organization and its programs can benefit from learning what seems to be working well and what doesn’t, and from understanding what needs to be changed to improve results. At the same time, administrators and board members need to understand where the weaknesses in the organization are that keep staff and volunteers from being as effective as possible.

Is there enough support? Are there enough supplies? Is funding adequate to get the work done? Is the organizational climate one of mutual respect that’s conducive to staff and volunteers doing their best, and to participants feeling welcomed and supported? These are questions that staff and volunteers – and participants – can answer at the same time that the organization grapples with monitoring results. To be really useful, feedback has to travel in both directions.

Providing feedback should be done on an ongoing basis so that everyone can be kept up to date on what they're doing well and what could stand improvement. Feedback can also be provided at the end of an evaluation, but it shouldn't be limited to that.

In this section, we'll mainly be discussing the tracking of activities and accomplishments that may lead to longer-term outcomes, usually referred to as monitoring.

Why provide feedback?

Here are a few reasons to consider providing feedback from time to time:

  • To help community leadership assess progress towards meeting the initiative's goals
  • To help identify areas where the members of the initiative may want to put more energy
  • To help detect when too much energy or effort is spent in areas less central to the mission
  • To provide the opportunity to celebrate small accomplishments
  • To help the initiative focus on the "big picture" by seeing cumulative accomplishments over time
  • To provide funders the opportunity to help re-direct the initiative towards activities more directly related to the mission
  • To provide funders the opportunity to see and reward the accomplishments of the initiative

When should you provide feedback?

Your staff should get feedback at regular intervals, especially early in the initiative's development, so that they can continually adjust their efforts to improve the initiative's results. You might want to have feedback sessions once a month for the first year and then after that do them on a quarterly (four times a year) basis. Giving people feedback often gives them the chance to see what changes have resulted from adjustments they made after the last feedback session, so that you can all work together to continuously improve your efforts.

How should you provide feedback?

Begin with an overall statement summing up how the initiative is doing

This statement should be affirming – you don't want to start in on areas that need to be improved right off the bat – and it should be very specific about what is going well. For example, "We've received 35 letters from parents praising the school safety committee's work at Uptown Middle School" is a much more specific statement than "A lot of parents are pleased with the school safety committee."

Perhaps the data you've gathered will indicate that everything your group has been doing is perfect. If this is the case, presenting feedback to your group will probably be a breeze. However, it's more likely that you'll have at least a few areas, or even – perish the thought – a lot of areas that need improvement. You want the people involved in your initiative to understand what they could be doing better, but at the same time you will have to be very careful to keep them from feeling discouraged, angry at themselves, resentful, or insulted by less-than-glowing feedback. How do you do this?

Present the data to the group as a good thing

It's important to present the feedback as a good thing, even if the feedback isn't all that positive - after all, you're giving them information that will help them do a better job. Try to communicate the value of a group that really wants to know how it's doing. Show them how using data gathered for your evaluation can help them do that. Provide a shared vision of the initiative as a catalyst for change: "We can make a difference and we will make a difference, and here's how we can do it!" Communicating your optimism that the group will ultimately succeed in having an impact is very important. In other words, start off with the good news, and then present any bad news in a way that encourages everyone to work hard to change it.

Show graphs

We suggest doing it in this order:

  • Process Measures - the number of people involved with the program, member satisfaction, and the amount of activity aimed at making changes in the community relative to the group's goals
  • Intermediate Outcomes - the number of changes in the community (e.g., new or modified programs, policies, practices), information about the context of the initiative and critical events
  • Ultimate Outcomes/Impact Data - data from surveys of behavior change among people targeted in the intervention, community-level impact data
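For groups that keep a simple log of events, the cumulative-change graph suggested above is straightforward to prepare. Here is a minimal sketch of one way to tally a running total of community changes by month; the event log, field layout, and category name are hypothetical, assuming each logged event carries a date and a category:

```python
from collections import Counter
from datetime import date

# Hypothetical event log: (date, category) pairs drawn from the
# initiative's records. "community_change" stands in for new or
# modified programs, policies, and practices.
events = [
    (date(2023, 1, 10), "community_change"),  # new after-school program
    (date(2023, 1, 24), "community_change"),  # modified referral policy
    (date(2023, 3, 5),  "community_change"),  # new safety practice
    (date(2023, 4, 18), "community_change"),
]

def cumulative_by_month(events, category):
    """Return (YYYY-MM, running total) pairs for one category of event."""
    monthly = Counter(d.strftime("%Y-%m") for d, c in events if c == category)
    total, series = 0, []
    for month in sorted(monthly):
        total += monthly[month]
        series.append((month, total))
    return series

print(cumulative_by_month(events, "community_change"))
# [('2023-01', 2), ('2023-03', 3), ('2023-04', 4)]
```

The running totals, plotted over time, give the upward-sloping picture of cumulative accomplishment that helps a group see the "big picture" during a feedback session.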

Show non-graphic material

As we mentioned above, there may be a good deal of material – narratives, examples of participant work or learning, participant comments – that isn’t best provided in graphic form. This kind of material also needs to be discussed differently, in that what it communicates may not be as clear-cut as the data presented in graphs.

Like graphic data, non-graphic data can be examined in relation to process, intermediate outcomes, and long-term impact. Unlike graphic data, it might take some creativity to understand the trends or conclusions it demonstrates. (A trend is movement in a given direction that isn't yet pronounced or sustained enough to say definitively that a goal has been reached or missed.)

Much of this data might be best understood in discussion. What specific actions, methods, or conditions do participant comments refer to, and how could those be changed, if they need to be? What does participant work show about the effectiveness of your work? What do anecdotes illustrate about challenges, and about things that need to be adjusted or eliminated?

Most of this is qualitative data, data that can’t or shouldn’t (because it would lose the subtlety of what it can tell you) be expressed in numerical form.

Present the information

Here are some tips on how to do this:

  • Begin with positive information
  • Start by asking the group how they would interpret the data - if your interpretation is different, or if there are differing ideas among the group members, discuss why that might be so, and how to resolve the differences.
  • Give a more detailed example from accomplishments that have happened recently – within the last few months, if possible
  • Point out what is positive about the data
  • Discuss with the group any trends in the data, and what those trends mean
  • Discuss any suggested re-direction of efforts
  • Review examples listed on the graph or in the qualitative information
  • Affirm the initiative by noting evidence of progress
  • Discuss any comments or questions group members may still have

If the measure doesn't show recent growth:

  • Have the group discuss, using the data, why they think this might be so
  • Point out any previous times that growth occurred, to keep folks from getting discouraged. Many of these measures are cyclical in nature – resources generated, for example, are often recorded in spurts associated with funding deadlines. If it seems appropriate to point this out, do so.
  • Ask the group for ideas about how to adjust the work in order to address the issue, if it needs to be addressed
  • Ask whether there were additional activities that didn't get recorded. Reviewing the items recorded over the life of the initiative may surface unrecorded accomplishments, and these could make a big difference in the results.

Summarize the data by discussing strengths of the initiative

End on a positive note! You want your staff and volunteers to come away from this presentation with the resolve to work hard to improve the initiative, and you want to bolster their confidence that they can accomplish this – so don't let them leave feeling bad about their efforts up until now. Review the graphs, and be sure to show the graph most directly related to the mission of the project (most likely, this will be the graph of community change).

In Summary

Let's recap the main things to keep in mind when you prepare feedback for your staff:

  • Present this data as a gift, informing your audience about how the initiative is doing
  • Focus on the positive – be affirming!
  • Convey optimism about the prospects for success (if this is appropriate)
  • Convey the need for changes or adjustment (if this is appropriate)
  • Convey a shared vision of the initiative as an effective catalyst for change

Remember to keep it positive and give lots of encouragement to the people working on the initiative. Feedback sessions are meant to make your staff and volunteers want to work harder and make the initiative better, not to dishearten them. Good luck!

Chris Hampton

Online Resources

Not As Easy As It Seems: The Challenges and Opportunities to Build Community Capacity to Use Data for Decisions and Solutions is from Community Change, Creating Social Change with Knowledge. To achieve success, communities have to be able to access, share, and transform data into actionable knowledge.

Print Resources

Fawcett, S., in collaboration with Francisco, V., Paine, A., Lewis, R., Richter, K., Harris, K., Williams, E., Berkley, J., Schultz, J., Fisher, J., & Lopez, C. (1993). Work group evaluation handbook: Evaluating and supporting community initiatives for health and development. Lawrence, KS: Work Group on Health Promotion and Community Development, University of Kansas.

Fawcett, S., Sterling, T., Paine, A., Harris, K., Francisco, V., Richter, K., Lewis, R., & Schmid, T. L. (1995). Evaluating community efforts to prevent cardiovascular diseases. Atlanta, GA: Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion.

Francisco, V., Paine, A., & Fawcett, S. (1993). A methodology for monitoring health action coalitions. Health Education Research: Theory and Practice, 8 (3), 403-416.

Morris, L., Fitz-Gibbon, C., & Freeman, M. (1987). How to communicate evaluation findings. Newbury Park, CA: Sage Publications.