Section 4. Communicating Information to Funders for Support and Accountability

Learn how to effectively communicate information and research results to funders and to a public audience.

This section is based on an article in the Work Group Evaluation Handbook: Evaluating and Supporting Community Initiatives for Health and Development by Stephen B. Fawcett, Adrienne Paine-Andrews, Vincent T. Francisco, Jerry Schultz, Kimber P. Richter, R.K. Lewis, E.L. Williams, K.J. Harris, Jannette Berkley, Jacqueline L. Fisher, and Christine M. Lopez of the Work Group on Health Promotion and Community Development, University of Kansas, Lawrence, Kansas.

  • Why inform people about your evaluation findings?

  • What are some key audiences for the data?

  • How do you communicate evaluation findings?

Why inform people about your evaluation findings?

There are many reasons to inform people about your evaluation findings. For one thing, you want the name and purpose of your coalition or initiative to be recognizable in your community, and you want community members to have a positive impression of your work. Releasing your evaluation findings not only lets people know you exist, but also lets them know some of what you've been doing to help your community. If the findings are positive, they show that you've been getting results. If they are less than encouraging, you can use them to make the case that your group or coalition needs more public support. Sharing your evaluation findings should also stir public interest and provoke thinking and discussion about the issues you're working on. Above all, making your findings known publicly exposes the issue and encourages the public to take action.

There are three main levels of the public that you can tell about your evaluation findings: local, regional/state, and national. Here are some reasons for sharing your evaluation results at each of these levels:

Reasons for informing the public at the local level

  • To help raise awareness about the issue
  • To help attract volunteers, funding, and in-kind resources from local concerned citizens and agencies
  • To promote awareness of the efforts of volunteers and collaborators
  • To help lobby for local ordinances or program changes to address issues of concern
  • To provide accountability to the community, trustees, and funders

Reasons for informing the public at the state level

  • To create a "name" for your initiative in the state, which makes it more competitive when seeking state resources
  • To help establish a statewide network of persons and agencies with similar goals
  • To help lobby for legislative changes to address the issues of concern
  • To help the initiative garner recognition and resources from the state and region

Reasons for informing the public at the national level

  • To create a "name" for the initiative nationwide, which makes you more competitive when seeking resources from the state or federal government or from large private foundations
  • To help tap into nationwide networks of persons and agencies with similar goals and wide expertise
  • To help the initiative garner recognition and resources from across the country
  • To encourage community partnerships to work on the problem or issue

Three tips for making sure your findings aren't ignored

  • Give your information to the right people!
  • Address issues that those people think are important
  • Be sure the information is presented in time to be useful and in a way that's clearly understood

What are some key audiences for the data?

Who do you share this data with? Staff, volunteers, supporters in the community, and funders are all supporting groups that should be kept up to date on your group's efforts and successes. You should also share your evaluation findings with your target population--the people you are trying to help. If you work with a heart disease prevention program, for example, and your evaluation shows an elevated risk factor among members of a particular ethnic group, you will want to let members of that group know in order to spark their interest and make them more receptive to whatever work you will be doing in their community. Your target group needs to know what they will need to change, and how, to improve their health. Finally, the general public should be kept informed.

Key groups to share your evaluation findings with

Local

  • Civic organizations
  • Business groups
  • Grassroots organizations
  • School boards
  • Parent-teacher groups
  • Church organizations
  • The local press
  • Health organizations
  • Elected and appointed local government officials
  • Grantmakers

State/regional

  • State and regional professional conferences
  • Regional professional training workshops
  • Grassroots and advocacy organizations
  • Church conferences
  • Grantmakers

National

  • Professional conferences
  • Professional training workshops
  • Grassroots and advocacy organizations
  • Church conferences
  • Grantmakers

What about difficult audiences?

  • Anticipate their questions, concerns, and objections. Think ahead of time about what this particular audience won't like about what you have to say, and come up with calm, measured, logical, and thorough responses. If possible, sit down with someone who knows your audience well to get feedback on the questions or concerns the audience may raise and how they might react to your answers. This preparation will help keep you from getting flustered and defensive during the presentation.
  • Have a primary figure in your initiative present the findings. Having your program's director deliver the data can lend greater authority to the presentation.
  • Have someone else give out the information. With a particularly hostile or uninformed audience, or an audience that you don't normally have access to, you may wish to have someone outside your group or initiative relay the evaluation findings--a member of that audience, for example.
  • Reinforce the data repeatedly.
  • Keep your cool.

Presenting evaluation results to the press

  • Be honest with reporters. It's essential that any spokesperson for your coalition or group have credibility with the press. Answer questions simply and candidly, and if you can't answer one, explain why.
  • Write your own press releases. Give reporters a clear, brief press release with all of the important information, as well as contact details, in advance. This clears up misconceptions and helps the press better prepare for your presentation.
  • Train your reporters. If your evaluation results are going to include a lot of complicated statistical information, consider doing an in-house training session for the press. These workshops should be unbiased, and they should be broad enough that the reporters can use the information learned for working with other groups, not just yours. Consider doing an informational workshop or some one-on-one presentations to explain such concepts as:
    • What do percentile scores mean?
    • What are some common methods of gathering evaluation data?
    • What are community-level indicators?

How do you communicate evaluation findings?

Develop a general presentation format that can be lengthened or shortened depending on the amount of time available, including compelling descriptions and visuals of:

  • The issue(s) of concern
  • The initiative's goals, strategies, and methods for reaching those goals
  • Data on activities (e.g., services provided)
  • Data on accomplishments (e.g., community changes)
  • Data on outcomes (i.e., behavioral measures and community-level indicators)

Keep your visuals simple to cut down on problems interpreting data. The first and last visuals should contain your message or your primary findings--whatever it is that you most want your audience to remember afterwards.

Identify different avenues of getting the word out about your evaluation results, such as:

  • Word of mouth
  • Presentations
  • Newspapers and newsletters
  • Radio - both public service announcements and local news or call-in shows
  • Television coverage
  • Professional journals

Why might you use different formats for presenting your evaluation results?

There are many different types of reports suitable for different audiences. Some groups might not be interested in, or know how to interpret, the statistical details, but would still like to know your general findings. Funders, grantmakers, advisory boards, and program staff will most likely want detailed, explicit information. Always take into consideration the type of group you're presenting to, and tailor your presentation to that audience.

Here are some ideas of what types of reports work for what audiences:

  • Technical report: A detailed report on a single issue, such as a small study on one or two sample groups. It can be given at a staff meeting or included as part of a larger report.
    • Best for: Funding agencies, program administrators, advisory committees
  • Executive summary: A few pages, usually at the beginning or end of a longer report, which outlines a study's major findings and recommendations.
    • Best for: Funding agencies, program administrators, board members and trustees, program staff, advisory committees, political bodies, program service providers (technicians, teachers, etc.)
  • Technical professional paper: A detailed article that summarizes information for a scientific or technical audience. It usually describes what was done, how it was done, what worked and what did not, and why.
    • Best for: Program administrators, advisory committees, organizations interested in program content
  • Popular article: An article written with the target audience of the medium in mind. Some magazines and papers target specific populations. It normally contains more information than a press release, but focuses on two or three quick points.
    • Best for: Program administrators, board members and trustees, program staff, political bodies, community groups, current clients, potential clients, program service providers, organizations interested in program content
  • News release and/or press conference: A written statement given to the media, or a gathering with reporters, held for the purpose of releasing specific information.
    • Best for: Program administrators, the media, wide distribution of simplified information
  • Public meeting: A gathering that's open to the general public where more general evaluation findings are released in a clear, simple manner, usually with some time set aside for open discussion.
    • Best for: Community groups, current clients, the media
  • Media appearance: Different from a news release in that it incorporates some sort of staged event--for example, a local author doing a public reading to raise awareness about a study on adult literacy.
    • Best for: Current clients, the media
  • Staff workshop: A more interactive, working presentation for your group or coalition's staff and volunteers.
    • Best for: Program administrators, program staff, program service providers
  • Brochures/posters: Brief, simply worded printed materials that can be distributed and mailed to various outlets in the community. Each should focus on one quick point.
    • Best for: Potential clients
  • Memo: A short letter circulated internally among program staff.
    • Best for: Program administrators, program staff, program service providers
  • Personal discussion: Sitting down face-to-face to discuss evaluation findings with an individual or small group.
    • Best for: Funding agencies, program administrators, program staff, program service providers

Possible goals of your presentation

What do you want from the group you're presenting to? How can you best present the evaluation findings to get those results?

A few things you might be hoping to get back from your audience:

  • Money and in-kind resources for your initiative
  • Volunteers for project activities
  • Influence in changing a program, policy, or practice
  • Input on how to make the initiative more responsive
  • Help in overcoming resistance to the initiative
  • Ideas on how the initiative can become more effective

Steps in developing your presentation

  • Understand your primary users and audiences. What information do they need and why do they need it? Try to understand the audience's viewpoint, and be sure to get the report to them in time for it to be useful.
  • Review the results of your evaluation with program staff before you write up your evaluation report. This gives you a chance to get your staff's input on the meaning of the findings, as well as the opportunity to talk about any ambiguous data that has come up.
  • Brief any important political figures before you release your report to the public. This is especially important if your evaluation findings make it clear that there will be a need for any changes in policies. Policymakers or agency officials may wish to make a public response to your findings as well. For example, if your evaluation shows that your recreation program for families with developmentally disabled children is highly effective but underfunded, area human service agencies may wish to let people know that they plan to increase funding for the project in the next budget year.
  • Keep your final report short: a document summarizing the evaluation findings, with a technical appendix for those who are interested.
  • If you decide to do an oral presentation, make up a small number of charts and tables--six to ten should be plenty--illustrating the most important findings. Make one version--printed on posterboard or done as an overhead projection--to show during the presentation itself, as well as copies that your audience can keep afterwards. You may also wish to prepare a single sheet summarizing the overall results for the audience to take home.
  • Begin your report with the reasons the evaluation was done, the questions that were asked, and why those questions were chosen. Explain what your group or coalition wanted to learn from the evaluation and what methods were used to conduct it.
  • Depending on your audience, you may want to simply highlight the results, or you may want to go into more detail about what you found. Be sure to explain what sort of implications the results have for your group or initiative. If the evaluation findings have led you to any particular conclusions about what your group should do in the future, talk about them.
Contributor 
Chris Hampton

Print Resources

Fawcett, S., in collaboration with Francisco, V., Paine, A., Lewis, R., Richter, K., Harris, K., Williams, E., Berkley, J., Schultz, J., Fisher, J., & Lopez, C. (1993). Work group evaluation handbook: Evaluating and supporting community initiatives for health and development. Lawrence, KS: Work Group on Health Promotion and Community Development, University of Kansas.

Fawcett, S., Sterling, T., Paine, A., Harris, K., Francisco, V., Richter, K., Lewis, R., & Schmid, T. L. (1995). Evaluating community efforts to prevent cardiovascular diseases. Atlanta, GA: Centers for Disease Control and Prevention, National Center for Chronic Disease Prevention and Health Promotion.

Homan, M. S. (1993). Promoting community change: Making it happen in the real world. Pacific Grove, CA: Brooks/Cole Publishing Company.

Morris, L., Fitz-Gibbon, C., & Freeman, M. (1987). How to communicate evaluation findings. Newbury Park, CA: Sage Publications.

Muraskin, L. (1993). Understanding evaluation: The way to better prevention programs. Rockville, MD: Westat, Inc.