Section 4. Communicating Information to Funders for Support and Accountability

Tool: Evaluation Report Outline

This tool gives you an overview of the different parts of an evaluation report and what each part should include.

Front Cover

The front cover should include:

  • Program title and location
  • Name(s) of evaluator(s)
  • Period covered by the report
  • Date of the report

Lay out your front cover neatly and attractively; it is the first thing your audience sees, and it makes an important first impression.

Section I - Summary (or Executive Summary)

This is a brief (two- to three-page) overview of the evaluation outlining major findings and recommendations. Some folks are too busy to read any further than the summary, so make sure it is as complete and clear as possible.

The summary should include:

  • What was evaluated?
  • Why was the evaluation done?
  • What are the major findings and recommendations?

And, if space permits:

  • What audience is the report aimed at?
  • What decisions, if any, need to be made or have been made based on the results of the evaluation?
  • Who else might find the report to be of interest or importance?

Section II - Background Information About the Program

Presumably, most of the people reading your evaluation report will be at least somewhat familiar with the program, but that's not necessarily the case. And even people who are familiar with the program may have some misconceptions, so take the time to make your goals, strategic plan, organizational structure, and other essential program elements clear.

Typically, this section will include:

  • Origins of the program
  • Program goals
  • Clients involved with the program
  • Administrative/organizational structure
  • Program activities and services
  • Materials used and produced by the program
  • Program staff

Section III - Description of the Evaluation

This part explains why an evaluation was done and what you hoped to learn from it. It should also explain anything the evaluation was not intended to do.

Here are some of the questions that should be answered by this section:

  • Who requested the evaluation?
  • Was the evaluation meant to satisfy any particular audience and, if so, which one(s)?
  • Were there any restrictions to the evaluation in terms of money, time, or other resources?
  • Was any particular kind of evaluation design used and, if so, why?
  • What was the timetable for collecting data?
  • For each measure, what sort of data was collected?
  • What sort of methods were used to gather data, and why were these particular methods chosen?
  • How did the evaluators ensure accuracy?

Section IV - Results of the Evaluation

This part will explain what your findings were in detail.

This section may include:

  • All data collected - analyzed, recorded, and organized in understandable forms (charts, tables, graphs, etc.)
  • Excerpts from interviews
  • Testimonials from participants and clients
  • Questionnaire results
  • Test scores
  • Anecdotal evidence

Section V - Discussion of Results

Here is your chance to go into more detail: the why of your evaluation results.

This part should answer the following questions:

  • How sure are you that your program or initiative caused these results?
  • Were there any other factors that could have contributed to the results?
  • How are the results different from what they would have been if your program didn't exist?
  • What do the evaluators feel are the strengths and weaknesses of your program?

Section VI - Costs and Benefits

This part of the report is optional; if you choose to include it, it gives you a chance to justify your program's budget and financial choices.

If you do include it, we suggest covering:

  • Costs associated with the initiative (not only financial costs, but costs in terms of resources, energy, results, and staff/volunteer hours)
  • Methods used to come up with the budget
  • Benefits from the program (both financial and non-financial)

Section VII - Conclusions

After writing all this up, it may be tempting to dash off a quick conclusion to the report, but resist that temptation!

This is a very important part of the report, because this is where you make your recommendations:

  • What major conclusions about the initiative can be reached as a result of this evaluation?
  • Is there anything you feel should not be judged at this time, and if so, why?
  • Based on the evaluation results, what recommendations can you make for the program?
  • If the evaluation gives you any idea of what the future holds for the initiative, what would that be?
  • What worked well about the evaluation? What didn't work so well?
  • What recommendations do you have for anyone doing future evaluations with the program?