Put together an evaluation team of key or interested staff and managers and begin by developing a ‘program logic’. A program logic (also known as a logic model) clearly states why a program is being delivered, what issue it is addressing or changing, and what outcomes are expected. Your team can then use the logic model to agree on an evaluation framework and methodology, depending on what you want to learn or discover.
When you have clearly documented your preferred evaluation approach, you will need to develop some performance indicators that are relevant, meaningful and within your control. These performance indicators will help you decide on your methodology: the manner in which you are going to collect objective and reliable information related to each indicator.
Your evaluation team should decide how extensive the evaluation is going to be. The size of the evaluation depends on your aims, skills, resources, budget and time. If your organisation does not have the budget to engage a consultant, you will need to up-skill your program staff so they can conduct the evaluation themselves. If your organisation needs to collect a large amount of data through interviews or surveys, you can either conduct the evaluation yourself, if you have the time and skills, or engage an independent consultant to conduct it for you.
Before you move on to the implementation phase, ensure you have gained formal support and endorsement of the proposed evaluation from your organisation’s board. Without formal support for your efforts, it is likely that the evaluation will be incomplete or selective in its investigations and therefore in its results. The outcomes will not be widely accepted and may lead to inaccurate recommendations.
Implement your chosen data collection methods and ensure an ongoing commitment to the evaluation. If even one staff member or volunteer does not really understand why they are collecting the data, your results may be inaccurate.
Collect the relevant data over an appropriate period of time. One month may be adequate for short-term projects, but one year may be more appropriate if you are evaluating a larger and ongoing program. Some evaluations may need data collection to occur at the beginning as well as the end of a project, or at regular intervals, and even after the project has finished (to assess long term outcomes).
In most cases, it is best to conduct a ‘pilot’ at an appropriate time after you have started collecting the data. A ‘pilot’ means testing your data collection and results to see if any problems need ironing out before you go any further. After the pilot, continue to monitor the ongoing progress of your data collection throughout the project. If you are collecting a large amount of data, or collecting data over a long period, ensure that it is still relevant and staff remain enthusiastic and clear about the process.
Making sense of the information you have collected is the most important task! This is where you really discover if your planning and data collection methods were successful. There are many ways of collecting both quantitative and qualitative data, which means there are many ways of analysing and interpreting this data as well.
You need to assess the information collected and sort it into meaningful evidence. The manner in which it is presented will determine what conclusions are drawn. For example, if you report that 80% of people completing a vocational course secured work experience placements, this could suggest a positive outcome and that your program was a success. However, if that same 80% struggled during the placement and left before completing it, the finding might instead demonstrate that your program needs to do more life-skill and preparatory work before sending participants on work experience.
Hopefully, you will find that the information collected confirms you have met your performance indicators, shows the number of outcomes produced, and demonstrates what impact the organisation or program is having. The data may also reveal the efficiency and cost effectiveness of the program, including the total cost of delivering the identified outcomes.
Be careful that you do not fall into the trap of only collecting and reporting on positive outcomes and indicators. Evaluations that are objective and willing to state what did not go so well, or identify areas of weakness, are much more helpful and realistic. You will learn a great deal about a program by understanding its challenges and areas that need to be changed or modified.
You should present the findings of your evaluation by organising the information you have collected, analysing it, and then describing the results accurately and objectively. How you do this will depend on the purpose of the evaluation and the needs of your primary audience.
If you decide to include recommendations in your final report, then there should be a clear relationship between the findings and your recommendations. Recommendations should draw directly on what you have learned in your evaluation and what you know about the program.
Involve as many people from your organisation as possible in reviewing the findings of the evaluation. Evaluations should help you make informed decisions that benefit the organisation, enabling you to change a service or program based on the findings. However, be aware that if your data is too ambiguous or unreliable, you could be making decisions based on a lack of evidence or on subjective opinions and judgements. This could harm your organisation and cause conflict, rather than improving your services and programs.
After the review, return to the planning process to evaluate whether your changes have improved the program.
- Do not avoid evaluations because they take up too much time. The benefit you will gain from completing an evaluation, however small, far outweighs the effort required
- Start collecting information now. Collecting information over the life of a program is vital to the evaluation process, so do not overlook it
- Try to include qualitative data in your evaluation methodology. Quotes and personal stories are a powerful way to support findings from questionnaires
- Avoid reporting just the successful elements of the program. Be objective and report about findings that can help you improve the program
- Once your evaluation is complete, disseminate it widely. Be generous with your communication and help your funders and community partners learn from the process