The fifth stage of the Public Health Approach to Injury Prevention is evaluation and involves assessing and making judgements about whether the intervention implemented in stage four made an impact on preventing injuries in the community.
Evaluation is important for: assessing whether an intervention achieved its intended aim; understanding why it may not have achieved that aim; measuring the efficiency of the intervention; assessing the sustainability of the intervention; and informing how the intervention could be improved.
How do I structure the evaluation of my intervention?
Ideally you should be planning for your evaluation at the same time as you are planning your intervention, to save time and ensure you are collecting all of the required data.
When establishing your evaluation plan, it is important to determine what you want and need the evaluation of your project to tell you. Evaluation questions generally cluster around three main constructs (what happened, was the intervention successful and what have we learnt); however, the scope of your evaluation should be tailored to your project.
There are a number of templates that can be utilised to structure your evaluation plan, including a program logic model. This model provides a pictorial snapshot of the proposed intervention and its core elements, supports the identification of any gaps within the plan and clarifies which elements will be measured. Click here to access a template modified from the Department of Health WA’s Research and Evaluation Framework, to assist you in developing a program logic model.
Types of evaluation
There are three types of evaluation, each addressing a different aspect of the intervention and collectively contributing to the evaluation of the overall intervention:
- Process evaluation measures the intervention activities and the extent to which the intervention has been implemented as planned.
- Impact evaluation usually correlates to the program objectives as it is concerned with the immediate and short-term effects of the program.
- Outcome evaluation focuses on the longer-term effects of the program and usually relates to the program aims.
Once you have identified the scope of your evaluation plan and the key questions that need to be answered, you can identify what tools would assist in collating the required information. These tools can be quantitative in nature, meaning they collect numerical data (e.g., pre/post surveys and hospitalisation data), or qualitative, meaning they collect written or spoken data (e.g., interviews or document analysis). Depending on what you are evaluating and the resources you have available, your evaluation design may include both quantitative and qualitative methods.
Regardless of the evaluation design that you choose, you should try to ensure that it utilises validated tools and is made rigorous by incorporating a range of methods. To truly demonstrate the impact of your intervention, the evaluation design must show what would have happened in the absence of the intervention.
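One common way to approximate "what would have happened in the absence of the intervention" is to track a comparison community alongside the intervention community. The sketch below illustrates the arithmetic with entirely invented injury hospitalisation counts; the figures, community labels and `percent_change` helper are illustrative assumptions, not data from any real program.

```python
# Hypothetical sketch: comparing pre/post injury counts in an intervention
# community against a comparison community. All figures are invented.

def percent_change(pre, post):
    """Percentage change from the pre-intervention to the post-intervention period."""
    return (post - pre) / pre * 100

# Invented annual injury hospitalisation counts
intervention_pre, intervention_post = 120, 90
comparison_pre, comparison_post = 115, 110

intervention_change = percent_change(intervention_pre, intervention_post)
comparison_change = percent_change(comparison_pre, comparison_post)

# The comparison community approximates the counterfactual: the estimated
# net effect is the change in the intervention community beyond the
# background trend seen in the comparison community.
net_effect = intervention_change - comparison_change

print(f"Intervention community change: {intervention_change:.1f}%")
print(f"Comparison community change:   {comparison_change:.1f}%")
print(f"Estimated net effect:          {net_effect:.1f} percentage points")
```

In this invented scenario the intervention community improved by 25% while the comparison community improved by about 4%, so only the difference between the two is plausibly attributable to the intervention.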
Data analysis and interpretation
Data analysis involves identifying and summarising the key findings, themes and information gathered. The analysis and interpretation of the data may primarily occur at the end of the project; however, ongoing analysis supports the evolution of the project during its implementation.
Qualitative data can be analysed by identifying and grouping data based on major themes, while quantitative data is commonly analysed by frequencies or percentages.
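A frequency-and-percentage summary of the kind described above can be produced with a few lines of code. This is a minimal sketch assuming categorical pre/post survey responses; the response categories and data are invented for illustration.

```python
# Minimal sketch: summarising quantitative survey data by frequencies and
# percentages. The responses below are invented for illustration.
from collections import Counter

pre_responses = ["never", "sometimes", "sometimes", "always", "never", "never"]
post_responses = ["sometimes", "always", "always", "always", "sometimes", "never"]

def frequency_table(responses):
    """Return each response category with its count and percentage of the total."""
    counts = Counter(responses)
    total = len(responses)
    return {category: (count, count / total * 100)
            for category, count in counts.items()}

for label, responses in [("Pre", pre_responses), ("Post", post_responses)]:
    print(label)
    for category, (count, pct) in sorted(frequency_table(responses).items()):
        print(f"  {category}: {count} ({pct:.1f}%)")
```

Comparing the pre and post tables side by side makes shifts between response categories visible at a glance, which is often all an impact evaluation of a small program needs.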
To minimise bias when interpreting the results, it may be appropriate to ensure that the person responsible for data analysis is different from the person responsible for implementing the intervention.
Recommendations, dissemination and communication
Whatever data analysis techniques are used and whatever the results show, developing recommendations, disseminating the findings and gaining an understanding of why the results occurred mean your project’s evaluation can play a significant role in supporting the development of future interventions.
The format of your results and the type of communication activities conducted will vary depending on the target audience for their dissemination. For example, reports for funding bodies would differ in style and detail from presentations given to the wider community.