By Director, Jade Maloney
Last week we looked at how Australasian Evaluation Society members have successfully overcome some of the obstacles to evaluation use they have encountered. This week we look at the main brick walls to evaluation use that my research with the Australasian Evaluation Society identified, and some byways around them.
Politics: Unsurprisingly, given the intertwining of politics and policy, politics was the most commonly identified brick wall to use of evaluation findings. However, some evaluators had found byways around political roadblocks.
Resourcing: Linked to the political roadblock, lack of resourcing was also commonly identified as a brick wall to use of evaluation findings. When governments change, priorities can also change and resources become unavailable. Lack of resourcing can also limit the take-up of findings from pilot program evaluations. However, equipping communities with evaluation findings can help them attract other sources of funding.
Timing: Some evaluators reported that delays between a report being written and its release can limit the relevance of findings for use. In contexts with staff on short-term contracts or on rotation, staff may have moved on, and the project may have ended, by the time findings are available.
More broadly, limited timeframes for evaluation can restrict the type of questions an evaluation can answer and, thereby, the ways in which it can be used. In such cases, it was important to settle on reasonable indicators.
Cross-agency collaboration: Two evaluators involved in cross-agency initiatives said that the blurred lines of responsibility in these contexts could create roadblocks to evaluation use. Cross-agency response systems are needed to address this.
Dissemination of findings: Lastly, some evaluators identified a lack of processes for broadly disseminating evaluation findings as a barrier to use. This is particularly important given evidence that an accumulation of evaluation evidence is needed before change occurs. However, the increase in policies requiring government agencies to publish findings is shifting this roadblock (although this may have its own implications for the type of learnings that get reported). Additionally, one internal evaluator said their agency had established an internal ‘lessons learned’ register to support broader use of findings, while another had established evaluator networks to share learnings. Byways identified by external evaluators included presenting at conferences, specifically identifying broader learnings for policy design and delivery, and negotiating with clients to share general learnings from evaluations with other clients.