News

ARTD wins Best Public Sector Evaluation Award


September 2014

ARTD, in conjunction with the Australian Department of Foreign Affairs and Trade, has won the Australasian Evaluation Society’s Best Public Sector Evaluation Award for 2014. The award recognises public sector evaluations that have been used to effect real and measurable change in policies or programs. In the evaluation of the Australian Volunteers for International Development (AVID) program, ARTD’s Andrew Hawkins, Emily Verstege, Chris Milne and Ofir Thaler worked in partnership with the Department’s Office of Development Effectiveness (ODE) and other stakeholders to deliver a high-quality and useful evaluation with clear and actionable recommendations. ODE undertook extensive stakeholder consultation at all stages, including subjecting the evaluation to multiple rounds of peer review and public debate. This award follows the endorsement of the evaluation by ODE’s Independent Evaluation Committee for its strong data analysis. The full report and management response to the recommendations are available on the ODE website.


ARTD at AES conference

September 2014

ARTD was a major sponsor of the Australasian Evaluation Society’s 2014 conference. Nearly 350 delegates, including five of ARTD’s consultants, converged on Darwin from 8 to 12 September to explore ways to unleash the power of evaluation. There was a strong emphasis on realist and Indigenous evaluation. Keynote speakers included Professor Jean King, Professor Per Mickwitz, Professor Steve Larkin and Assistant Professor Peter Mataira.

Two of our staff, Dr Margaret Thomas and Florent Gomez-Bonnet, presented on methods for assessing the effectiveness of partnerships, drawing on several recent projects. They combined three quantitative methods (a partnership survey, an integration measure and social network analysis) that together cover the various dimensions of a partnership, including the overall partnership arrangements, what is shared between specific organisations, and individual interactions. This multi-method approach generates more robust and comprehensive findings. You can access their presentation here.
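The presentation itself is the authoritative account of the method. As a rough illustration of the social network analysis component only, the sketch below uses Python's networkx library to compute simple network-level and organisation-level measures for a hypothetical partnership; the organisation names, ties and measures chosen are invented for illustration, not drawn from ARTD's projects.

# Illustrative sketch only: a toy social network analysis of a
# hypothetical partnership. Organisations and ties are invented.
import networkx as nx

# Each edge represents a reported working relationship between two organisations.
ties = [
    ("Agency A", "NGO B"),
    ("Agency A", "NGO C"),
    ("NGO B", "NGO C"),
    ("NGO C", "Service D"),
]

G = nx.Graph()
G.add_edges_from(ties)

# Network-level measure: density (share of possible ties that actually exist).
print(f"Density: {nx.density(G):.2f}")

# Organisation-level measure: degree centrality (how connected each partner is).
for org, score in sorted(nx.degree_centrality(G).items(), key=lambda x: -x[1]):
    print(f"{org}: {score:.2f}")

In practice, results like these would sit alongside the partnership survey and integration measure rather than stand alone, which is the point of the multi-method approach.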

Andrew Hawkins also facilitated a roundtable on open evaluation and peer review. Participants discussed the need for, and processes to support, more efficient, rigorous, scientific and democratic or open evaluation. The roundtable elicited widely divergent views from prominent evaluators on the need to ensure better access to evaluation reports, conduct peer reviews of evaluation quality, and synthesise knowledge about intervention types from multiple evaluations.


Article on the case for realist evaluation published in peer-reviewed journal

September 2014

Learning Communities: International Journal of Learning in Social Contexts has published Andrew Hawkins’ article on the case for experimental design in realist evaluation. It argues for the use of experimental approaches to test realist theory and estimate effect sizes. This meets the needs of policy makers who are sympathetic to realist approaches to evaluation but would ordinarily seek a randomised controlled trial to measure outcomes. The article demonstrates how the approach can work in practice, using ARTD’s evaluation of a youth mentoring program as a case example. The journal is open access and the Special Issue: Evaluation from September 2014 can be found here. You can also download an individual copy of Andrew's article here.
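Andrew's article is the authoritative account of the argument. As a rough sketch of the general idea of pairing effect estimation with realist subgroup theory, the example below computes a standardised effect size (Cohen's d) separately for two hypothetical subgroups that a realist theory might distinguish in a youth mentoring program; the subgroup labels and all outcome data are invented for illustration.

# Illustrative sketch only: effect sizes estimated separately for subgroups,
# in the spirit of testing a realist "what works for whom" theory within an
# experimental design. All data are invented for illustration.
import math
import statistics

def cohens_d(treatment, control):
    """Standardised mean difference with a pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    s1, s2 = statistics.stdev(treatment), statistics.stdev(control)
    pooled_sd = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
    return (statistics.mean(treatment) - statistics.mean(control)) / pooled_sd

# Hypothetical outcome scores for two subgroups a realist theory might distinguish.
subgroups = {
    "engaged mentees":    {"treatment": [68, 74, 71, 80], "control": [60, 62, 65, 58]},
    "disengaged mentees": {"treatment": [61, 59, 63, 60], "control": [58, 62, 60, 57]},
}

for name, data in subgroups.items():
    print(f"{name}: d = {cohens_d(data['treatment'], data['control']):.2f}")

Contrasting effect sizes across theory-defined subgroups is one simple way an experimental estimate can speak to a realist question about for whom and in what circumstances a program works.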


ARTD develops Standard Client Outcome Reporting

September 2014

Shifting the focus of performance measurement from outputs to outcomes is a common aim among government agencies. But outcomes are harder to measure than outputs, particularly if funded services are using different tools to collect and record outcomes data. Our recent work with the Commonwealth Department of Social Services (DSS) on a streamlined approach to programme performance reporting included developing a Standard Client Outcome Reporting (SCORE) methodology. This makes it possible for a range of services to collect outcomes data in the way that best suits their context while providing it to government funders in a consistent format. Services use a standard approach to translate the data they collect into a five-point rating scale for agreed client outcome domains. You can find more information about how DSS is using SCORE in their new Data Exchange Framework here.
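The DSS Data Exchange documentation is the authoritative description of SCORE. As a rough sketch of the underlying idea, the example below shows how a service might translate results from its own outcome tool onto a five-point rating for an agreed outcome domain before reporting; the domain name, score bands and client records are invented for illustration.

# Illustrative sketch only: translating a service's own outcome measure
# into a five-point SCORE-style rating for an agreed outcome domain.
# The domain name, bands and client data are invented for illustration.

def to_five_point_rating(raw_value):
    """Map a hypothetical 0-100 questionnaire score onto a 1-5 rating."""
    bands = [(80, 5), (60, 4), (40, 3), (20, 2)]
    for threshold, rating in bands:
        if raw_value >= threshold:
            return rating
    return 1

# Client records as collected in the service's own format.
clients = [
    {"client_id": "001", "domain": "Mental health and wellbeing", "raw": 72},
    {"client_id": "002", "domain": "Mental health and wellbeing", "raw": 35},
]

# Consistent format reported to the funder: one rating per client per domain.
for record in clients:
    print(record["client_id"], record["domain"], to_five_point_rating(record["raw"]))

The translation step is what lets services keep their preferred collection tools while funders receive comparable ratings across the program.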


Get better value from evaluation webinar

September 2014

Good evaluation can have a big impact, but it is not easy to get right. Andrew Hawkins will provide ten practical pointers to help you maximise the value of your organisation’s monitoring and evaluation expenditure in a webinar for the American Evaluation Association on 4 September 2014. The tips are drawn from ARTD’s 25 years of experience evaluating government policies and programs.


ARTD evaluations on OEH website

September 2014

NSW Treasury's Centre for Program Evaluation recently commended the Office of Environment and Heritage on their Energy efficiency program evaluation page, noting that publishing reports online is consistent with the NSW Government Evaluation Framework. The page includes two evaluations conducted by ARTD. The first is an overall evaluation of the NSW Energy Efficiency programs to June 2012. The second is an interim evaluation of the Home Power Savings Program (HPSP), one of the programs established under the former Energy Efficiency Strategy.