Learn

Effective use of evidence is crucial to informing the development of effective and appropriately targeted policies and programs, as well as ongoing program management and strategic decision-making. We provide tailored training and capacity building projects to meet the specific needs of individual clients.

Our services include:

  • development of organisation-wide evaluation policies and plans
  • advice on evaluation and monitoring
  • tailored training and workshops in program logic development, evaluation concepts and performance monitoring
  • mentoring for individuals and teams delivering internal evaluation projects
  • development of evaluation toolkits and guides
  • peer review of evaluations.

Our senior staff use their specialist skills and expertise, along with an understanding of effective adult learning techniques and change management processes, to ensure the knowledge gained through our training and capacity building projects is used in the workplace.

Capacity building services can be delivered as a stand-alone project or as part of the delivery of a larger evaluation or monitoring project. We recently worked with Professor Patricia Rogers of RMIT University to develop the NSW Evaluation Toolkit.


Project examples

Literature review on impact evaluation methods

(Department of Industry and Science, 2015)

Impact evaluation is a broad topic that includes descriptive, causal and evaluative questions. The appropriate approaches and designs depend on the nature of the program, the questions being asked, and the time and resources available. This project was conducted in conjunction with Professor Patricia Rogers of RMIT and betterevaluation.org to provide the department with advice on selecting impact evaluation methods. The report was provided in the context of loud calls to apply randomised controlled trials (RCTs) to improve the rigour of the evidence base for public policy. It identifies a key role for RCTs, but one that is subordinate to the overarching goal of evaluation: generating reliable and valid evidence to inform decision-making.

The report was presented to the department in a panel format that included discussion of different perspectives on impact evaluation. It has not only stimulated discussion within the department, but also provided advice on different approaches to impact evaluation depending on the nature of the program, the questions being asked and the resources available. The report has been published on the Department’s website and can be found at http://www.industry.gov.au/Office-of-the-Chief-Economist/Publications/Documents/Impact-evaluation-report.pdf

Program logic and evaluation training for Department of the Environment

(Department of the Environment, 2015)

We conducted workshops on program logic with Departmental staff. We first introduced the concept and approaches using case studies, then supported staff to work on logics for their own programmes. The workshops further explored how program logic is used to design monitoring and evaluation plans.

WorkCover Evaluation Capacity Building

(WorkCover Authority NSW, 2015)

We were engaged to assist WorkCover to increase the return on investment they obtain from monitoring and evaluation. They had traditionally relied heavily on survey data collection and were experiencing ‘survey fatigue’ and low response rates. This problem required an analysis of contemporary approaches to evaluation to identify cost-effective approaches suited to the WorkCover context. We provided a report in two parts. The first outlined different purposes of and approaches to evaluation, highlighting the importance of performance monitoring and formative evaluation for new projects and the need to delay sophisticated impact evaluation until an intervention is sufficiently definable, significant and mature. The second brought together the latest research on achieving high-quality survey data through high response rates across 18 different survey delivery modes. The report has been used to inform the evaluation design for new projects and make immediate improvements to survey techniques.

Peer review of an outcomes evaluation conducted by NSW Treasury

(NSW Treasury, 2015)

Peer reviews provide an expert independent appraisal of the quality of an evaluation. This peer review covered the evaluation of the NSW Centre for Program Evaluation’s Community Justice Program, which used a quasi-experimental propensity score matching method to measure program outcomes. The review identified some issues with statistical power, and provided advice on the statistical analysis of complex interventions in complex systems, drawing on the latest guidance on reporting statistical data from major international journals.
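For readers unfamiliar with the method named above, the following is a minimal, illustrative sketch of how propensity score matching estimates a treatment effect from non-randomised data. It uses simulated data and a hand-rolled logistic model; it is not the analysis from the evaluation under review, and all names and parameter values are invented for illustration.

```python
import math
import random

random.seed(0)

# Simulate a toy dataset: a covariate x drives both program take-up and the outcome,
# so a naive treated-vs-control comparison would be biased by selection.
n = 400
data = []
for _ in range(n):
    x = random.gauss(0, 1)
    p_take_up = 1 / (1 + math.exp(-0.8 * x))        # selection on x
    t = 1 if random.random() < p_take_up else 0
    y = 2.0 * t + 1.5 * x + random.gauss(0, 1)      # true treatment effect = 2.0
    data.append((x, t, y))

# Step 1: fit a logistic propensity model P(t = 1 | x) by gradient ascent.
b0, b1 = 0.0, 0.0
for _ in range(2000):
    g0 = g1 = 0.0
    for x, t, _ in data:
        p = 1 / (1 + math.exp(-(b0 + b1 * x)))
        g0 += t - p
        g1 += (t - p) * x
    b0 += 0.01 * g0 / n
    b1 += 0.01 * g1 / n

def propensity(x):
    return 1 / (1 + math.exp(-(b0 + b1 * x)))

treated = [(propensity(x), y) for x, t, y in data if t == 1]
controls = [(propensity(x), y) for x, t, y in data if t == 0]

# Step 2: match each treated unit to the control with the nearest propensity
# score (1:1 matching with replacement) and average the outcome differences.
att = 0.0
for ps, y in treated:
    _, y_match = min(controls, key=lambda c: abs(c[0] - ps))
    att += y - y_match
att /= len(treated)
print(round(att, 2))  # the matched estimate should be close to the true effect of 2.0
```

The point of the sketch is the logic, not the implementation: because treated and control units are compared only where their estimated probability of take-up is similar, the selection bias that the covariate would otherwise introduce is largely removed. The statistical-power issue noted in the review arises because matching typically discards or reuses observations, reducing the effective sample size.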

Mentoring for the evaluation of the Community Proposal Pilot

(Department of Immigration and Border Protection, 2015)

Principal Consultant, Andrew Hawkins, is providing mentoring services for the internal team evaluating the Community Proposal Pilot. This has involved working with the team to conceptualise their project as something that could be evaluated, and to design an evaluation that is cost-effective and will generate useful information to inform decision-making.

The project commenced in late 2013 with a workshop to develop an evaluation framework and strategy, focusing on program logic and the key information needs of the Department and other stakeholders. Andrew’s mentoring role has continued through regular meetings and feedback and advice on draft evaluation plans and reports. The mentoring project has been extended for the duration of the pilot programme to June 2015.

Design an online evaluation toolkit for the NSW Government

(RMIT University, 2014)

The NSW Government introduced a sector-wide Evaluation Framework in 2013. ARTD, in conjunction with Professor Patricia Rogers of RMIT University, designed, wrote and piloted an online Evaluation Toolkit to support the Framework. It covers the core elements of a significant program evaluation, with a focus on a rigorous outcome evaluation. The toolkit is linked to the BetterEvaluation website, and is used as a key reference in the NSW public sector.

Mentoring for managers doing program evaluations

(Department of Immigration and Border Protection, 2014)

ARTD advised and supported programme managers to design and conduct evaluations of their programmes, including a programme to support refugee youth in Australia, the settings for a visa programme, and English language training for staff at overseas posts. ARTD also supplied technical services for surveys, interviews and data analysis. The mentoring substantially progressed the evaluations and gave the Department’s staff practical knowledge and skills in conducting an evaluation project.

Evaluation of strategy to build departmental evaluation capacity

(Department of Immigration and Border Protection, 2014)

This four-year strategy covered organisational and individual aspects within a rapidly changing environment, with a focus on training for Department of Immigration staff and support for evaluation projects. ARTD evaluated the strategy’s fit for purpose, effectiveness of implementation and the outcomes achieved. The methods used were an analysis of available data, a scan of the literature, surveys and interviews with stakeholders, including a sample of senior executives as customers for evaluation. To frame the outcomes, we developed an organisational maturity matrix for evaluation. The findings were used to inform the strategy for the next four years and refine how it is monitored and evaluated.

RSPCA evaluation capacity building

(RSPCA NSW, 2014)

RSPCA NSW runs community outreach programs to address the health and welfare impacts that separation from pets can have on people experiencing hardship. ARTD worked with the RSPCA to develop an evaluation framework for each of these programs, commencing with workshops to develop a program logic. The RSPCA is using the program logic to develop a monitoring and evaluation strategy.

Program logic workshop and reports

(Department of Immigration and Border Protection, 2013)

To build evaluation capacity in the department, individual ARTD Principal Consultants conducted workshops on program logic and evaluation with staff in a range of programmes, including the 457 Visa, Refugee Youth Support, Member of Family Unit, English Language Training for offshore staff, and Customary Adoption. The workshops helped staff to review their strategies and develop their monitoring and evaluation.

Map capacity for evaluation in key NSW Government agencies

(NSW Department of Premier and Cabinet, 2011)

We designed a framework then applied it through interviews with senior managers to profile evaluation capacity across key agencies involved in COAG National Partnerships. The mapping informed future directions for central agencies to support evaluation.

Scope evaluation guidelines for NSW agencies

(NSW Treasury, 2010)

ARTD was engaged to develop a scoping paper for a set of evaluation guidelines for the NSW public sector. The project identified best practice components of evaluation guidelines developed by NSW agencies, and reviewed the use of evaluation guidelines in other jurisdictions, including the United Nations, the World Bank and the OECD. It outlined the scope and content of possible guidelines to promote wider and more effective use of evaluation by NSW Government agencies, including the development of evaluation and meta-evaluation strategies and of information and technical advice to guide evaluation activity.