Brick walls or byways to evaluation use


September 2017

By Director, Jade Maloney

Last week we looked at how Australasian Evaluation Society members have successfully overcome some of the obstacles to use they have encountered. This week we look at the main brick walls to evaluation use that my research with Australasian Evaluation Society members identified, and some byways around them.

Politics: Unsurprisingly, given the intertwining of politics and policy, politics was the most commonly identified brick wall to use of evaluation findings. However, some had found byways around political roadblocks. These were:

  • keeping the report in the bottom drawer until the context is right
  • arming communities with evaluation findings so they can encourage governments to act or to negotiate alternative sources of funding when required
  • using the evaluation to speak truth to power if the internal context is right
  • focusing on conceptual instead of instrumental use – the knowledge that can be applied in other projects.

Resourcing: Linked to the political roadblock, lack of resourcing was also commonly identified as a brick wall to use of evaluation findings. When governments change, priorities can also change and resources become unavailable. But lack of resourcing can also limit the take up of findings from pilot program evaluations. However, equipping communities with evaluation findings can help them attract other sources of funding.

Timing: Some evaluators reported that delays between a report being written and it being released can limit the relevance of findings for use. In contexts with staff on short-term contracts or on rotation, staff may have moved on, and the project may have ended, by the time findings are available.

More broadly, limited timeframes for evaluation can limit the type of questions that can be answered by an evaluation and, thereby, the ways in which it can be used. In this case, it was important to settle on reasonable indicators.

Cross-agency collaboration: Two evaluators involved in cross-agency initiatives said that the blurred lines of responsibility in these contexts could create roadblocks to evaluation use. Cross-agency response systems are required to address this.

Dissemination of findings: Lastly, some evaluators identified a lack of processes for broadly disseminating evaluation findings as a barrier to use. This is particularly important given evidence of the need for an accumulation of evaluation evidence before change occurs. However, the increase in policies requiring government agencies to publish findings is shifting this roadblock (although this may have its own implications for the type of learnings that get reported). Additionally, one internal evaluator said their agency had established an internal ‘lessons learned’ register to support broader use of findings, while another had established evaluator networks to share learnings. Byways identified by external evaluators included presenting at conferences, specifically identifying broader learnings for policy design and delivery, and negotiating with clients to share general learnings from evaluations with other clients.

Overcoming obstacles to evaluation use


September 2017

By Director, Jade Maloney

An evaluator might weep at the plethora of literature that shows their work going unused. But my recent research with Australasian Evaluation Society members shows evaluators have had success in overcoming at least some of the obstacles to use they have encountered.

Disinterest in, or more active resistance to, evaluation: Some evaluators I interviewed had been able to overcome this by: selling the value of evaluation; engaging stakeholders where they are at and from a ‘what’s in it for me?’ perspective; and involving stakeholders in a dialogue as data is collected to answer their questions.

Fear of judgement: In some cases, resistance to evaluation was related to a fear of being judged and a misconception about evaluation. Some evaluators had overcome this obstacle by convincing stakeholders they were there to help them with their work, using their interpersonal skills to make connections with stakeholders.

In some cases, fear related to evaluation construed as an accountability or compliance mechanism rather than a learning exercise. To address this, some evaluators emphasised learning and strengths-based approaches. Others, however, suggested a need to combine accountability and learning purposes or saw accountability in a more positive light.

Lack of purpose: When they encountered evaluations undertaken as a ‘tick-a-box’ compliance activity or without a clear purpose, some evaluators had been able to help stakeholders find a purpose. They helped them work through the questions they wanted to ask, and who would use the evaluation.

Negative findings: Given evidence of the human tendency to accept information that confirms our preconceptions and to refute information that challenges them (Oswald & Grosjean, 2004), it is unsurprising that evaluators had encountered more resistance to negative findings (and positive findings about programs that were slated to be discontinued). To overcome this obstacle, evaluators prepared stakeholders from the outset for what they might not expect, surfacing biases that needed to be overcome. When negative findings eventuated, they shifted resistance by socialising findings as they emerged, using the positive-sandwich approach to frame findings, and/or focusing on lessons learned and solutions. However, it was noted that these strategies can fail when working with organisations that lack a learning culture and when findings are politically unpalatable.

Next time, a look at the common brick walls to evaluation use and whether evaluators have alternative routes around them.

Sue Leahy joins AES Board


September 2017

Our Managing Principal Consultant, Sue Leahy, has joined the ranks of the Australasian Evaluation Society (AES) Board.

Sue is looking forward to working with other Board members to strengthen the role of evaluation in Australasia and shape an AES that continues to meet the needs of its members for professional growth and networking.

It’s an exciting time for the AES, as we contemplate appropriate pathways to professionalisation of evaluation and continue to develop the interest and capacity for evaluation among Indigenous people. 

Congratulations Lex Ellinson – AES Emerging New Talent!


September 2017

Senior Consultant, Alexandra (Lex) Ellinson, was awarded the Emerging New Talent award at the Australasian Evaluation Society International Evaluation Conference.

The Awards Committee praised the scale of what she has achieved in just four years as well as her commitment to the profession.

Lex has designed and managed large-scale long-term evaluations of youth policy and social housing and worked in partnership with Indigenous communities. Her approach is informed and pragmatic – she has managed projects with quasi-experimental, realist-informed and co-design approaches to suit different contexts and different purposes.  

As Secretary of the AES’ Advocacy and Alliances Committee, she works to promote the use of evaluation and evaluative thinking by Australian agencies and organisations. She is also a member of the Realist Special Interest Group.

Lex discovered evaluation as a profession while studying philosophy and social sciences at university in the early 2000s, and sensed that, “evaluation nicely brings together my academic interest in the rational grounds for making normative judgements based on empirical data, which is a longstanding philosophical question.”

Accepting the award, Lex couldn’t help but demonstrate the reflective thinking that we know and love. She reflected on the inquisitive minds and commitment to doing good that she’d encountered at the conference and her excitement at being part of a profession capable of having such wide-ranging conversations.  At Gill Westhorp’s pre-conference workshop, they’d discussed ‘everything from the ontology of a tea cup to patronage systems in Laos’. 

Congratulations Lex. We’re sure there are more great things to come!

Can evaluation live up to its potential?


September 2017

On a chilly Canberra morning, Sandra Mathison lit a fire under a few hundred evaluators at the Australasian Evaluation Society (AES) International Evaluation Conference.

“Evaluation is a helping profession,” she told us. But it has long underperformed on its promise to support the public good because it is constrained by dominant ideologies, is a service for those with resources, and works in closed systems that tend to maintain the status quo.

Mathison challenged us to think about how we could turn this around, suggesting two ways forward. The first was ensuring those who are meant to benefit from a program have input into evaluation and access to the findings so we are ‘speaking truth to the powerless’. (No, that’s not a typo; the more clichéd ‘speaking truth to power’, she said, was ‘a valiant but often futile act’.) The second was more ‘independent’ evaluation – that is, evaluation independent of funding agencies and perhaps even independent of individual programs in ‘place-based’ evaluation. (Intrigued? Read about the rest of her speech in Stephen Easton’s piece in The Mandarin.)

On Day 2, Richard Weston, CEO of the Healing Foundation, took up the torch and challenged us to reflect on how evaluation could recognise and value Indigenous knowledges, and collective as well as individual outcomes. And on Day 3, Dugan Fraser told us that we could conceptualise evaluation not as assessing efficiency and effectiveness, but as supporting democracy – as a means of empowering citizens to interact with power and ask questions.

The concept of ‘speaking truth to the powerless’ clearly cut across the conference sessions and resonated with the audience. Less certain was the response to Mathison’s call for more ‘independent’ evaluation. One member of the audience asked Mathison whether we should not instead be focused on democratising evaluation.

Certainly, what Mathison was proposing is not inconsistent with participatory approaches and design thinking (another big theme of the conference). But it does seem somewhat at odds with Patton’s call in Utilisation-Focused Evaluation to engage all key stakeholders in the evaluation process.

In their session on policy logic, Carolyn Page and Russell Ayres set out the value of bringing all levels of expertise to the table. Our Director Jade Maloney highlighted the importance of engaging stakeholders, forging relationships and communicating well to support use – findings from her research into evaluation use in the Australian context.

Others identified ways to hardwire evaluation into government infrastructure. Economist Nicholas Gruen championed the idea of an Evaluator General. In another session, the Commonwealth PGPA Act 2013 was identified as providing an opportunity if evaluation can take up the challenge – that is, if evaluation can help agencies understand their performance, alongside monitoring and KPIs. This will require better integration across evaluations.

Thankfully, the arguments about hierarchies of evidence that dominated conference discussions a decade ago seem to have fallen by the wayside in the wake of more nuanced discussions about what constitutes good evidence. At the conference dinner, Nicholas Gruen set the challenge for evaluation to ask the right questions and noted that randomised control trials (RCTs) are not the panacea they’re made out to be. (Yes, an economist said this!) The next morning, ARTD Director Andrew Hawkins extended this idea by outlining a framework for choosing different methods depending on the causal questions being asked and the resources available.

All this, and we haven’t even mentioned how realist evaluator Gill Westhorp gave us five different ways to think about classifying mechanisms and prompted us to think more about how we use theory in evaluation at all levels. But there’ll be time for that at the International Conference for Realist Research, Evaluation and Synthesis in October.

So we’ll end with the themes that will stick with us. We need to work meaningfully with beneficiaries, trial and fail fast, share and build on learnings from success and failure, break down silos and engage in cross-disciplinary conversations. If we do this, we believe evaluation will live up to its promise.

Let’s continue the conversation through AES network meetings until we all get together again next year!


aes17 International Evaluation Conference

August 2017

ARTD is looking forward to discussing how to make evaluation a more useful and effective governance tool with other members of the Australian Evaluation Society (AES) next week at the aes17 International Conference in Canberra.

ARTD is again a sponsor of the event, which runs next week 4-6 September at the National Convention Centre. This year we’re sponsoring keynote speaker Sandra Mathison, who will examine whether evaluation contributes to the public good. For more information about the conference head to:

This year, two of ARTD’s directors are also presenting.

Andrew Hawkins’ presentation ‘Intervention: putting the logic back into logic models’ (Tuesday 5 Sep, 9.30am, Murray Room) will examine the shortcomings of logic models and challenge practitioners to better explain causal mechanisms. Click here for more information on Andrew’s presentation.

Jade Maloney’s presentation ‘Evaluation: what’s the use?’ (Tuesday 5 Sep, 3pm, Torrens Room) will highlight findings from her new research, which examines the problem of non-use and how practitioners can ensure their evaluations don't end up gathering dust on a shelf. Click here for more information on Jade’s presentation.

If you are at the conference, please come and speak to one of our senior staff.

Increasing access and inclusion in Fairfield


August 2017

On Monday 7 August, Fairfield City Council launched their disability inclusion action plan. The plan details what Council will do to encourage positive community attitudes and behaviours towards people with disability, make it easier to get around the community, support access to meaningful employment opportunities and improve access to Council information, services and facilities.

The launch represented the completion of all local council DIAPs in NSW, required under the NSW Disability Inclusion Act 2014. It was also a valuable opportunity to connect the local community with disability service providers, with Council hosting information stalls for local NDIS-registered providers.

Fairfield City Council’s Mayor launched the Plan. The launch was also attended by the Minister for Disability Services, Ray Williams, who explained that disability inclusion action planning is both “unique and unprecedented in Australia. It means that State and Local Governments are working together to ensure people with disability can fully enjoy the opportunities available in our communities.”

ARTD worked with Fairfield City Council to develop their DIAP, using a co-design approach grounded in the principle of ‘nothing about us without us’. We consulted people with disability, their family and carers and local service providers to understand the barriers people face when interacting with Council and getting around the community, and what they saw as the priorities.

To ensure the consultation process was accessible and inclusive, we used accessible venues with hearing loops installed, and drew on bilingual educators to conduct consultations in local community languages. We also worked with our partner, the Information Access Group, to produce an Easy Read consultation guide, as well as the final DIAP in Easy Read format.

We then worked with Council staff and managers to develop a realistic plan of action for the next four years and indicators to track progress. 

You can access Fairfield City’s DIAP on the Council website.  

[Photo by Fairfield City Council, featuring Mayor, Frank Carbone; Fairfield City Leisure Centres trainer Fred Zhao; and the Hon Ray Williams MP]

Automation set to transform the government service landscape – 2017 IPAA Conference

July 2017

A new round of digital transformation and automation is about to transform the government workforce and the way government delivers services.

The 2017 IPAA conference – of which ARTD was a key sponsor – emphasised the need for the government sector to think creatively about how digital technologies will work for staff and clients.

ARTD Managing Principal Consultant, Sue Leahy, says that the clear message at the conference was that while the focus tends to be on the technology, managing digital change starts with understanding the people issues.

Managing change across the sector will mean understanding how digital technologies can be used to improve the workplace or client experience.

We need to put people at the centre of digital transformation. That’s where innovative models of engagement will come in – to ensure we deeply understand what people need from government services.

Artificial intelligence (AI) technology is also advancing rapidly and will soon transform the way we work in government, reducing transactional, information sifting type work and creating new roles. AI technology has the power to reduce paperwork and change the way we deliver services but it will also require active change management as people’s roles are disrupted. New jobs such as robotic orchestration, robot teachers, empathy trainers, business ethics (reviewing inputs and outputs), and privacy guardians will be created.

To ensure that we get the best out of the technology on offer, it will be important that we start with a deep understanding of the human aspects of the problems technological change aims to solve.

Co-design creatively boosts access to primary health services for Aboriginal families

July 2017

How do you make Victoria’s Maternal and Child primary health services more accessible and culturally appropriate for Aboriginal families?

That’s the question ARTD, the Victorian Government and Aboriginal people have been grappling with in an innovative co-design project aimed at understanding the barriers that prevent Aboriginal families from using the universal primary health services.

As part of the Victorian Government’s $1.6 million program, Roadmap for Reform: Strong Families, Safe Children, ARTD was commissioned to provide a way for Aboriginal communities to have a voice in designing the service models they need.

The co-design process recognises that Aboriginal people bring experience and expertise about accessing services to the table. It reflects a new approach to designing services for people rather than asking them to fit the mould.

ARTD used a range of narrative-based techniques to deeply understand the needs of the target group before collaboratively generating ideas with stakeholders about the best ways of meeting those needs.

The process produced a service model to support self-determination and deliver high quality, coordinated and culturally safe Maternal and Child primary health services for Aboriginal families that will be trialled in local government, Aboriginal Community Controlled Organisations (ACCOs) and in integrated partnerships. 

Digital disruption and the IPAA NSW State Conference

June 2017

Digital technologies are continuing to transform the way that governments operate and how they interact with citizens and clients of services. Increased mobility, connectivity and data storage capacity provide opportunities to improve services and productivity.

We work with government agencies aiming to harness these opportunities. We can support agencies to ensure their system design is user-centred and accessible to people of all abilities. We can also assist agencies to engage with their clients and citizens through secure online platforms with the ability to tailor communications and surveys to particular target groups. We have also designed an innovative app for agencies to monitor and capture their client’s important life outcomes.  

To support digital transformation in NSW, ARTD is sponsoring the IPAA NSW 2017 State Conference on 15 June. This year’s theme is ‘Innovating, Emulating and Enabling: How the digital transformation is changing public sector skill sets.’ More than 20 experts will speak on topics such as data security, big data, and how the NSW Public Sector is currently embracing digital technologies. For more details about the IPAA and the Conference, visit:

If you’re at the conference come and speak to one of our staff at the coffee stand.

Find out more about the projects being funded to support the transition to the NDIS

April 2017

Wondering how people with disability and their families, as well as service providers are being supported to make the transition to the National Disability Insurance Scheme? What about initiatives to grow the market and develop innovative support options that respond to the choices of people with disability?

The NDIA, the Commonwealth and State and Territory Governments are undertaking a range of activities to support the transition. A Sector Development Fund was also established to fund activities that support the transition between 2012–13 and 2017–18. Combined, these activities will assist in the establishment of a flourishing disability support market that offers people with disability choice and control.

To date, a range of initiatives supporting people with disability and their families, providers, and workforce and market growth have been funded through the Sector Development Fund. You can find out more about what these activities are, and access the resources they have produced, through ARTD’s profiles of the funded projects now on the NDIS website.

ARTD’s Independent Review of Special Religious Education (SRE) and Special Education in Ethics (SEE) in NSW has been released

April 2017

In 2014, the NSW Department of Education commissioned an independent review of the implementation of SRE and SEE classes in NSW Government schools ‘to examine the implementation of SRE and SEE and report on the performance of the Department, schools and providers’. The Review was commissioned in response to Recommendation 14 of the Legislative Council General Purpose Standing Committee No. 2: Report No. 38 Education Amendment (Ethics Classes Repeal) Bill 2011 (May 2012), which also specified areas for the review to cover. These became the basis of the Terms of Reference:

1. The nature and extent of SRE and SEE

2. Department of Education implementation procedures for SRE and SEE including: parent/ carer choice through the enrolment process and opting out; approval of SRE and SEE providers by DoE; authorisation of volunteer teachers and curriculum by providers

3. Development of complaints procedures and protocols

4. SRE and SEE providers’ training structures

5. Registration of SRE and SEE Boards, Associations and Committees

6. New modes and patterns of delivery using technology

7. Pedagogy, relevance, age appropriateness of teaching and learning across all Years K to 10 and teaching and learning in SEE in Years K to 6 in a variety of demographics

8. The need for annual confirmation by parents and caregivers on SRE choice or opting out

9. Review of activities and level of supervision for students who do not attend SRE or SEE.

 The Review examined the implementation of SRE and SEE in NSW Government schools between December 2014 and September 2015. This report outlines findings related to each Term of Reference and makes recommendations. Because SRE and SEE are quite distinct, they are dealt with separately throughout this report. In the current context, there are polarised views in the community about the place of SRE or SEE in NSW Government schools. While the continuation of SRE or SEE in NSW Government schools is out of scope of this Review, this was a concern for many people and influenced responses to the Review.

 Review methodology

 The Review used a comprehensive mix of methods to collect quantitative data across all schools, and the wider community, as well as in-depth and qualitative data from key stakeholders. The methods were chosen to allow all interested stakeholders and the community the opportunity to present their views so that the findings and recommendations are based on a systematic and balanced assessment. Evidence was reviewed and data collected between December 2014 and September 2015.

 The main methods for the Review were:

 ▪ Document scan. Departmental and provider documents/ websites were reviewed, including the 2014 and 2015 SRE and SEE policy and implementation procedures, and the websites of all current providers in December 2014 for their SRE or SEE curriculum scope and sequence documents and outlines.

 ▪ Curriculum review. An experienced education expert conducted a systematic criterion-based assessment of curriculum materials, based on materials from current SRE providers and the current SEE provider.

 ▪ Consultations

Surveys and interviews. Systematic data were collected via surveys of key stakeholder groups. Opportunity to respond was offered to all principals (46% response rate), all SRE and SEE providers (80% response rate), all providers’ SRE coordinators (60% response rate) and all SEE coordinators (48% response rate). SRE and SEE teachers contributed via an online portal. These data were complemented by semi-structured interviews with members of the program evaluation reference group, and with peak provider, education and other relevant groups.

Case studies. To examine how SRE and SEE is delivered in schools at the local level, the Reviewers undertook 14 case studies involving 12 SRE providers from 11 faith groups, and two case studies of the delivery of SEE. The case studies used face-to-face interviews with coordinators, teachers, principals, and other stakeholders. They were effective in telling the story of local delivery in very different contexts.

Online community consultation. To collect perspectives from the broader community under the Terms of Reference, online contribution portals for parents/carers and other interested parties were set up and accessible for six months. The Review received over 10,000 responses, reflecting the high level of interest in sections of the community. The Reviewers recognise that while the responses reflect significant issues for those who responded, to some degree they reflect the two polarised positions in the community around SRE and SEE, and cannot be considered as representative of the whole NSW community. Indeed, the Reviewers are aware that some groups were active in encouraging their constituents to contribute, and in some cases suggested wording.

 Confidence in the findings

Overall, the Reviewers are confident that the findings from these methods reflect the broad patterns of implementation of SRE and SEE and provide a sound basis for addressing the Terms of Reference and making suitable recommendations. The methods were implemented effectively and there was a high degree of consistency between the wider findings from the surveys; the interviews/ group discussions with significant stakeholders; and the on-ground findings from the local case studies. The data from the online contribution portals is less balanced and has been used with caution, but it is generally not inconsistent with the other methods, and has been useful in raising issues.


The Review made fifty-six recommendations, based on the evidence presented in the report.

 You can access the full report on the Department of Education website using this link.

 The release of the Review and the NSW Government’s response was reported in The Guardian, The Sydney Morning Herald, The Australian and the Daily Telegraph.

Youth Frontiers evaluation released


April 2017

The findings of our evaluation of Youth Frontiers—a NSW-wide mentoring program—have been released by the Youth Participation & Strategy Unit in the NSW Department of Family and Community Services (FACS). Overall, the evaluation found the program had an impressive reach across NSW and was achieving positive developmental outcomes for young people. However, it is not making a direct contribution to increasing community participation, so there are some opportunities to strengthen the program’s design.

The report is being used to inform implementation of the program in 2017 and strategic decisions about the program mix that will support FACS' priorities for youth development in NSW. These priorities are set out in the NSW Strategic Plan for Children and Young People.

This report follows on from our evaluation of the initial pilot of the program, the findings of which were used to inform the ongoing roll-out of the program by the four funded providers delivering mentoring to students in Years 8 and 9.

ARTD is continuing to support successful implementation of the program through a monitoring system that includes post-program surveys to mentors and mentees and reports on these.

Planning with people with disability

March 2017

Want to get the voice of people with disability in policy planning or program evaluation, but unsure how to go about it?

In NSW, government agencies and local councils need to develop disability inclusion action plans (DIAPs) with people with disability. More broadly, as policy co-design catches on, government agencies need to make sure consultation processes are accessible and inclusive.

A commitment to inclusion and attention to access upfront and along the way will enable people with disability to shape policies that affect their lives and communities.

When our Principal Consultant, Jade Maloney, worked with Mosman Council on their DIAP, both organisations had a clear commitment to making the process and the final product accessible and inclusive. This meant using accessible venues and working with our partner, the Information Access Group, to produce an Easy Read consultation guide, as well as converting the final Mosman DIAP into Easy Read.

Easy Read presents information in a way that is easy to understand. In its simplest form, it uses images to support text, a large font size and plenty of white space to increase readability. It makes documents more accessible for people with intellectual disability, people with literacy issues and people who speak English as a second language.

“We’re committed to ‘nothing about us without us’”, Principal Consultant, Jade Maloney, says. “But we’re also conscious that it can be very difficult for people with disability to even get into community consultations, let alone share their views. So we need to make sure we provide information in accessible formats, use venues and formats that are accessible, give people time and different options to share their views.

“In recent years, we’ve worked with the Information Access Group on a range of policy development and evaluation projects. We’ve found the Easy Read versions have worked well to break down complex concepts and give people with intellectual disability the space to share their views.

“When we ran the public consultation sessions for the National Disability Insurance Scheme (NDIS) Quality and Safeguard Framework we also developed a process to make sure we’d covered off on all access requirements—from physical access for people with limited mobility, hearing loops and closed captioning for people with hearing impairments and Auslan interpreters for those who use sign language.

“We’re currently putting this experience into practice again, working with Fairfield City Council to develop their DIAP.”

You can see what actions Mosman Council is taking to increase inclusion and job opportunities for people with disability, and to make it easier for people to get around in their community, in their DIAP.

Devil in the detail: Why ethics processes for evaluation must be revisited

March 2017

Formal ethics approval processes designed to protect study participants are often causing their own unintended harms in the context of program evaluation.

This is the somewhat controversial message Principal Consultants Sue Leahy and Jade Maloney gave to a full Australasian Evaluation Society (NSW) forum, hot on the heels of their presentation at last year’s AES conference in Perth.

They don’t deny that ethical practice is critical to evaluation; rather, they question whether formal ethics approval processes are always needed, or whether they always result in an ethical evaluation.

“It’s our experience that the rules around ethics approval are often applied differently across government and that the requirement for external approval can mean the voice of the ’vulnerable’ people accessing the service is not heard in the evaluation because of the timeframes or because the consent process is overwhelming,” Jade says.

When it comes to consulting with Aboriginal communities in NSW, it gets even more difficult.

“While the principles around ensuring Aboriginal communities have ownership of the research in their communities are right,” Sue says, “we question the characterisation of all Aboriginal people as vulnerable. We also find the timeframes for approval through the Aboriginal Health and Medical Research Council can mean that consultation with Aboriginal communities is squeezed out of the evaluation timeframe.”

The main problem seems to be that ethics processes were set up to approve clinical trials, but evaluations are not interventions. That’s why ARTD is driving a discussion in the evaluation community about options to improve the process.

AES members at the forum recognised that ethics approval processes bring significant benefits to the evaluation process, but they agreed change is needed. Over the coming months, ARTD hopes to explore options with colleagues, and to work on guidelines to help governments identify when they need formal ethics approval processes.  

Capturing outcomes with Caretakers Cottage

February 2017

Through the Expert Advice Exchange (EAX), ARTD has worked with Caretakers Cottage to capture the outcomes of their transitional accommodation service, Options Youth Housing.

The EAX, an initiative of the NSW Government’s Office of Social Impact Investment, connects non-government organisations and social enterprises with leading legal and professional services firms, financial institutions and other companies to provide pro bono advice on growing and sustaining their impact.

When Caretakers Cottage first came to us, they said they needed some support with strategic planning. But after our Managing Principal Consultant Sue Leahy and Senior Consultant Alexandra Ellinson talked with them further and did a detailed SWOT analysis with the team, we realised that what they needed was to capture their outcomes. So we worked with them on a tool to capture client wellbeing in seven domains, from living skills and education to accommodation, family connections, health, safety and wellbeing. By applying the tool at entry, every three months during the service, and after leaving, caseworkers systematically collect information to assess and document changes. Caseworkers can also use the assessment as an opportunity to have supportive conversations with clients about their wellbeing, and to develop a shared understanding of their progress.
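As a rough illustration only (not Caretakers Cottage's actual tool), a repeated assessment across the seven domains could be structured along these lines; the 1–5 rating scale, the record layout and the function names are all assumptions for the sketch:

```python
# Hypothetical sketch of repeated wellbeing assessments across the seven
# domains named in the article. The 1-5 scale and data layout are assumed.

DOMAINS = ["living skills", "education", "accommodation",
           "family connections", "health", "safety", "wellbeing"]

def assessment(when, scores):
    """One assessment: a timepoint label plus a rating per domain."""
    assert set(scores) == set(DOMAINS), "every domain must be rated"
    return {"when": when, "scores": scores}

def change_since_entry(history):
    """Per-domain difference between the latest and the entry assessment."""
    entry, latest = history[0]["scores"], history[-1]["scores"]
    return {d: latest[d] - entry[d] for d in DOMAINS}

history = [
    assessment("entry",    {d: 2 for d in DOMAINS}),
    assessment("3 months", {d: 3 for d in DOMAINS}),
]
print(change_since_entry(history))  # every domain improved by 1
```

Applying the same structure at entry, each quarterly review and exit is what lets changes be documented consistently across caseworkers.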

Caretakers Cottage was concerned about not creating more work for the sake of doing more work. But they have found that the tool has helped them to better monitor their work and to focus on what the client needs. As well as increasing the staff’s capacity to consistently assess client wellbeing, it really energised the staff because they are now collecting evidence that shows the impact of their work. 

NDIS Quality and Safeguards Framework released

February 2017

The Framework sets out a nationally consistent approach to quality and safeguards for the NDIS to be introduced in 2018–19. This is needed to ensure that capability is built in the new market-based system, the rights of people with disability are upheld, the benefits of the NDIS are realised, and that people have access to the same safeguards no matter where they live.

The Framework is aligned with the NDIS principle of choice and control and takes a risk based approach. It consists of developmental, preventative and corrective measures targeted at individuals, the workforce and providers. Key components are:

  • an NDIS Registrar to manage the NDIS practice standards and certification scheme, establish policy settings for worker screening, register providers, oversee provider compliance, and monitor the effectiveness of the NDIS market
  • an NDIS Complaints Commissioner to facilitate the resolution of complaints about providers of NDIS-funded supports and investigate serious incident reports and potential breaches of the NDIS code of conduct
  • a Senior Practitioner to oversee approved behaviour support practitioners and providers, provide best practice advice, review the use of restrictive practices and follow up on serious incidents related to unmet behaviour support needs.

ARTD worked with DSS, State and Territory Governments and the NDIA to develop the framework. The voice of people with disability, their family members and carers, service providers, advocacy groups and representatives of professional organisations captured through public consultations and submissions informed the framework design.

You can access the framework here. You can also see our report on the consultations here.

Governments have indicated there will also be further opportunities to contribute to the Framework in the design and implementation phases.

Broadening your evidence cost-effectively

February 2017

ARTD Director, Andrew Hawkins, is speaking at the Strengthening Evidence-Based Policy: Improving program outcomes and success conference in Canberra on 21 March 2017. He joins other thought leaders in policy, research and evaluation to discuss how sound evidence can be used to improve policy.

Andrew’s presentation will draw implications from the academic research to meet the needs of policy makers for cost-effective monitoring and evaluation of interventions into complex systems. He will challenge the audience to move away from a set of assumptions about measuring outcomes that are useful when evaluating relatively simple things—like fertilisers, drugs and bridges—towards methods that are more appropriate when dealing with new or developing interventions in very complex systems. Cutting through the complexity, he will provide a step-by-step guide for monitoring and evaluation investment decisions that are appropriate to the nature of the intervention, the questions being asked, and the time and resources available.

Early bird registrations for the two-day conference close on 10 February.

Getting serious about LEGO

December 2016

Ever left a workshop feeling that you didn’t get a chance to contribute, and that you couldn’t say what it was all about if someone asked you the next day? Bored with lectures, post-it notes and role plays?

At ARTD we’re serious about creative facilitation and policy co-design. So we’re using the LEGO Serious Play method, which is based on research that shows tactile learning produces more meaningful understanding, opens up possibilities and enables dialogue.

We put the method to the test with our own staff, asking: what would your relationship with your coach look like in your ideal world? After hours of play, laughter and serious discussion, the sceptics were converted.

We found out that staff want a coach who:

  • shows a genuine interest in their wellbeing – not just at work
  • can advocate for them with managers to help them manage their workload
  • is flexible and responsive as their needs change
  • recognises that they also have something to learn from their coachee
  • supports them to take on new challenges and avoid common pitfalls.

We never would have learned all this by white-boarding ideas or asking staff to fill in a survey. Playing really did help to create an honest dialogue and identify new ideas.

Stories of Lived Experience launch


August 2016

Our evaluation report on Weave Youth & Community Services was launched at a community celebration in Waterloo on Tuesday 30 August 2016. The evaluation was part of a larger Stories of Lived Experience project, which also included photographic portraits and a short documentary to capture the faces and stories of Weave’s clients and community.

The evaluation was designed to understand the features of Weave’s support models that make a difference to clients, focusing on intergenerational connections and continuity of care, and to use clients’ expert knowledge of their life experiences to contribute to the mental health sector’s development and good practice. The report celebrates the stories of Weave’s clients and community, and makes recommendations that Weave will use to inform their future directions. It also identifies lessons for the wider community sector, especially for services working with Aboriginal people and communities.

The report draws on interviews with over 50 Weave clients, as well as consultations with Weave staff and analysis of service data. To support a culturally sensitive approach that builds capacity in the local community, we worked with Weave and three local Aboriginal organisations—Tribal Warrior Association, South Sydney Aboriginal Corporation Resources Centre, and the Aboriginal Medical Service in Redfern—to deliver the project under ethics approval from the Aboriginal Health & Medical Research Council.

ARTD undertook this project as part of our corporate social responsibility to strengthen the non-government sector. The project was also supported by a grant that Weave received from Inner West Partners in Recovery.

Energy Efficiency evaluation published

July 2016

Our 2015 report on measuring energy savings from the NSW Office of Environment and Heritage (OEH) 2007–13 energy efficiency programs has been published on their revamped website. The report was part of a four-year evaluation support project and follows on from the 2012 report. The new report synthesises savings from activities undertaken in 2013 and 2014 to address the lack of reliable evidence of energy savings from energy efficiency programs. It highlights the substantial achievement OEH has made in moving from savings based on engineering estimates (deemed savings) to verified energy savings based on ‘before-and-after’ analysis—consistent with international best practice measurement. Methods included billing data analysis and Measurement and Verification (M&V). The measurements demonstrated that the interventions in each of the programs achieved real and substantial energy savings and reductions in electricity bills at the sites studied. The capacity to generalise these findings to all sites in each sector varied with the samples of the measurement studies.

Survey findings on local government climate change adaptation published

July 2016

Local Government NSW and the NSW Office of Environment and Heritage (OEH) have published the results from ARTD’s survey of councils’ progress and needs in adapting to climate change. The online survey, conducted in June 2015, received 186 responses covering 74 per cent of all NSW councils. Local Government NSW presented the findings at the National Climate Adaptation conference on 7 July 2016. The survey, which followed two surveys conducted in 2010, showed councils’ capacity to assess and plan for climate change impacts had increased. In the 2015 survey, over two-thirds of respondents (69%) reported having experienced impacts from climate change in their Local Government Area. Councils welcomed any kind of support to prepare for and respond to climate change impacts, with increased interest in technical, high-quality information compared to 2010, and growing interest in tools, such as costing of adaptation actions.

The survey can also be accessed from the OEH Adapt NSW website: here or here.


Interested in evaluation in complex social systems?


June 2016

Check out Director Andrew Hawkins' latest article ‘Realist evaluation and randomised controlled trials for testing program theory in complex social systems’ in Evaluation.

Realist approaches to scientific evaluation tend to be strong on theory and explanation, but lack adequate tests or means of validating theory. Randomised controlled trials (RCTs) are often seen as the gold standard for research but, Andrew argues, while they are useful in some fields of science, they are less useful in complex social systems.

Andrew’s article focuses on the potential for randomisation and experimentation to provide evidence for transfactual (i.e. reusable or portable) context–mechanism–outcome configurations (CMOs) in complex adaptive systems. Realist RCTs are considered but rejected; instead a form of propensity score matching is proposed for testing realist program theory, estimating the effect size of a purported CMO, and generating scientific knowledge for developing more effective interventions into complex social systems.
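The article does not spell out the form of propensity score matching proposed, but the underlying idea can be sketched generically. In this hypothetical, stdlib-only illustration, each unit's propensity score is taken as given (in practice it would be estimated from observed covariates, for example with logistic regression), each treated unit is matched to its nearest-scoring control, and the average outcome difference across matched pairs estimates the effect on the treated:

```python
# Generic nearest-neighbour propensity score matching sketch (stdlib only).
# The "ps" values are assumed to be pre-estimated propensity scores; the
# data and field names are made up for illustration.

def match_controls(treated, controls):
    """Pair each treated unit with the control whose score is closest."""
    pairs = []
    for t in treated:
        best = min(controls, key=lambda c: abs(c["ps"] - t["ps"]))
        pairs.append((t, best))
    return pairs

def att(pairs):
    """Average treatment effect on the treated, from matched pairs."""
    diffs = [t["y"] - c["y"] for t, c in pairs]
    return sum(diffs) / len(diffs)

treated = [{"ps": 0.8, "y": 10}, {"ps": 0.6, "y": 8}]
controls = [{"ps": 0.75, "y": 7}, {"ps": 0.55, "y": 6}, {"ps": 0.2, "y": 3}]
print(att(match_controls(treated, controls)))  # (10-7 + 8-6)/2 = 2.5
```

A realist use of this machinery, as the article suggests, would estimate the effect size for a purported context–mechanism–outcome configuration rather than for a program as a whole.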

Ticket to Work for young people with disability

April 2016

Andrew Hawkins (ARTD) will present results from the Ticket to Work evaluation at the Disability at Work conference with Michelle Wakefield, the Ticket to Work program manager from National Disability Services (NDS). Ticket to Work is a national program that aims to improve the transition to employment for young people with disability. The study compared the employment and social participation outcomes of young people with intellectual disability who participated in the Ticket to Work program with the outcomes that might be expected for young people with intellectual disability generally, using data from the Household, Income and Labour Dynamics in Australia (HILDA) Survey and the Survey of Disability, Ageing and Carers (SDAC). While the sample sizes were too small to draw conclusions at this stage, the indicative data is very promising and provides a foundation for a more comprehensive analysis by the end of 2016. The study was part of a broader evaluation that also looked at the strength and integration of the networks that deliver Ticket to Work.

The conference is focused on developing employment options for people with disability, and is intended for employment service providers and professionals interested in promoting opportunities for people with disability. It will be held at the National Convention Centre in Canberra on 30–31 May 2016. For more information, see the conference program.

Evaluation of the Resilient Families Service

January 2016

In November 2013, ARTD was engaged by NSW Treasury to evaluate Resilient Families, an intensive support service delivered to families in greater Sydney by The Benevolent Society as part of the NSW Government’s Social Benefit Bond scheme.

The purpose of the evaluation is to assess the implementation and outcomes of the service over its first three years of operation from 2013, and to assess the appropriateness of the measures in place for the purpose of bond payments. Over the three-year study, the service has moved to a more established stage of implementation, providing a flexible service that is responsive to client needs. ARTD’s recommendations have focused on further development of the service to optimise outcomes for families. The Preliminary and Mid-term evaluation reports can now be viewed on the NSW Department of Premier and Cabinet’s website.

Literature Review published by the Department of Industry, Innovation and Science

December 2015

ARTD’s Review on choosing appropriate designs and methods for impact evaluation has now been published by the Australian Government’s Department of Industry, Innovation and Science.

This Review was conducted in conjunction with Professor Patricia Rogers of RMIT to provide the Department with advice on selecting impact evaluation methods. The review identifies a role for randomised controlled trials (RCTs) in making causal inferences about programme impacts—but one limited by assumptions and questions of external validity when used to evaluate public policy and programme interventions into complex social systems. Rather than any particular method, the review describes a focus for impact evaluation on understanding the nature of the programme or intervention, the purpose for which any specific evaluation is being conducted, and the resources available for generating reliable and valid evidence to inform decision making. The report outlines a large array of approaches and methods for monitoring and impact evaluation and has stimulated discussion within the Department about the most appropriate way to evaluate a number of flagship programmes. You can access the review from the Department of Industry, Innovation and Science's website.

NDIS quality and safeguards report published

November 2015

ARTD’s consultation report on the Quality and Safeguards Framework for the National Disability Insurance Scheme (NDIS) has now been published.

The report presents the perspectives of people with disability, their families, service providers and other stakeholders from across Australia. Earlier this year, ARTD worked with the Commonwealth Department of Social Services and State and Territory Government agencies to hold 16 public meetings and 7 provider meetings in capital cities and regional areas across the country. Meetings were in accessible venues, with Auslan interpreters, closed captioning and hearing loops available. Feedback was also collected through online questionnaires (585 completed) and formal submissions (220 received).

The findings of the consultation, the cost-benefit analysis, inquiries into abuse in the disability sector, and other relevant policy work will help inform decisions about the best options for the NDIS Quality and Safeguarding Framework. Based on this information, Commonwealth, State and Territory Governments will work together to prepare a Decision Regulation Impact Statement for consideration by Ministers in early 2016.

Being strategic about building evaluation capacity

October 2015

How does a large government department build its capacity for evaluation in a way that is strategic and makes the most of opportunities? And how can this be evaluated?

Chris Milne and Marita Merlene from ARTD, and Greg Bowen, who recently retired from the Department of Immigration and Border Protection, described how it can be done to a large audience at the Australasian Evaluation Society International Conference in Melbourne.

The Department has led the way among Commonwealth agencies in building its evaluation capacity since 2010. Greg described how they took an organisational development approach and worked on both the demand for and supply of evaluation. They made use of an opportunity to design and expand online and face-to-face training in evaluation, and supported program managers through mentoring and advice from evaluation experts. A lot of work went into developing data systems, including work with the Australian Bureau of Statistics.

Marita and Chris described how the evaluation of the strategy allowed for emerging opportunities and new barriers in a turbulent organisational and policy environment. They looked at the effectiveness of individual capacity building programs, then used a complexity model to design an initial organisational maturity matrix for evaluation that highlighted the intended outcomes and showed progress to date. In situations like this, the evaluation itself was part of the management of organisational change.

You can access the slides from their presentation here.

Working with Weave

October 2015

We have been working in partnership with Weave Youth & Community Services to support them to evaluate their services over the last few years. Weave is a non-profit organisation that supports disadvantaged and vulnerable young women, children and families in the City of Sydney and South Sydney areas. Over 70 per cent of their clients are Aboriginal.

Our current evaluation aims to document and share learnings from Weave’s therapeutic relationships and models of support, which are designed to improve outcomes for people with mental illness. Consultations with Aboriginal and non-Aboriginal clients and staff will help us build an understanding of what is most useful about Weave’s services, and what the sector might learn from Weave’s approach.  

The evaluation is part of the wider ‘Stories of Lived Experience’ project for which Weave received a grant from the Inner West Sydney Partners in Recovery Program. ARTD is contributing in-kind resources to support the project. The evaluation will report to Weave in May 2016.

You can find more information about Weave’s services on their website.

Aboriginal Women Leaving Custody Strategy

October 2015

ARTD has been working with the Department of Family and Community Services and Baabayn Aboriginal Corporation to evaluate the Aboriginal Women Leaving Custody Strategy.  The Strategy was developed to inform policy and procedural changes to reduce homelessness that follows from incarceration, and to inform an effective service model that addresses the support needs of Aboriginal women in custody on short-notice release.  

Sue Leahy and Alexandra Ellinson met with the Aboriginal reference group, which includes the Baabayn Aboriginal Corporation, on the 15th of September to gain an Aboriginal community perspective on the evaluation findings. This approach is part of our commitment to working in partnership with Aboriginal communities in evaluation and research projects and feeding back findings to communities. The discussion provided significant insights into the context for the findings and helped to identify learnings and recommendations.

Baabayn Aboriginal Corporation was founded by five Aboriginal elders from Western Sydney. Their purpose is to connect with individuals and families in a welcoming environment, providing them supports and links to services that help them heal from the past and nurture their sense of confidence and pride in the future. The group has strong knowledge of the community and has built contacts within and outside the community. You can find out more about the corporation using this link.

Knockout Health Challenge evaluation published

September 2015

Our evaluation of the NSW 2013 Knockout Health Challenge has been published by NSW Health. The Knockout Health Challenge aims to address overweight and obesity in Aboriginal communities through community-based teams competing in a 16-week weight-loss challenge that combines group and individual exercise with healthy lifestyle information. This is followed by a less intensive maintenance phase.

We interviewed team managers and the state implementation team to explore the effectiveness of implementation and identify areas for improvement for future challenges. To assess outcomes, we analysed data on weight, diet and physical activity at four points in time: before and after the program, and five and nine months after the program.

The evaluation found that the Challenge was highly valued by participants and team managers, and was feasible to implement in most contexts. While participant weight loss varied between individuals and stages in the program, overall, participants’ average weight 9 months after the Challenge was significantly lower than at the start of the Challenge. Statistically significant changes were also seen in participant physical activity and daily fruit consumption over the same time period. The participants most likely to lose weight during the Challenge were women, those with the highest starting weights, and those who most frequently participated in team training.
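As a hedged illustration of the kind of before-and-after comparison behind such a finding (the weights below are made up, not the evaluation's data, and a real analysis would also handle attrition and compute a proper p-value), a paired t-statistic on start versus nine-month weights can be calculated like this:

```python
# Paired t-statistic for mean within-person change (after - before).
# Illustrative data only; for real work use e.g. scipy.stats.ttest_rel.
import math

def paired_t(before, after):
    """t-statistic for the mean of the paired differences."""
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

start  = [98.0, 105.5, 120.3, 88.7, 110.2]   # kg at the start of the Challenge
month9 = [95.1, 101.0, 112.8, 87.9, 104.5]   # kg nine months later
t = paired_t(start, month9)
print(round(t, 2))  # a large negative t indicates average weight fell
```

The paired design matters: each participant is compared with themselves, so stable between-person differences in starting weight do not inflate the variance of the change being tested.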

The full report can be accessed on the NSW Health website.

NSW Government Evaluation Conference

September 2015

Principal Consultant Andrew Hawkins has been invited to share his experience with building evaluation capacity at the NSW Government Evaluation Conference, organised by the NSW Department of Premier and Cabinet. On 25 September 2015, Andrew will join representatives from NSW government agencies on the panel ‘building evaluation capacity: a view from across the sector and beyond’. Drawing on his experience, Andrew will talk about the need for a range of approaches to build the supply and demand for high quality evaluation in a public sector agency. Evaluation mentoring can provide a cost-effective means of getting internal evaluation teams on track, engaging staff with data and, ultimately, producing an evaluation that is geared towards informing public policy and represents value for money. Mentoring can help teams to develop a program logic for an intervention, identify questions of strategic importance to their agency, locate existing data, develop methods for collecting new information and decide on data analysis, as well as provide quality assurance. For more information about the conference see the NSW Department of Premier and Cabinet website.

AES Conference


September 2015

We continued our long tradition of sponsoring the Australasian Evaluation Society Conference, this year sponsoring the keynote from Marlene Laubli Loud. Marlene is a Swiss evaluator who has worked with the Swiss Federal Office of Public Health, the European Commission, the World Health Organisation and the United Nations Evaluation Group. She described the range of challenges that internal evaluators can face, such as working across organisational ‘silos’; managing the interplay between the organisation, its stakeholders and broader culture; and managing the tension between independence and ownership. Her strategies for overcoming these challenges are having an institutional vision and support; addressing power imbalances; developing individual and organisational capabilities; and having knowledge connectors and entrepreneurship.

Over the last few years, our senior staff have increasingly been working with internal evaluation managers and teams to build capacity for evaluation, and our Director Chris Milne and Senior Consultant Marita Merlene presented the learnings from this work with one of our clients.

Report on ACT Specialist Homelessness Service system reforms published

August 2015

Our evaluation of the ACT Specialist Homelessness Service system reforms has been published by the ACT Government. The reforms, implemented between 2009 and 2014, included shifting the system from a crisis accommodation response to a ‘support in place’ model and introducing a central point to streamline access to services and housing. To examine the system, service delivery and sector outcomes, we analysed service user data and provider reports, surveyed specialist homelessness services and partner organisations, and developed five case studies to explore key issues in depth. These were based on interviews and document analysis.

The evaluation found that the sector has provided more services, particularly non-accommodation supports, and has achieved better non-housing related outcomes for service users. However, exits into unstable housing situations (no tenure/marginal renter) increased as ACT specialist homelessness services faced an increase in the proportion of service users entering the system in poor housing situations over the reform period. While there was some initial resistance to the centralised intake service, indications are that it is leading to a more equitable and more efficient service system. The findings have been presented to the sector and are informing discussions about future directions.

How do you judge evidence in evaluation?

July 2015

Want to learn more about judging ‘good’ from ‘bad’ evidence in evaluation? ARTD Principal Consultant Andrew Hawkins, George Argyrous from ANZSOG and Karen Fisher from the SPRC will discuss the hierarchies that have been proposed to assess evaluation evidence and the pitfalls of these at a free workshop in Sydney from 3.30–5pm on July 14. To find out more or to register for the event go to the AES website.

Evaluation strategy for Murray-Darling project

May 2015

The Murray-Darling Basin Environmental Water Knowledge and Research project is a five-year collaborative research project to improve the knowledge base for managing environmental water in the Basin. ARTD designed the evaluation strategy for the project on behalf of the Murray-Darling Freshwater Research Centre, which is coordinating the project for the Australian Government Department of the Environment. The project involved two rounds of stakeholder consultation to develop the evaluation strategy, including a program logic and detailed evaluation work plan for the two phases of the project.

Guide for services working with children and young people with disability

May 2015

ARTD worked with ADHC and a reference group of representatives from service providers and peak bodies to develop an addendum to the Standards in Action manual to guide services working with children and young people with disability and their families. The addendum provides tips and practice examples to help Family and Community Services (ADHC) operated and funded services working with children aged less than 16 years and young people aged between 16 and 18 years meet the NSW Disability Services Standards and the principles of the Disability Inclusion Act 2014. It also outlines principles for working effectively with children, young people and their families that specialist, mainstream and community-based services can use to guide their practice. You can access the Addendum from the ADHC website.

Our tips on partnership on the American Evaluation Association blog


April 2015

The American Evaluation Association (AEA365) has published a blog post about evaluating partnerships by our Senior Consultant Florent Gomez-Bonnet and Principal Consultant Dr Margaret Thomas. In this blog they present an innovative approach to evaluating partnerships, which combines three methods—a partnership survey, integration measure and Social Network Analysis—to get a more complete picture of a partnership. More detail about their approach can be found in their peer-reviewed article in the Evaluation Journal of Australasia and their presentation from the 2014 Australasian Evaluation Society Conference in Darwin.

Andrew Hawkins appointed Evaluation Fellow

March 2015

Principal Consultant Andrew Hawkins has been appointed as an Evaluation Fellow at the Centre for Program Evaluation, Melbourne Graduate School of Education, University of Melbourne. He is currently mentoring a student completing their capstone project for their Master of Evaluation degree. 

Approach to partnership assessment published in peer-reviewed journal


March 2015

Check out the March issue of the Evaluation Journal of Australasia for a peer-reviewed article about evaluating partnerships from our Senior Consultant Florent Gomez-Bonnet and Principal Consultant Dr Margaret Thomas. They describe an innovative approach to evaluating partnerships, an increasingly important mode of delivering public services. It combines three methods to get a more complete picture of a partnership, with each method capturing data at different levels: 

  • a partnership survey (adapted from the Nuffield Partnership Assessment Tool) at the overall level
  • an integration measure (based on the Human Service Integration Measure developed by Brown and colleagues in Canada) between organisations
  • a Social Network Analysis (using UCINET) between individuals.

The diagram represents the underlying conceptual framework.
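As a toy illustration of the individual-level layer, the basic quantities a Social Network Analysis reports—such as network density and degree—can be computed from a simple list of ties. The names and ties below are invented for illustration; the article itself used UCINET for this analysis.

```python
def density(nodes, edges):
    """Density of an undirected network: observed ties / possible ties."""
    possible = len(nodes) * (len(nodes) - 1) / 2
    return len(edges) / possible if possible else 0.0

def degree_centrality(nodes, edges):
    """Number of ties each individual has: a basic SNA measure."""
    degree = {n: 0 for n in nodes}
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    return degree

# Hypothetical ties between individuals across partner organisations
nodes = ["anna", "ben", "carla", "dev", "ella"]
edges = [("anna", "ben"), ("anna", "carla"), ("ben", "carla"), ("dev", "ella")]

print(density(nodes, edges))            # 4 ties of 10 possible -> 0.4
print(degree_centrality(nodes, edges))  # anna, ben and carla each have 2 ties
```

A low density or isolated clusters at this level can then be read alongside the organisation-level integration measure and the overall partnership survey.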

Australasian Evaluation Society members can download the full article here, non-members here. The article follows Florent and Margaret’s successful presentation at the 2014 AES Conference in Darwin. (If you missed it, you can see their slides on the AES website). 

ARTD sponsors Think Outcomes

November 2014

ARTD is sponsoring this two-day forum to bring together analysts, researchers, academics, strategic leaders and funders of social impact initiatives from government agencies, non-government organisations and the private sector to explore, learn and develop an action plan for the future of social outcomes measurement in Australia. This will help to ensure that the $250 billion spent on social purpose programs, interventions and policy in Australia each year delivers real value for people dealing with disadvantage. With our role in supporting outcomes measurement for government-funded programs across a range of policy sectors, we’re keen to support this collaborative approach to making sure evaluation, monitoring and research can best inform program design, targeting and implementation to achieve intended outcomes.

The conference, on November 20–21 in Sydney, is being organised by The Centre for Social Impact, the Australian Research Alliance for Children and Youth and the Social Impact Measurement Network Australia. You can access the full program here.

Evaluation of student survey trial published

October 2014

ARTD’s formative evaluation of the Tell Them From Me student survey trial is now available on the NSW Centre for Education Statistics and Evaluation website. The survey provides schools with an annual snapshot of their students’ self-reported social and intellectual engagement at school, which can inform annual strategic planning. The Department can also use the data to explore how student engagement and wellbeing is linked to performance.

The evaluation drew on a survey of school principals and staff responsible for implementing the student survey, and interviews with principals, survey coordinators, teachers and a sample of junior and senior students from five schools.

ARTD wins Best Public Sector Evaluation Award

September 2014

ARTD in conjunction with the Australian Department of Foreign Affairs and Trade has won the Australasian Evaluation Society’s Best Public Sector Evaluation Award for 2014. The award recognises public sector evaluations that have been used to effect real and measurable change in policies or programs. In the evaluation of the Australian Volunteers for International Development (AVID) program, ARTD’s Andrew Hawkins, Emily Verstege, Chris Milne and Ofir Thaler worked in partnership with the Department’s Office of Development Effectiveness (ODE) and other stakeholders to deliver a high-quality and useful evaluation with clear and actionable recommendations. ODE undertook extensive stakeholder consultation at all stages, including subjecting the evaluation to multiple rounds of peer review and public debate. This award follows the endorsement of the evaluation by ODE’s Independent Evaluation Committee for its strong data analysis. The full report and management response to the recommendations are available on the ODE website.

ARTD at AES conference

September 2014

ARTD was a major sponsor of the Australasian Evaluation Society’s 2014 conference. Nearly 350 delegates, including five of ARTD’s consultants, converged in Darwin between 8 and 12 September to explore ways to unleash the power of evaluation. There was a strong emphasis on realist and Indigenous evaluation. Keynote speakers included Professor Jean King, Professor Per Mickwitz, Professor Steve Larkin and Assistant Professor Peter Mataira.

Two of our staff, Dr Margaret Thomas and Florent Gomez-Bonnet, presented on methods to assess the effectiveness of partnerships, based on several recent projects. They combined three quantitative methods—a partnership survey, an integration measure and social network analysis—which cover the various dimensions of a partnership: the overall partnership arrangements, what is shared between specific organisations, and individual interactions. This multi-method approach generates more robust and comprehensive findings. You can access their presentation here.

Andrew Hawkins also facilitated a roundtable on open evaluation and peer review. Participants discussed the need and processes for more efficient, rigorous, scientific and democratic or open evaluation. The roundtable elicited widely divergent views from prominent evaluators on the need to ensure better access to evaluation reports, conduct peer reviews of evaluation quality, and synthesise knowledge about intervention types from multiple evaluations.

Article on the case for realist evaluation published in peer-reviewed journal

September 2014

Learning Communities: International Journal of Learning in Social Contexts has published Andrew Hawkins’ article on the case for experimental design in realist evaluation. It argues for the use of experimental approaches to test realist theory and estimate effect sizes. This meets the needs of policy makers who are sympathetic to realist approaches to evaluation but would ordinarily seek a randomised controlled trial to measure outcomes. The article demonstrates how the approach can work in practice, using ARTD’s evaluation of a youth mentoring program as a case example. The journal is open access and the Special Issue: Evaluation from September 2014 can be found here. You can also download an individual copy of Andrew's article here.

ARTD develops Standard Client Outcome Reporting

September 2014

Shifting the focus of performance measurement from outputs to outcomes is a common aim among government agencies. But outcomes are harder to measure than outputs, particularly if funded services are using different tools to collect and record outcomes data. Our recent work with the Commonwealth Department of Social Services (DSS) on a streamlined approach to programme performance reporting included developing a Standard Client Outcome Reporting (SCORE) methodology. This makes it possible for a range of services to collect outcomes data in the way that best suits their unique contexts, while providing it to government funders in a consistent format. Services use a standard approach to translate the data they collect into a five-point rating scale for agreed client outcome domains. You can find more information about how DSS is using SCORE in their new Data Exchange Framework here.
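The core SCORE idea—translating whatever each service locally collects onto a common five-point scale per outcome domain—could be sketched like this. The thresholds and the 0–100 wellbeing instrument below are illustrative assumptions, not DSS’s actual mapping rules.

```python
def to_score(value, thresholds):
    """Map a raw outcome measure onto a 1-5 rating using four ascending
    domain cut-points: below the first cut-point maps to 1, at or above
    the last maps to 5. Cut-points here are illustrative only."""
    score = 1
    for cut in thresholds:
        if value >= cut:
            score += 1
    return score

# Hypothetical: a service measures client wellbeing on a 0-100 instrument
# and agrees these cut-points with the funder for its wellbeing domain.
wellbeing_cuts = [20, 40, 60, 80]

print(to_score(15, wellbeing_cuts))  # -> 1
print(to_score(55, wellbeing_cuts))  # -> 3
print(to_score(85, wellbeing_cuts))  # -> 5
```

Each service keeps its own instrument and only the agreed translation differs, so the funder receives comparable 1–5 ratings across the whole program.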

Get better value from evaluation webinar

September 2014

Good evaluation can have a big impact, but it is not easy to get right. Andrew Hawkins will provide ten practical pointers to help you maximise the value of your organisation’s monitoring and evaluation expenditure in a webinar for the American Evaluation Association on September 4, 2014. The tips are drawn from ARTD’s 25 years of experience in evaluating government policies and programs.

ARTD evaluations on OEH website

September 2014

NSW Treasury's Centre for Program Evaluation recently commended the Office of Environment and Heritage on their Energy efficiency program evaluation page, noting that publishing reports online is consistent with the NSW Government Evaluation Framework. The page includes two evaluations conducted by ARTD. The first is an overall evaluation of the NSW Energy Efficiency programs to June 2012. The second is an interim evaluation of Home Power Savings Program (HPSP), one of the programs established under the former Energy Efficiency Strategy.

Evaluation of Proud Schools Pilot published

August 2014

ARTD’s comprehensive formative evaluation of Stage 2 of the Proud Schools Pilot is now available on the NSW Department of Education and Communities’ website. The purpose of the Pilot was to develop a framework to support secondary schools to address homophobia, transphobia and heterosexism, and provide a safe and supportive environment for same sex attracted and gender questioning students. Research showed these young people have too often been subjected to bullying at school because of their sexual orientation. The evaluation of the Pilot in 12 schools found that it is feasible for schools to make positive changes in the school climate towards same sex attracted and gender questioning young people in the short term. Using a whole-of-school approach was more effective than single strategies. You can find the full report here.

ARTD work with DSS to develop a streamlined approach to programme performance reporting

August 2014

The Commonwealth Department of Social Services’ (DSS) new Data Exchange Framework streamlines the Department’s services into seven broad programs with simpler reporting requirements. ARTD Director, Michael Brooks, and Principal Consultant, Klas Johansson, supported DSS to develop the new approach. Funded services now report against a small set of priority requirements about their clients and the services they are providing. They can choose to report an extended data set, including the reason for seeking assistance, referrals (in and out), housing composition, income status, programme-specific data items, and outcomes data using Standard Client Outcomes Reporting (SCORE). A new IT system supports the transfer of data between funded services, and DSS provides regular reports back to services. These reports include not only the extended data set submitted by funded services, but also draw on government data, population data and client surveys. You can find out more about the DSS Data Exchange Framework here.

ARTD sponsors IPAA NSW State conference

July 2014

IPAA’s 2014 State Conference tackled how best to transform the public service to deliver more effective and citizen-centred services. ARTD sponsored this important opportunity for public sector staff and stakeholders to come together to discuss the future.

Evaluation will have a key role in informing public sector reforms, by providing agencies with a better understanding of what is working well and where improvements can be made. Senior staff from ARTD were available at key times throughout the day to listen to conference attendees’ questions about evaluation and provide advice. 

For more about the conference see the IPAA website.

ARTD at world's first behavioural insights conference for public policy

June 2014

Sydney was host to Behavioural Exchange 2014, the world’s first global public policy behavioural insights conference on June 2–3. Experts from around the world—including co-authors of Nudge, Professors Cass Sunstein and Richard Thaler, and head of the UK Behavioural Insights Team, Dr David Halpern—shared practical examples from the field. Behavioural insights or ‘nudging’, which brings together the learnings from cognitive psychology and behavioural economics to make it easier for people to make the ‘right’ choices, has been embraced by governments in the UK and the US and now in NSW.

ARTD senior staff Andrew Hawkins and Florent Gomez-Bonnet participated in the conference. They are currently working with the NSW Office of Environment and Heritage to evaluate behaviour change intervention trials for the NSW Home Power Savings Program, using a randomised controlled trial to measure the effectiveness of the different ways to promote reduced energy consumption (loss aversion, social norms and commitment).

One of the key messages from the conference was the importance of measuring outcomes (the Test, Learn, Adapt cycle), in particular through randomised controlled trials. There is also a need to build on existing successes with ‘low-hanging fruit’ (like increasing tax payment rates through simplified forms) and extend the approach to more challenging areas of social policy. Clearly, behavioural insights will be a growing focus in the spheres of policy and evaluation in the coming years.

Lisa Charet wins ARTD-sponsored IPAA Collaboration Award

June 2014

ARTD has sponsored the NSW Institute of Public Administration Australia’s (IPAA) Collaboration Award for the third year in a row. The award recognises individuals who have established effective joint working arrangements, can clearly identify and communicate the benefits of a collaborative approach, and have delivered improved, coordinated and efficient services.

Our Principal Consultant Sue Leahy presented the award to the 2014 winner, Lisa Charet from Community Services in the NSW Department of Family and Community Services, at a gala awards ceremony on Thursday May 29. Lisa was recognised for her development of innovative, person-centred service delivery and improved outcomes for vulnerable children in need of care and protection. ARTD congratulates Lisa and all the winners and finalists in the 2014 IPAA NSW Awards for Individual Excellence.

Principal Consultant interviewed by Scottish public health body

April 2014

ARTD’s Principal Consultant for Health, Dr Margaret Thomas, visited Edinburgh recently and was interviewed by the Scottish Collaboration for Public Health Research and Policy about her experiences in public health evaluation in Australia. The interview is part of a series of Q&A sessions with leading public health figures, called 'SCPHRP Meets...'. You can watch the interview here.

Energy efficiency evaluations published

February 2014

Two ARTD evaluations of energy efficiency programs have been published by the NSW Office of Environment and Heritage (OEH). The first is an overall evaluation of the NSW Energy Efficiency programs to June 2012. ARTD worked in partnership with OEH to assess the effectiveness of the programs under the former Energy Efficiency Strategy, with a focus on the efforts made to produce reliable measures of energy savings.

The second is an interim evaluation of the Home Power Savings Program (HPSP), one of the programs established under the former Energy Efficiency Strategy. The HPSP had the aim of helping 200,000 low-income households reduce their power usage and save on energy bills between July 2010 and June 2014, by providing participants with a home assessment, an action plan and a kit of energy efficient items. Our interim evaluation, which included an analysis of participant data, a detailed local case study and a cost-effectiveness assessment, informed further development of the program. We are currently working with OEH on an evaluation of the behaviour change component of the program, introduced in June 2013 to test behaviour change intervention methods that could support better outcomes for participants.

AVID evaluation published

January 2014

ARTD's evaluation of the Australian Volunteers for International Development (AVID) program has been published by the Department of Foreign Affairs and Trade’s (DFAT’s) Office of Development Effectiveness (ODE). The evaluation found that the program’s volunteers contribute to the capacity of their host organisations, develop people-to-people links and generate goodwill for domestic and foreign diplomacy. Overall, AVID is making an effective contribution to the Australian Government’s development and public diplomacy objectives. However, the program’s efficiency and effectiveness could be improved through a number of actions. See the full report and management response to the recommendations on the ODE website.

ARTD evaluation commended by Independent Committee

November 2013

ARTD's evaluation of the Australian Volunteers for International Development (AVID) program has been endorsed by the Independent Evaluation Committee for Australian Aid. The Committee noted the report was thoughtful and based on strong analysis. ARTD was commissioned to evaluate this program, which places volunteers with organisations in 37 countries, to enhance its effectiveness.

Evaluation of MH-CoPES published

November 2013

ARTD’s evaluation of the Mental Health Consumer Perceptions and Experiences of Services (MH-CoPES) Framework has been published on the NSW Consumer Advisory Group Inc (NSW CAG) website. The Framework has four steps—data collection, data analysis, reporting and feedback, action and change—that NSW mental health services are expected to implement in repeated cycles. NSW CAG was funded to develop the Framework to provide a way of ensuring mental health consumer perspectives inform service quality improvement. Consumer involvement in evaluation is a requirement of the National Standards for Mental Health Services, but previous research shows a number of barriers to this, including staff attitudes toward consumer participation, lack of clarity about how to engage consumers and support to do so, and fear that consumer participation will lead to unrealistic expectations. Consumers can also be reluctant to engage in evaluation processes because of fear of repercussions, concerns about maintaining their privacy and confidentiality, and prior experience of tokenistic consultation.

Our evaluation of the MH-CoPES Framework drew on a literature scan, a survey of all services, and consultation with service managers, staff, consumer workers and consumers. Key findings were that, following significant investment in the Framework by the NSW Ministry of Health and strong support from senior management, most mental health services in NSW had begun implementing the Framework. However, they had encountered a number of barriers in doing so. After the initial two-year implementation period, only a minority of services had completed all four steps. But positive experiences among those that had done so suggest the Framework has the potential to support consumer participation and feed into quality improvement if some adjustments can be made to the Framework, consumer questionnaires, and structural supports.

ARTD to look at the HomeStay Support Initiative

October 2013

ARTD is now evaluating the HomeStay Support Initiative on behalf of the Queensland Department of Housing and Public Works. This initiative is an early and post-crisis intervention service that aims to support vulnerable people—particularly single adults, families and older people—to address issues threatening their ability to maintain a tenancy. It provides a case management approach and links clients to relevant mainstream and specialist information and support services within their local community. The evaluation will explore how the HomeStay service has been established and is delivering support across sixteen sites, with a focus on four sites for case studies. It is funded under the National Partnership Agreement on Homelessness.

Findings on the impact of income and education on wellbeing

September 2013

ARTD consultant Dr Ioana Ramia presented her research findings about the impact of income and education on subjective wellbeing at the Australian Social Policy Conference 2013. You might assume wellbeing would be associated with the material satisfaction that comes with higher educational achievement, better job outcomes and higher income, but the research suggests otherwise. Researchers have often found that people with no tertiary education are happier or more satisfied with their lives than those with a tertiary education.

Ioana’s research, conducted for her doctorate through the University of NSW, used data from the 2010 Household Income and Labour Dynamics (HILDA) Survey to look at this issue in Australia. She found that the impact income has on subjective wellbeing differs between those with and without a tertiary education, but the results vary with the measure of income used (individual or household).

When asked about wellbeing, those with a tertiary education think more of their homes and their free time, while those without a tertiary education think more about their salary, job prospects and neighbourhood. While they're answering the same question, they're assessing different things. For those without a tertiary education, happiness was correlated with how much money they had, but only up to a point: beyond the median income for this group, more money does not increase happiness.

This story was picked up in The Australian, The Daily Telegraph and the Herald Sun.

Toolkit to support better evaluation in NSW

August 2013

The NSW Government launched its new Evaluation Toolkit, which supports NSW government agencies to undertake program evaluation projects, on 21 August 2013 at the Why Evaluation Matters: Basing Decisions on Evidence Forum in Sydney. ARTD’s Chris Milne and Sue Leahy worked with Professor Patricia Rogers of RMIT University to develop the toolkit for NSW public sector managers in line with the NSW Government Evaluation Framework. It takes managers who are responsible for a program evaluation through seven steps—from initial conception, through commissioning and managing, to disseminating and supporting use of the findings. The Toolkit will be refined over time based on user feedback.

High-profile evaluation of behaviour change trials

August 2013

ARTD is now evaluating the behaviour change component of the NSW Government’s Home Power Saving Program (HPSP). This program, which started in July 2010, was established to help 200,000 low-income households reduce their power usage and save on energy bills by June 2014. It includes three main components provided at no cost to participants: 1) a home power assessment, 2) a tailored action plan that identifies ways the household can save power, and 3) a kit of energy efficient items (including a standby saver power board, light bulbs and other small items and a water-saving showerhead if needed). The $63 million program is managed by the Office of Environment and Heritage and delivered across the state by about 100 contracted energy experts.

Based on recommendations from ARTD’s interim evaluation of the program in 2012, OEH introduced a new version of the program in June 2013 to test behaviour change intervention methods that could support greater outcomes for participants. OEH designed the behaviour change trials with BehaviourWorks at Monash University and the NSW Department of Premier and Cabinet’s Behavioural Insights Team (seconded from the UK Cabinet Office). The trials are using three key triggers identified in the behaviour change literature—commitment, social norms and loss aversion—to try to improve participants’ implementation of tips in their power savings action plans. From June 2013, each new participant is being randomly allocated to one of the three trial groups—which receive different types of follow-up support aligned with one of the behaviour change triggers—or a control group—which receives the program as usual.
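The program's exact allocation procedure isn't published here, but the randomisation step could be sketched as simple random assignment of each new participant to one of the three trial arms or the control group. The participant IDs and the equal-probability assignment below are assumptions for illustration.

```python
import random

# Three trial arms aligned with the behaviour change triggers, plus control
ARMS = ["commitment", "social_norms", "loss_aversion", "control"]

def allocate(participant_ids, seed=None):
    """Randomly assign each new participant to one trial arm or control.

    A fixed seed makes the allocation reproducible for auditing; in a live
    trial each enrolment would be assigned as it arrives.
    """
    rng = random.Random(seed)
    return {pid: rng.choice(ARMS) for pid in participant_ids}

# Hypothetical household IDs enrolling in the same week
allocation = allocate(["hh-001", "hh-002", "hh-003", "hh-004"], seed=42)
print(allocation)
```

Because assignment is random, differences in later energy use between arms can be attributed to the follow-up support each arm received rather than to who enrolled.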

ARTD’s evaluation of the trial, being led by Andrew Hawkins and Florent Gomez-Bonnet, is using a randomised controlled trial (RCT) to test the impact of each type of intervention on behaviour change and energy savings (measured through analysis of external billing data), combined with a mixed-methods design, based on the Theory of Planned Behaviour, to try to understand observed behaviour. Together, these approaches will provide robust evidence about the value of the program and evidence to support refinements to further promote energy efficient behaviour among low-income households.

You can find out more about the work of the UK’s Behavioural Insights Team, or Nudge Unit, here. For more detailed information on the approach, check out Richard Thaler and Cass Sunstein’s book Nudge or Daniel Kahneman’s Thinking, Fast and Slow.

Findings on the results of 2011 Smarter Schools National Partnership Cross-sectoral Impact Survey published

July 2013

The NSW Centre for Education Statistics and Evaluation (CESE) has released the findings of ARTD's analysis of the 2011 Cross-sectoral Impact Survey (CSIS). This survey captured responses from 662 of the 936 NSW schools participating in a Smarter Schools National Partnership (SSNP). In total, 4,376 individuals completed a survey: 393 principals, 1,331 executives and 2,652 teachers. This is the first in a series of surveys being administered annually up to 2017.

The 2011 CSIS provides a snapshot of the extent of change in key education practices achieved in Smarter Schools National Partnership schools by September 2011. The CSIS documents the extent of reform at the individual staff member, school and system level and the sustainability of the reforms, through each round of surveys. The information being collected is both relative and retrospective—the survey asks respondents to compare education practices in schools and classrooms prior to participating in the SSNP with where they are now. This allows the survey to account for different subjective starting points and to ask about the added value of being involved in an SSNP.

ARTD is currently finalising the report on the findings of 2012 CSIS and preparing to administer the 2013 survey.

Evaluation of long-term housing and support projects published

July 2013

Housing NSW has published the report from ARTD’s evaluation of four long-term housing and support projects funded under the NSW Homelessness Action Plan (HAP), which set the direction for state-wide reform of the homelessness service system to achieve better outcomes for people who are homeless or at risk of homelessness. ARTD’s evaluation focused on the processes and outcomes achieved for the service system through four different project models. The project included a literature review on the supportive housing model, online surveys of project stakeholders, site visits to collect in-depth data, analysis of monitoring and administrative data and a cost-effectiveness analysis. The findings will inform future homelessness planning.

Findings on ADHC autism programs published

May 2013

Ageing, Disability and Home Care (ADHC) has released the high-level findings from ARTD’s evaluation of four models of early intervention for children with autism aged 0 to 5 years and their families. The four programs—Footprints (Autism Behavioural Intervention NSW), Building Blocks, More Than Words and Autism Pro (Aspect NSW)—were funded through Stronger Together: A new direction for disability services in New South Wales, 2006–2016. All of the programs were designed in line with best practice principles, but they varied in terms of their philosophical orientation, delivery (mode, intensity, duration) and the intervention focus (child or parent).

The evaluation compared the outcomes for children across the four programs, as well as the cost-effectiveness of the programs, drawing on quantitative and qualitative data. The findings showed that children who took part in one or more of the four programs—within the context of receiving other services, supports and therapies through other sources, including the Helping Children with Autism Package—demonstrated improved skills, abilities and behaviours. Their parents’ knowledge, understanding and ability to cope also increased. The evaluation will inform ADHC’s decisions about the future direction of programs to support children with autism and their families.

This work builds on our other work in the autism area, in particular the evaluation of the Helping Children with Autism Package for the Commonwealth Department of Families, Community Services and Indigenous Affairs, and adds to the evidence base about effective supports for children with autism and their families.

ARTD Sponsors IPAA Collaboration Award

May 2013

ARTD is continuing to champion collaboration in government by sponsoring the NSW Institute of Public Administration Australia’s (IPAA) Collaboration Award again this year. The Collaboration Award recognises individuals who

– have established effective joint working arrangements that bring together colleagues from multiple teams, departments or other organisations

– can clearly identify and communicate the benefits of a collaborative approach

– have delivered improved, coordinated and efficient services.

Our Principal Consultant Sue Leahy presented the award to the 2013 winner, Dion Peita, at a special reception after IPAA’s Special Forum: Reforming to Create Value on Wednesday 8 May. Dion was recognised for his innovative work leading a collaboration between the Australian Museum and the Fairfield Office of Juvenile Justice NSW.

Concerned with the over-representation of Pacific youth in the NSW criminal justice system, in 2008, Dion began exploring the potential for the Australian Museum to help at-risk young people stay out of jail. Juvenile offenders of Pacific background now have the chance to access the Museum’s cultural collections as part of special programs. This has been achieved through collaboration between the Museum and organisations that provide services to at-risk young people. As a result of Dion’s vision, hundreds of young people have gained awareness of their cultural background through handling cultural artefacts and engaging with cultural experts, discussing issues around cultural identity and artistic expression.

ARTD's Indigenous cadet completes summer internship

March 2013

Natalie Ironfield recently completed a three-month cadetship with ARTD through the CareerTrackers Indigenous Internship Program. Our consultants took her through the business of evaluation and research from project design to data collection and analysis. Michael Combs, Founder and CEO, thanked ARTD for sponsoring the internship and said, ‘It is vital that we continue to break down barriers and encourage reconciliation in Australia and I believe professional employment is one key element in realising this goal.’ Natalie has now returned to the Australian National University in Canberra, where she is completing her studies in International Relations. We hope to continue our relationship with Natalie in the future and wish her well with her studies this year.

Evaluation of the Australian Volunteers for International Development (AVID) program

March 2013

AVID places Australian volunteers with organisations in 37 countries as part of the Australian Government’s foreign aid contribution. Growing volunteer numbers and a recent restructure of the AVID program in AusAID made this an opportune time for an evaluation, and in 2012 the Office of Development Effectiveness (ODE), AusAID, commissioned ARTD to evaluate AVID to inform future directions. The evaluation, led by Andrew Hawkins, started with a review of international peer-reviewed and grey literature, recently published on the ODE website. Because the scope of the AVID program is so large, ARTD’s work focused on how the program was working in three case study countries: the Solomon Islands, Vietnam and Cambodia. Andrew, with Emily Verstege, Ofir Thaler and ODE representatives, held more than 120 interviews with host organisations, volunteers and AusAID staff during field visits to the case study countries. Other methods included analyses of all articles about AVID published in the Australian media in 2011–12, mapping volunteer assignments in the case study countries against AusAID priorities, analysis of end-of-assignment reports filled out by host organisations and volunteers, and a survey of host organisations. It also drew on a survey of returned volunteers (2006 to 2011) carried out by ORIMA Research. ARTD’s evaluation report will be finalised by June 2013.

We have started the Evaluation of the National Partnership Agreement on Preventive Health

March 2013

The Australian National Preventive Health Agency (ANPHA) has commissioned ARTD Consultants, in partnership with the Prevention Research Collaboration, University of Sydney, to undertake this large two-and-a-half year national evaluation. The National Partnership Agreement on Preventive Health (NPAPH) is an initiative of the Council of Australian Governments (COAG) and the largest ever national investment in preventive health—it aims to achieve better health for all Australians. The National Evaluation of the NPAPH will be conducted in several stages from January 2013 to June 2015. The evaluation will focus on assessing the benefits of the partnership approach for delivering preventive health initiatives across the country and review evidence of the impact of the NPAPH in all jurisdictions.

Evaluating capacity building and organisational change for achieving health improvement goals

December 2012

Principal Consultant Dr Margaret Thomas recently presented a paper at the European Conference of the International Union for Health Promotion and Education, held in Tallinn, Estonia. Margaret’s paper offered a theory-based approach to evaluating capacity building and organisational change. For her presentation she drew on ARTD evaluation projects where there was a focus on assessing the outcomes of capacity building and organisational change efforts for health improvement goals. See the conference website for more information.

Advice on State of the Public Sector in NSW report

December 2012

The Public Sector Employment and Management Act (PSEMA) requires the Public Service Commissioner to present a report on the state of the public sector in NSW to the Premier each year. While other Australian and international jurisdictions have been producing these kinds of reports for some time as an accountability measure, the NSW Public Service Commission (PSC), as a new and independent body, wanted to use its reports as a communication tool and to position itself to best achieve its aims in a changing public sector context. Over six weeks, ARTD and our partners (academics from Monash University and visual communications specialists from Equation) worked with the PSC to define the strategic directions for State of the Public Sector reporting in NSW to meet legislative and other objectives, and then define the message, structure, content and design of the 2012 report.

ARTD provided the NSW PSC with a work plan that reflected the tight time frames for the 2012 report. The work plan also balanced the objectives of achieving best practice reporting whilst minimising the burden of data collection for agencies and staff.

The report, titled How it is: state of the NSW public sector report 2012, was released in November 2012.

HCWA Package evaluation findings at the PsychDD conference

November 2012

ARTD evaluated the Department of Families, Housing, Community Services and Indigenous Affairs’ (FaHCSIA) components of the Commonwealth Government’s Helping Children with Autism Package over three years from 2009 to 2012. The evaluation findings about the Package, which centred on a new mechanism for funding early intervention services for children with autism—$12,000 of funding for families to use with services on an approved panel of providers before their child turns seven—provided some important learnings for individualised funding approaches for children with disability and for supporting families. Given this, ARTD and FaHCSIA were asked to present at the PsychDD conference on November 30 in Sydney. The presentation covered the evaluation findings and how FaHCSIA had responded to the suggested refinements. See the Department's website for the summary and full reports from the evaluation and the management response.

Community Builders: first monitoring reports complete

October 2012

ARTD has recently completed the first monitoring reports for the NSW Department of Family and Community Services’ Community Builders program. This program funds community organisations for activities that build community strength and capacity.

Using the monitoring system we developed and piloted with services, we have produced state, regional and individual service level reports.

The state and regional level reports provide an overview of the program’s scope, reach and outcomes. They are an important tool for funders to understand how the program is working and to inform decision making.

The individual reports—all 476 of them—provide each service with data on their service delivery and feedback from clients, along with average scores on performance measures for the region and the state. These reports are a useful resource for services to understand how they are travelling and to consider further improvements to services to reduce community inequality and disadvantage.  

Connecting to Country evaluation

October 2012

Connecting to Country is a new and innovative program for teachers and principals in New South Wales funded by the Commonwealth Government under the Council of Australian Governments’ (COAG) Closing the Gap strategy. It focuses on developing teachers’ and principals’ willingness and ability to establish relationships with Aboriginal students through learning about their cultural, linguistic and family backgrounds, and then using that knowledge to inform classroom practice and pedagogy.

The program begins with a three-day cultural immersion workshop—Being Culturally Aware, Becoming Culturally Inclusive: A Pathway to Cultural Competence—developed by the NSW Aboriginal Education Consultative Group (AECG) and implemented by Regional and Local AECGs. Following this, participants attend a two-day professional learning workshop to build on and strengthen their capacity to plan, develop and implement culturally inclusive programs and school leadership practices.

The NSW Department of Education and Communities has engaged ARTD to evaluate the program’s effectiveness, strengths and challenges, and the impact it has had on principals and teachers, Aboriginal students, their families and communities. The evaluation will contribute evidence for the future sustainability of the program. The evaluation is due to report in March 2013 and the findings will be made available to interested stakeholders, including the Commonwealth Department of Education, Employment and Workplace Relations, the NSW Aboriginal Education Consultative Group Incorporated, the NSW Teachers Federation, the Primary Principals’ Association and the Secondary Principals’ Council.

Evaluation of Lifeline Online Crisis Support Chat

August 2012

In 2011, Lifeline trialled an online crisis support chat service that aimed to replicate Lifeline’s 13 11 14 telephone crisis service in an online environment. We worked with Lifeline to evaluate the trial at the three sites using data from the online counselling system, analysis of chat transcripts, and interviews and focus groups with crisis workers. Lifeline used the results of the evaluation to further develop the Crisis Support Chat Service, and the service was recently launched as a permanent, nationwide service, as announced in the Sydney Morning Herald.

Pro bono work for Kool Kids


August 2012

The Kool Kids Club, delivered by Weave, runs free after-school and holiday activity programs in La Perouse and surrounding areas. Two of our consultants supported the Club by developing a program logic describing the outcomes for children, their families and community, and by evaluating the activities. The evaluation found the Club is delivering an age- and culturally-appropriate program of activities consistent with the program logic, and that it is valued by children, families and the local community. We are now mentoring staff to develop and implement tools to collect data on outcomes so the Club can monitor its own work.

Andrew Hawkins presentation on evaluating online services now available online

August 2012

ARTD Senior Consultant Andrew Hawkins’ presentation to the NSW Australasian Evaluation Society (AES) meeting in Sydney in May 2012, an introduction to evaluating websites and online services, is now available online. Those who missed out but are keen to learn more about the basic concepts for evaluating websites—search engine optimisation, accessibility, use and useability—and relevant links and resources can find out more here.

ARTD a key contributor to the Australasian Evaluation Society 2012 conference

August 2012

We recognise the importance of keeping up to date with evolving research methods and collaborating with our professional colleagues to share learnings. For this reason, ARTD is again a proud sponsor of this year’s Australasian Evaluation Society Conference, which will be held in Adelaide on 27–31 August.

This year’s conference is about Evaluation in a Changing World. Three of our consultants will be grappling with the implications of an evolving context for our evaluation approaches and methods, presenting papers at the conference.

Senior Consultant Andrew Hawkins will present Evaluation and Government 2.0.

August 2012

One of the most obvious ways in which the world is changing is the degree to which the world’s information is being created, organised and disseminated using the internet. Government sponsored services are also increasingly delivered over the internet. As evaluators are information workers trying to understand government services it follows that evaluators should have some understanding of the internet and what makes for an effective online service. Andrew will discuss what you would need to know if you were asked to evaluate a service delivered partially or wholly over the internet.

Consultant Narelle Ong will present Health service policy uptake: evaluating and influencing change through an action research approach.

August 2012

The presentation will describe an evaluation of a complex mental health policy, which is part of a high-profile national initiative (the National Perinatal Depression Initiative). Narelle will talk about the ARTD team’s experience of using an action research approach to influence change across a large and complex system of approximately two hundred health services.

To find out more about the conference visit the AES 2012 conference website.

ARTD sponsors IPAA Collaboration Award

August 2012

Collaboration is increasingly a key part of effective government responses to complex issues such as homelessness. ARTD is delighted to be sponsoring the NSW branch of the Institute of Public Administration Australia’s (IPAA) 2012 Collaboration Award. The Collaboration Award recognises individuals who

  • have established effective joint working arrangements that bring together colleagues from multiple teams, departments or other organisations
  • can clearly identify and communicate the benefits of a collaborative approach
  • have delivered improved, coordinated and efficient services.

Because we know how important collaboration can be to effective service responses in a range of areas, we are proud to be involved in recognising the work of public sector professionals who have achieved excellence in collaboration.

The awards will be presented at a cocktail reception after the IPAA NSW State Conference on Thursday 9 August.

Evaluation of the U-Turns for Youth program

June 2012

Over the last year, ARTD has supported Bankstown City Council with its evaluation of the 'U-Turns for Youth' program. This program works with local youth who are at risk of disengaging from school or getting involved in motor vehicle crime, providing them with automotive and life skills workshops to support their engagement in education, training and employment and their access to youth support services. ARTD began by working with the project manager to develop an evaluation framework and plan for the program, including data collection methods and tools the Council could use to collect sufficient data for a program evaluation. The Council then commissioned ARTD to analyse the data it collected and write the final evaluation report in June 2011. The report will be used to support and assist other local government areas to implement similar initiatives.

Margaret Southwell of Bankstown Council had the following to say about working with ARTD: 

“When we developed the idea of running the ‘U-Turns for Youth’ program across Bankstown and Canterbury we knew we needed to collect a broad range of data to assess the impact of the program. ARTD were able to work within our budget and time constraints to guide us in the development of appropriate data collection tools and then mentor our project manager through the process of collecting the data. The final evaluation report was professional, in depth and delivered on time. I look forward to working with ARTD again!”

ARTD Review of Respite Support for Carers of Young People with Severe or Profound Disability

June 2012

ARTD recently evaluated FaHCSIA’s Respite Support for Carers of Young People with Severe or Profound Disability. The program provides immediate and short-term respite to carers of young people with severe or profound disability and facilitates access to information, respite care and other support or assistance. The evaluation found the program is well targeted and is making a real difference in the lives of carers and their families. Senator Jan McLucas said the findings of this evaluation will help inform the design and development work needed for the National Disability Insurance Scheme.

Evaluating websites and online services

April 2012

The internet is increasingly being used to support and deliver social policy and programs by governments around Australia and internationally. Our Senior Consultant Andrew Hawkins has managed several evaluations of online services, including evaluations of Gambling Help Online (Victorian Department of Justice), the Lifeline Online Crisis Support Chat Trial (Lifeline Australia) and the Sage Centre (NSW Family and Community Services).

Do you need to commission an evaluation of an online service? Have you been asked to deliver one? Would you know which questions to ask to effectively assess the online service and how to answer them?

Following his presentation on evaluating online services at the American Evaluation Association (AEA) annual conference in Los Angeles, California, in November 2011 and his webinar for the AEA in March 2012, Andrew is presenting an introduction to evaluating websites and online services at the NSW Australasian Evaluation Society (AES) meeting in Sydney on 24 May 2012.

The presentation will address the four basic components of a successful website or online service: search engine optimisation, accessibility, use and useability. The demonstration will explain basic concepts, showcase tools, and provide links and resources for those wishing to commission, or try their hand at, evaluating a website.

For more information about Andrew’s presentation, please contact Ben Barnes at the AES on 0423 208 676.

ARTD review of the School Based Management Pilot published

February 2012

Client: Department of Education and Communities

Timing: August 2011–October 2011

The pilot

Forty-seven schools of various types and sizes, located across NSW, have been involved in the pilot over two years, concluding at the end of 2011. The School Based Management Pilot allows schools increased flexibility and authority in decisions about their management, including human, material and financial resources. All 47 schools in the pilot are engaged in managing both staff and financial resources in new ways.

The Independent Review

ARTD Consultants was selected to undertake an Independent Public Review of the pilot in August 2011. The review encompassed the 47 schools in the pilot and particularly aimed to obtain the views of principals and to capture their experiences of the pilot. Based on the terms of reference, the review concentrated on four key areas:

  1. Responsibility, authority and decision making
  2. Information and systems
  3. Risk management and accountability
  4. Capacity, capabilities, cultural and organisational change.

ARTD Consultants collected primary data through interviews with members of the School Pilot Oversight Group, a survey of pilot school principals and interviews with 19 principals and a small number of Parents and Citizens association (P&C) representatives. We reviewed data and information available from the internal evaluation of the pilot, conducted a focus group with five Department staff and analysed State Office data on pilot initiatives.

Publication of findings

The findings, conclusions and recommendations of the Independent Review have been published in a full and summary report available on the Department of Education and Communities website.

ARTD developing the evaluation framework for the NPAPH

February 2012

ARTD has recently been awarded the contract to develop an Evaluation Framework for the National Partnership Agreement on Preventive Health (NPAPH) for the Australian National Preventive Health Agency (ANPHA).

We will be working in partnership with colleagues from the Prevention Research Collaboration, University of Sydney. The project will include consulting with government preventive health departments and non-government organisations around Australia. 

Evaluation of online services and websites - AEA 2012

November 2011

Governments are increasingly using the internet to support or deliver social policy and programs, and it is important that we as evaluators have the knowledge and tools to effectively evaluate websites or online services. Our Senior Consultant Andrew Hawkins has managed several evaluations of online services, including evaluations of Gambling Help Online (Victorian Department of Justice), the Lifeline Online Crisis Support Chat Trial (Lifeline Australia) and the Sage Centre (Ageing, Disability and Home Care). Andrew is presenting a paper on the evaluation of websites and online services at the 2011 American Evaluation Association annual conference in Los Angeles, California.

ARTD work with Community Builders program

September 2011

ARTD has developed an evaluation framework, strategy and monitoring system for the NSW Department of Family and Community Services’ Community Builders program, which funds community organisations for activities that build community strength and capacity. The Community Builders program has been developed in response to an emergent body of evidence suggesting that community strengthening is an effective way of reducing inequality and disadvantage.

The evaluation framework provides a basis for evaluation and monitoring. It is shaped by the program’s aims and objectives, informed by evidence about what constitutes a strong community, and structured by the program logic for Community Builders. The evaluation strategy aims to generate an evidence base for the program that can inform decisions by stakeholders at different levels. It sets out important strategic issues, key questions for evaluation, and feasible methods to address them.

ARTD has worked in collaboration with Community Services and the Local Community Services Association (LCSA) in developing the framework and monitoring system. Principal Consultant Sue Leahy recently co-presented with these partners at the 2011 LCSA conference on the approach to and progress with this work.

Our work with Camp Quality

March 2011

ARTD recently worked with Camp Quality on a major national research project to find out more about the needs of children with cancer, their siblings and parents throughout their cancer journey, and how Camp Quality and other cancer support organisations could better meet these needs.

The project involved focus groups with Camp Quality children with cancer, their siblings and parents at 18 camps across Australia, a survey of Camp Quality families, an online survey of relevant health professionals, and a literature review. Camp Quality is using the findings of the research to inform further development of the programs they deliver to support families around Australia.

ARTD published in the Australian Journal of Primary Health

March 2011

ARTD Consultants Margaret Thomas and Narelle Ong recently worked with the Clinical Education and Training Institute, Rural Division (previously the NSW Institute of Rural Clinical Services and Teaching) to evaluate the Rural Research Capacity Building Program. The NSW Health Framework for Capacity Building—which highlights elements of workforce development, organisational development, resource allocation, leadership and partnership—was used to guide both the development and implementation of the program and the program evaluation. As part of the Institute’s overall evaluation, ARTD interviewed health professionals in the Program and analysed the data to assess the Program’s effectiveness in improving research capacity among rural health professionals, and its impact in their workplaces. The evaluation also identified barriers to research capacity building in rural NSW health services and possible improvements to the Program. The paper they co-authored with our clients, Dr Emma Webster and Ms Linda Cutler, has now been published in the Australian Journal of Primary Health, Volume 17, No 1, 2011, pp. 107–113.

ARTD report tabled in Parliament

March 2011

ARTD completed the five-yearly legislative review of the Medicare Provider Number legislation on behalf of the Department of Health and Ageing in 2010. The review involved extensive consultation with the sector, in particular medical colleges, professional bodies and relevant government officers. The review report was tabled in Federal parliament in February and is available on the Department of Health website.

Writing at ARTD

March 2011

We take our writing seriously at ARTD. We know our clients need reports that clearly and concisely answer their key questions. Our staff have trained with Mark Tredinnick, author of The Little Black Book of Business Writing. You can find out more about Mark’s writing and consulting work on his website.

We also have internal processes to ensure the quality of our written documents. ARTD’s Principal Consultants sign off on all reports, and we have an experienced editor on staff.

Assessing cost-effectiveness in evaluation

February 2011

Cost-effectiveness and cost-benefit analysis are an increasingly important part of our work in evaluation. To build on our existing skills (and make sure we’re on top of new methods and issues), one of our Senior Consultants, Andrew Hawkins, recently attended a short course on Advanced Methods of Cost-Effectiveness Analysis run by the Health Economics Research Centre of the Department of Public Health, Oxford University. The course covered both theoretical concepts and practical exercises in analysing health outcomes data, collecting cost data, discounting costs, dealing with missing data, bootstrapping sample data, constructing decision trees, Markov modelling, and calculating and interpreting incremental cost-effectiveness ratios (ICERs) and the net benefits of health care interventions.