News & Blog

#BeTheChange2017 for dementia


October 2017

By Director, Jade Maloney, and Consultant, Melanie Darvodelsky 

There are more than 410,000 Australians living with dementia, about 26,000 of whom have younger onset dementia (a diagnosis before the age of 65). By 2025, this is expected to increase to over 530,000. Dementia is the second leading cause of death in Australia and the greatest cause of disability in older Australians.

This is clearly one of the most significant policy problems facing Australia and the world today – both in terms of finding a cure and creating the systems and culture that enable people with dementia to live well.

The National Dementia Conference set the challenge to be the change now. The opening keynote speaker, Christine Bryden (an author and dementia advocate), called on everyone to be the change in our homes, communities, workplaces and industries.

How? The key learnings from the conference were clear: listening to people living with dementia, co-designing service models, increasing choice and control, enabling rather than doing for, and harnessing technology are the way forward.

Speakers living with dementia had a strong voice at the conference. Kate Swaffer (CEO and co-founder of Dementia Alliance International) kicked off Day 2 by crushing the misleading myths that people with dementia are ‘empty shells’, can’t communicate and can’t live positively. She showed how people with dementia are driving change around the world. Trevor Crosby spoke about living well through acceptance, early diagnosis, positive thinking and getting on with life. Brace Bateman brought a peashooter to the stage to demonstrate finding ways to be joyful and spoke of the positive reception he had when disclosing his diagnosis.

However, as one audience member with dementia shared, not everyone finds acceptance. Dementia Australia surveys reflect this.

The language we use matters. Both Dennis Frost (an advocate living with dementia) and Dr Sam Davis identified the language around dementia that marginalises and minimises the views of people living with dementia.

On Day 2 audience members with dementia set change in motion in real time with the introduction of a person with dementia to the panel on rights, risk and autonomy. The panel identified a need for a different way of thinking about duty of care and dignity of risk – recognising that duty of care includes listening to and respecting the person with dementia, not only managing risk. As Madeline Gall (CEO of Lifeview) said, duty of care doesn't mean wrapping a person with dementia in cotton wool the minute they walk into a residence. Providers should understand the individual's risk appetite and respond.

The need to find ways to increase choice also came through in Dr Craig Sinclair’s (University of Western Australia) session on substitute or supported decision-making. He noted that there has been much legal and ethical debate but that this has been missing the voice of people with dementia and that limited practical tools have been developed to support decision-making. Research with people with dementia identified the desire to maintain as much choice as possible, but that decisions are also made in a relational context, with family members.

Carers of people living with dementia spoke about the barriers they experience in supporting their loved ones to live with dignity, and as independently as possible for as long as possible. They called for attitudinal change to enable increased independence and quality care.

Creating Dementia Friendly Communities is one way to be the change, as Kate Swaffer and Dr Kathleen Doherty identified. Dementia friendly is “people friendly” and can have benefits for the broader community.

We're proud to be working with Dementia Australia on a developmental evaluation of the national Dementia Friendly Communities program. The program builds on the evidence from existing dementia friendly community initiatives and has been co-designed with people with dementia.

We believe evaluation can help be the change if it truly involves people with dementia. As several conference presentations identified, evaluation can help to identify what works and potential improvements, and guide future implementation.

How program logic can work for NGOs


October 2017

By Director Jade Maloney and Adelaide-based Senior Consultant Jane Ford

It might sound academic, but program logic is a very practical tool to help strengthen your program design and show how it works. This is important for NGOs competing for government funding, as they are increasingly being asked to set out the evidence base for their programs and a plan for evaluation.

On a wet and wild Wednesday in Adelaide, Jade Maloney and Jane Ford from ARTD and our South Australian Associates, Sharon Floyd and David Egege, met with a group of NGOs to discuss how to put program logic to use. These are our top tips.

  • Develop both a program logic and a theory of change. What's the difference? A program has only one logic, which takes the form of a diagram. But it can have many theories – for the program overall and for how you expect to move between activities and outcomes. The theory of change comes in the form of a story.
  • Reality check. Strengthen your program design by asking: What reason do we have to believe what we’re doing will lead to the change we want to see? Don’t be afraid to look at the research literature and social science theories which can provide a logical framework for your program’s activities. If you want to change individual behaviour, consider the theory of planned behaviour (Ajzen & Fishbein, 1980) or stages of change (Prochaska, DiClemente & Norcross, 1992). If you want to create community level change, consider theories relevant to the change you’re aiming to create or theories of community development.
  • Keep the big picture in sight. It’s true that your organisation alone can't be held accountable for high-level policy goals (such as increasing inclusion of people with disability or reducing youth unemployment) because other organisations and environmental factors will also help to shape these outcomes. Some people see this as a reason to exclude high-level goals from your logic. But we suggest you need to include them so you can map out a plausible pathway to contribute to long-term goals, counter factors that will negatively impact on achievements, and take advantage of complementary partnerships.
  • Be specific and think in outcomes. What are the outcomes you are aiming to achieve at each step along the way? What would these look like in real life? If you’re not specific, and think in outputs rather than outcomes, you won’t be able to measure your program’s achievements and make adjustments if needed to stay on track.
  • Refine in real time. The diagram is not the destination. Contexts shift. Programs evolve. Updating your diagram doesn't mean you failed. It means you're learning from implementation and will be better placed to succeed.
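One way to reality-check the links in a logic model is to treat it as data and look for outcomes with no supporting outputs – the ‘wish list’ items. The sketch below is purely illustrative (the `ProgramLogic` class and the example entries are our own invention, not a standard program logic tool or an ARTD template):

```python
# Illustrative only: a program logic reduced to a minimal data structure,
# with a check for outcomes that no activity's outputs actually support.
from dataclasses import dataclass


@dataclass
class ProgramLogic:
    activities: list[str]
    outputs: list[str]
    # maps each intended outcome to the outputs it depends on
    outcomes: dict[str, list[str]]

    def unsupported_outcomes(self) -> list[str]:
        """Outcomes relying on outputs the program doesn't produce."""
        produced = set(self.outputs)
        return [outcome for outcome, deps in self.outcomes.items()
                if not set(deps) <= produced]


logic = ProgramLogic(
    activities=["run parenting workshops"],
    outputs=["parents attend workshops"],
    outcomes={
        "improved parenting skills": ["parents attend workshops"],
        # depends on an output no listed activity produces
        "reduced youth unemployment": ["job placements"],
    },
)

print(logic.unsupported_outcomes())  # ['reduced youth unemployment']
```

Flagged outcomes aren't necessarily wrong – they may simply need a new activity, a partnership, or an explicit assumption added to the model.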

Want to get under the hood of how a policy or program really generates outcomes?


October 2017

Then it’s time to learn more about realist evaluation.

At ARTD, we’ve long seen the value of applying a realist lens to evaluation. We understand the causal power of programs lies in the mechanisms they fire in certain contexts and that different things work for different people.

We think that a realist lens has huge potential to help policy makers better select appropriate approaches for particular contexts, and to adapt their policies and programs to different contexts. But we know it can be difficult to translate promise into practice.

That’s why we’re sponsoring this year’s International Conference for Realist Research, Evaluation and Synthesis in Brisbane this October 24–26.

Come along to the pre-conference workshop, ‘Introduction to realist evaluation’, with Director Andrew Hawkins on Monday 23 October to explore how policies and programs really generate change and what it means to take a realist approach. Or catch Andrew on ‘Testing realist program theory – quantitative and qualitative impact evaluation’ on Wednesday 25 October (11–11.45am).

If you’re at the conference, come and speak to one of our senior staff. If you can't make it, catch up on conference highlights on Twitter and the ARTD blog.

Adelaide program logic workshops

October 2017

Program logic can help you design a program that works and provide the basis for effective monitoring and evaluation. With the growing focus on outcomes measurement, program logic has a key role to play.

To mark our new presence in Adelaide, we are offering two free program logic ‘taster’ workshops next week. 

Workshop 1

When: October 10, 2–4 pm

Where: Intersect, 167 Flinders St, Adelaide

Register using this link.

Workshop 2

When: October 11, 2–4 pm

Where: Victim Support Services, 33 Franklin St, Adelaide

Register using this link.

Questions: contact 

Is your program logic logical?

October 2017

By Director, Andrew Hawkins

The term program logic implies the model you develop is logical. But is it? Often the links between outputs, short-, medium- and long-term outcomes in a program logic model look more like a wish list than something that would logically follow from program activities.

We need to put the logic back into program logic so it can help us design programs that work and evaluate programs effectively. To do this, I think evaluators need to do five things:

1. Recognise the difference between program logic and theory of change. A program has one logic but can have many theories covering the program overall and different levels of the program logic.  For example, in a parenting program, there may be a theory about why improving certain mothers’ parenting skills in a certain way will lead to better parenting, but there may also be a theory as to why distinct approaches to marketing the program will work differently for mothers depending on their circumstances, motivations, media consumption etc.  

2. Understand and use theories of causality that are useful for different purposes. There are three main theories of causality that evaluators might draw on:

  • Often program logics employ a successionist theory of causality where programs are reduced to a series of cause and effect relationships in a chain. But social programs are more complicated than this suggests.
  • A configurationalist theory of causality is often most useful for program logics in terms of social intervention. This theory of causality recognises that a range of things need to come together in a ‘causal package’ to bring about a change. It’s like baking a cake: for success, you need the right combination of ingredients, mixed in the right way, and placed in the right context (i.e. an oven at the right temperature).
  • If you want to understand how and why change happens, you are best off with a generative theory of causality, as a realist would use. A generative theory of causality explains the underlying mechanism of change, for example, how the ’rising agent’ in the cake works. You wouldn’t generally represent this in the program logic model, but would describe it in a narrative, table or other figure that sits alongside the program logic.

3. See programs as arguments with premises and a conclusion. A program seen through the lens of informal logic is an argument. The premises in the argument are the outputs of program activities (as well as our assumptions), and the conclusion is the intended immediate or short-term outcome. In a good argument, if all the premises were true and assumptions held, the short-term intended outcomes would follow with a high degree of probability. Theory can provide an underlying ‘reason to think this will work’. This is a special case of the broader class of ‘warrants’ that provide reasons to accept an argument about a course of action.

4. Evaluate the logic of program design, implementation and outcomes. This may require three stages:

  • Does it make sense on paper? If all the necessary conditions were achieved and our assumptions held, would these be sufficient to achieve our immediate- or short-term intended outcomes?
  • Was it effective? Seek evidence to empirically test premises to determine if the argument is well-grounded. Did these conditions together form a causal package that was sufficient to achieve our intended immediate- or short-term outcome?
  • Was it efficient? Was each condition actually necessary, or could the program be streamlined?

The empirical evidence generated by evaluation methods can be used to refine the logic and theories. This can inform not only further implementation of the program under study, but the design and implementation of future programs of a similar nature.

5. Be realistic about what evaluation can usefully measure. We know that programs and policies are not always fully developed when they are implemented. It is also common that a program will be sufficient for an immediate- or short-term outcome, but will only contribute to longer-term outcomes alongside external factors. Taking a logical approach, we can conduct a very cost-effective evaluation by focusing on the most important parts of a program, given the maturity and current state of knowledge about the value of the program. This will ensure we really do generate evidence and insight to inform decision-making.


Weave wins mental health award


September 2017

Last night, Weave Youth and Community Services won the Aboriginal Social and Emotional Wellbeing Award at the Mental Health Matters Awards for its project Stories of Lived Experience. This included a photography exhibition, a documentary and an evaluation.

In 2015, ARTD developed the qualitative evaluation. We spoke with four generations of families and over 50 clients who have been connected to Weave since its beginning in 1975 to understand what is most useful about how Weave works, the difference Weave makes for clients and the community, and how Weave and the sector can improve support in this area. Thank you to all the participants and stakeholders for entrusting us with their views and experiences to form such a great evaluation project.

The Mental Health Matters awards are held by WayAhead (the Mental Health Association of NSW). The Aboriginal Social and Emotional Wellbeing Award recognises programs, projects, people or initiatives that aspire to and enhance the social and emotional wellbeing of Aboriginal communities in NSW. 

Brick walls or byways to evaluation use


September 2017

By Director, Jade Maloney

Last week we looked at how Australasian Evaluation Society members have successfully overcome some of the obstacles to evaluation use they have encountered. This week we look at the main brick walls to evaluation use that my research with Australasian Evaluation Society members identified, and some byways around them.

Politics: Unsurprisingly, given the intertwining of politics and policy, ‘politics’ was the most commonly identified brick wall to the use of evaluation findings. However, some had found byways around political roadblocks. These were:

  • keeping the report in the bottom drawer until the context is right
  • arming communities with evaluation findings so they can encourage governments to act or to negotiate alternative sources of funding when required
  • using the evaluation to speak truth to power if the internal context is right
  • focusing on conceptual instead of instrumental use – the knowledge that can be applied in other projects.

Resourcing: Linked to the political roadblock, lack of resourcing was also commonly identified as a brick wall to the use of evaluation findings. When governments change, priorities can also change and resources become unavailable. Lack of resourcing can also limit the take-up of findings from pilot program evaluations. However, equipping communities with evaluation findings can help them attract other sources of funding.

Timing: Some evaluators reported that delays between a report being written and being released can limit the relevance of findings for use. In contexts with staff on short-term contracts or on rotation, staff may have moved on, and the project may have ended, by the time findings are available.

More broadly, limited timeframes for evaluation can limit the type of questions that can be answered by an evaluation and, thereby, the ways in which it can be used. In this case, it was important to settle on reasonable indicators.

Cross-agency collaboration: Two evaluators involved in cross-agency initiatives said that the blurred lines of responsibility in these contexts could create roadblocks to evaluation use. Cross-agency response systems are required to address this.

Dissemination of findings: Lastly, some evaluators identified a lack of processes for broadly disseminating evaluation findings as a barrier to use. This is particularly important given evidence of the need for an accumulation of evaluation evidence before change occurs. However, the increase in policies requiring government agencies to publish findings is shifting this roadblock (although this may have its own implications for the type of learnings that get reported). Additionally, one internal evaluator said their agency had established an internal ‘lessons learned’ register to support broader use of findings, while another had established evaluator networks to share learnings. Byways identified by external evaluators included presenting at conferences, specifically identifying broader learnings for policy design and delivery, and negotiating with clients to share general learnings from evaluations with other clients.

Overcoming obstacles to evaluation use


September 2017

By Director, Jade Maloney

An evaluator might weep at the plethora of literature that shows their work going unused. But my recent research with Australasian Evaluation Society members shows evaluators have had success in overcoming at least some of the obstacles they have encountered to use.

Disinterest in, or more active resistance to, evaluation: Some evaluators I interviewed had been able to overcome this by: selling the value of evaluation; engaging stakeholders where they are at and from a ‘what’s in it for me?’ perspective; and involving stakeholders in a dialogue as data is collected to answer their questions.

Fear of judgement: In some cases, resistance to evaluation was related to a fear of being judged and a misconception about evaluation. Some evaluators had overcome this obstacle by convincing stakeholders they were there to help them with their work, using their interpersonal skills to make connections with stakeholders.

In some cases, fear related to evaluation construed as an accountability or compliance mechanism rather than a learning exercise. To address this, some evaluators emphasised learning and strengths-based approaches. Others, however, suggested a need to combine accountability and learning purposes or saw accountability in a more positive light.

Lack of purpose: When they encountered evaluations undertaken as a ‘tick-a-box’ compliance activity or without a clear purpose, some evaluators had been able to help stakeholders find one. They helped them work through the questions they wanted to ask, and who would use the evaluation.

Negative findings: Given evidence of the human tendency to accept information that confirms our preconceptions and to refute information that challenges them (Oswald & Grosjean, 2004), it is unsurprising that evaluators had encountered more resistance to negative findings (and positive findings about programs that were slated to be discontinued). To overcome this obstacle, evaluators prepared stakeholders from the outset for what they might not expect, surfacing biases that needed to be overcome. When negative findings eventuated, they shifted resistance by socialising findings as they emerged, using the positive sandwich approach to frame findings, and/or focusing on lessons learned and solutions. However, it was noted that these strategies can fail when working with organisations that lack a learning culture and when findings are politically unpalatable.

Next time, a look at the common brick walls to evaluation use and whether evaluators have alternative routes around them.

Sue Leahy joins AES Board


September 2017

Our Managing Principal Consultant, Sue Leahy, has joined the ranks of the Australasian Evaluation Society (AES) Board.

Sue is looking forward to working with other Board members to strengthen the role of evaluation in Australasia and shape an AES that continues to meet the needs of its members for professional growth and networking.

It’s an exciting time for the AES, as we contemplate appropriate pathways to professionalisation of evaluation and continue to develop the interest and capacity for evaluation among Indigenous people. 

Congratulations Lex Ellinson – AES Emerging New Talent!


September 2017

Senior Consultant, Alexandra (Lex) Ellinson, was awarded the Emerging New Talent Award at the Australasian Evaluation Society International Evaluation Conference.

The Awards Committee praised the scale of what she has achieved in just four years as well as her commitment to the profession.

Lex has designed and managed large-scale long-term evaluations of youth policy and social housing and worked in partnership with Indigenous communities. Her approach is informed and pragmatic – she has managed projects with quasi-experimental, realist-informed and co-design approaches to suit different contexts and different purposes.  

As Secretary of the AES’ Advocacy and Alliances Committee, she works to promote the use of evaluation and evaluative thinking by Australian agencies and organisations. She is also a member of the Realist Special Interest Group.

Lex discovered evaluation as a profession while studying philosophy and social sciences at university in the early 2000s, and sensed that, “evaluation nicely brings together my academic interest in the rational grounds for making normative judgements based on empirical data, which is a longstanding philosophical question.”

Accepting the award, Lex couldn’t help but demonstrate the reflective thinking that we know and love. She reflected on the inquisitive minds and commitment to doing good that she’d encountered at the conference and her excitement at being part of a profession capable of having such wide-ranging conversations.  At Gill Westhorp’s pre-conference workshop, they’d discussed ‘everything from the ontology of a tea cup to patronage systems in Laos’. 

Congratulations Lex. We’re sure there are more great things to come!

Can evaluation live up to its potential?


September 2017

On a chilly Canberra morning, Sandra Mathison lit a fire under a few hundred evaluators at the Australasian Evaluation Society (AES) International Evaluation Conference.

“Evaluation is a helping profession,” she told us. But it has long underperformed on its promise to support the public good because it is constrained by dominant ideologies, is a service for those with resources, and works in closed systems that tend to maintain the status quo.

Mathison challenged us to think about how we could turn this around, suggesting two ways forward. The first was ensuring those who are meant to benefit from a program have input into evaluation and access to the findings so we are ‘speaking truth to the powerless’. (No, that’s not a typo; the more clichéd ‘speaking truth to power’, she said, was ‘a valiant but often futile act’.) The second was more ‘independent’ evaluation – that is, evaluation independent of funding agencies and perhaps even independent of individual programs in ‘place-based’ evaluation. (Intrigued? Read about the rest of her speech in Stephen Easton’s piece in The Mandarin.)

On Day 2, Richard Weston, CEO of the Healing Foundation, took up the torch and challenged us to reflect on how evaluation could recognise and value Indigenous knowledges and collective as well as individual outcomes. And on Day 3, Dugan Fraser told us that we could conceptualise evaluation not as assessing efficiency and effectiveness, but as supporting democracy – as a means of empowering citizens to interact with power and ask questions.

The concept of ‘speaking truth to the powerless’ clearly cut across the conference sessions and resonated with the audience. Less certain was the response to Mathison’s call for more ‘independent’ evaluation. One member of the audience asked Mathison whether we should not instead be focused on democratising evaluation.

Certainly, what Mathison was proposing is not inconsistent with participatory approaches and design thinking (another big theme of the conference). But it does seem somewhat at odds with Patton’s call in Utilisation-Focused Evaluation to engage all key stakeholders in the evaluation process.

In their session on policy logic, Carolyn Page and Russell Ayres set out the value of bringing all levels of expertise to the table. Our Director Jade Maloney highlighted the importance of engaging stakeholders, forging relationships and communicating well to support use – findings from her research into evaluation use in the Australian context.

Others identified ways to hardwire evaluation into government infrastructure. Economist Nicholas Gruen championed the idea of an Evaluator General. In another session, the Commonwealth PGPA Act 2013 was identified as providing an opportunity if evaluation can take up the challenge – that is, if evaluation can help agencies understand their performance, alongside monitoring and KPIs. This will require better integration across evaluations.

Thankfully, the arguments about hierarchies of evidence that dominated conference discussions a decade ago seem to have fallen by the wayside in the wake of more nuanced discussions about what constitutes good evidence. At the conference dinner, Nicholas Gruen set the challenge for evaluation to ask the right questions and noted that randomised control trials (RCTs) are not the panacea they’re made out to be. (Yes, an economist said this!) The next morning, ARTD Director Andrew Hawkins extended this idea by outlining a framework for choosing different methods depending on the causal questions being asked and the resources available.

All this, and we haven’t even mentioned how realist evaluator Gill Westhorp gave us five different ways to think about classifying mechanisms and prompted us to think more about how we use theory in evaluation at all levels. But there’ll be time for that at the International Conference for Realist Research, Evaluation and Synthesis in October.

So we’ll end with the themes that will stick with us. We need to work meaningfully with beneficiaries, trial and fail fast, share and build on learnings from success and failure, break down silos and engage in cross-disciplinary conversations. If we do this, we believe evaluation will live up to its promise.

Let’s continue the conversation through AES network meetings until we all get together again next year!


aes17 International Evaluation Conference

August 2017

ARTD is looking forward to discussing how to make evaluation a more useful and effective governance tool with other members of the Australasian Evaluation Society (AES) next week at the aes17 International Conference in Canberra.

ARTD is again a sponsor of the event, which runs 4–6 September at the National Convention Centre. This year we’re sponsoring keynote speaker Sandra Mathison, who will examine whether evaluation contributes to the public good.

This year, two of ARTD’s directors are also presenting.

Andrew Hawkins’ presentation ‘Intervention: putting the logic back into logic models’ (Tuesday 5 Sep, 9.30am, Murray Room) will examine the shortcomings of logic models and challenge practitioners to better explain causal mechanisms.

Jade Maloney’s presentation ‘Evaluation: what’s the use?’ (Tuesday 5 Sep, 3pm, Torrens Room) will highlight findings from her new research, which examines the problem of non-use and how practitioners can ensure their evaluations don't end up gathering dust on a shelf.

If you are at the conference, please come and speak to one of our senior staff.

Increasing access and inclusion in Fairfield


August 2017

On Monday 7 August, Fairfield City Council launched their disability inclusion action plan. The plan details what Council will do to encourage positive community attitudes and behaviours towards people with disability, make it easier to get around the community, support access to meaningful employment opportunities and improve access to Council information, services and facilities.

The launch represented the completion of all local council DIAPs in NSW, required under the NSW Disability Inclusion Act 2014. It was also a valuable opportunity to connect the local community with disability service providers, with Council hosting information stalls for local NDIS-registered providers.

Fairfield City Council’s Mayor launched the Plan. The launch was also attended by the Minister for Disability Services, Ray Williams, who explained that disability inclusion action planning is both “unique and unprecedented in Australia. It means that State and Local Governments are working together to ensure people with disability can fully enjoy the opportunities available in our communities.”

ARTD worked with Fairfield City Council to develop their DIAP, using a co-design approach grounded in the principle of ‘nothing about us without us’. We consulted people with disability, their family and carers and local service providers to understand the barriers people face when interacting with Council and getting around the community, and what they saw as the priorities.

To ensure the consultation process was accessible and inclusive, we used accessible venues with hearing loops installed, and drew on bilingual educators to conduct consultations in local community languages. We also worked with our partner, the Information Access Group, to produce an Easy Read consultation guide, as well as the final DIAP in Easy Read format.

We then worked with Council staff and managers to develop a realistic plan of action for the next four years and indicators to track progress. 

You can access Fairfield City’s DIAP on the Council website.  

[Photo by Fairfield City Council, featuring Mayor, Frank Carbone; Fairfield City Leisure Centres trainer Fred Zhao; and the Hon Ray Williams MP]

Automation set to transform the government service landscape – 2017 IPAA Conference

July 2017

A new round of digital transformation and automation is about to transform the government workforce and the way government delivers services.

The 2017 IPAA conference – of which ARTD was a key sponsor – emphasised the need for the government sector to think creatively about how digital technologies will work for staff and clients.

ARTD Managing Principal Consultant, Sue Leahy, says that the clear message at the conference was that while the focus tends to be on the technology, managing digital change starts with understanding the people issues.

Managing change across the sector will mean understanding how digital technologies can be used to improve the workplace or client experience.

We need to put people at the centre of digital transformation. That’s where innovative models of engagement will come in – to ensure we deeply understand what people need from government services.

Artificial intelligence (AI) technology is also advancing rapidly and will soon transform the way we work in government, reducing transactional, information-sifting work and creating new roles. AI has the power to reduce paperwork and change the way we deliver services, but it will also require active change management as people’s roles are disrupted. New roles, such as robot orchestrators, robot teachers, empathy trainers, business ethics reviewers (checking inputs and outputs) and privacy guardians, will be created.

To ensure that we get the best out of the technology on offer, it will be important that we start with a deep understanding of the human aspects of the problems technological change aims to solve.

Co-design creatively boosts access to primary health services for Aboriginal families

July 2017

How do you make Victoria’s Maternal and Child primary health services more accessible and culturally appropriate for Aboriginal families?

That’s the question ARTD, the Victorian Government and Aboriginal people have been grappling with in an innovative co-design project aimed at understanding the barriers that prevent Aboriginal families from using the universal primary health services.

As part of the Victorian Government’s $1.6 million program, Roadmap for Reform: Strong Families, Safe Children, ARTD was commissioned to provide a way for Aboriginal communities to have a voice in designing the service models they need.

The co-design process recognises that Aboriginal people bring experience and expertise about accessing services to the table. It reflects a new approach to designing services for people rather than asking them to fit the mould.

ARTD used a range of narrative-based techniques to deeply understand the needs of the target group before collaboratively generating ideas with stakeholders about the best ways of meeting those needs.

The process produced a service model to support self-determination and deliver high quality, coordinated and culturally safe Maternal and Child primary health services for Aboriginal families that will be trialled in local government, Aboriginal Community Controlled Organisations (ACCOs) and in integrated partnerships. 

Digital disruption and the IPAA NSW State Conference

June 2017

Digital technologies are continuing to transform the way that governments operate and how they interact with citizens and clients of services. Increased mobility, connectivity and data storage capacity provide opportunities to improve services and productivity.

We work with government agencies aiming to harness these opportunities. We can support agencies to ensure their system design is user-centred and accessible to people of all abilities. We can also assist agencies to engage with their clients and citizens through secure online platforms, with the ability to tailor communications and surveys to particular target groups. We have also designed an innovative app for agencies to monitor and capture their clients’ important life outcomes.

To support digital transformation in NSW, ARTD is sponsoring the IPAA NSW 2017 State Conference on 15 June. This year’s theme is ‘Innovating, Emulating and Enabling: How the digital transformation is changing public sector skill sets.’ More than 20 experts will speak on topics such as data security, big data, and how the NSW Public Sector is currently embracing digital technologies. For more details about the IPAA and the Conference, visit:

If you’re at the conference come and speak to one of our staff at the coffee stand.

Find out more about the projects being funded to support the transition to the NDIS

April 2017

Wondering how people with disability and their families, as well as service providers, are being supported to make the transition to the National Disability Insurance Scheme? What about initiatives to grow the market and develop innovative support options that respond to the choices of people with disability?

The NDIA, the Commonwealth and State and Territory Governments are undertaking a range of activities to support the transition. A Sector Development Fund was also established to fund activities that support the transition between 2012–13 and 2017–18. Combined, these activities will assist in the establishment of a flourishing disability support market that offers people with disability choice and control.

To date, a range of initiatives supporting people with disability and their families, providers, and workforce and market growth have been funded through the Sector Development Fund. You can find out more about these activities and access the resources they have produced through ARTD’s profiles of the funded projects, now on the NDIS website.

ARTD’s Independent Review of Special Religious Education (SRE) and Special Education in Ethics (SEE) in NSW has been released

April 2017

In 2014, the NSW Department of Education commissioned an independent review of the implementation of SRE and SEE classes in NSW Government schools ‘to examine the implementation of SRE and SEE and report on the performance of the Department, schools and providers’. The Review was commissioned in response to Recommendation 14 of the Legislative Council General Purpose Standing Committee No. 2 Report No. 38, Education Amendment (Ethics Classes Repeal) Bill 2011 (May 2012), which also specified areas for the review to cover. These became the basis of the Terms of Reference:

1. The nature and extent of SRE and SEE

2. Department of Education implementation procedures for SRE and SEE including: parent/carer choice through the enrolment process and opting out; approval of SRE and SEE providers by DoE; authorisation of volunteer teachers and curriculum by providers

3. Development of complaints procedures and protocols

4. SRE and SEE providers’ training structures

5. Registration of SRE and SEE Boards, Associations and Committees

6. New modes and patterns of delivery using technology

7. Pedagogy, relevance, age appropriateness of teaching and learning across all Years K to 10 and teaching and learning in SEE in Years K to 6 in a variety of demographics

8. The need for annual confirmation by parents and caregivers on SRE choice or opting out

9. Review of activities and level of supervision for students who do not attend SRE or SEE.

 The Review examined the implementation of SRE and SEE in NSW Government schools between December 2014 and September 2015. This report outlines findings related to each Term of Reference and makes recommendations. Because SRE and SEE are quite distinct, they are dealt with separately throughout this report. In the current context, there are polarised views in the community about the place of SRE or SEE in NSW Government schools. While the continuation of SRE or SEE in NSW Government schools is out of scope of this Review, this was a concern for many people and influenced responses to the Review.

 Review methodology

 The Review used a comprehensive mix of methods to collect quantitative data across all schools, and the wider community, as well as in-depth and qualitative data from key stakeholders. The methods were chosen to allow all interested stakeholders and the community the opportunity to present their views so that the findings and recommendations are based on a systematic and balanced assessment. Evidence was reviewed and data collected between December 2014 and September 2015.

 The main methods for the Review were:

 ▪ Document scan. Departmental and provider documents/websites were reviewed, including the 2014 and 2015 SRE and SEE policy and implementation procedures, and the websites of all current providers in December 2014 for their SRE or SEE curriculum scope and sequence documents and outlines.

 ▪ Curriculum review. An experienced education expert conducted a systematic criterion-based assessment of curriculum materials, based on materials from current SRE providers and the current SEE provider.

 ▪ Consultations

Surveys and interviews. Systematic data were collected via surveys of key stakeholder groups. Opportunity to respond was offered to all principals (46% response rate), all SRE and SEE providers (80% response rate), all providers’ SRE coordinators (60% response rate) and all SEE coordinators (48% response rate). SRE and SEE teachers contributed via an online portal. These data were complemented by semi-structured interviews with members of the program evaluation reference group, and with peak provider, education and other relevant groups.

Case studies. To examine how SRE and SEE are delivered in schools at the local level, the Reviewers undertook 14 case studies involving 12 SRE providers from 11 faith groups, and two case studies of the delivery of SEE. The case studies used face-to-face interviews with coordinators, teachers, principals and other stakeholders. They were effective in telling the story of local delivery in very different contexts.

Online community consultation. To collect perspectives from the broader community under the Terms of Reference, online contribution portals for parents/caregivers and other interested parties were set up and accessible for six months. The Review received over 10,000 responses, reflecting the high level of interest in sections of the community. The Reviewers recognise that while the responses reflect significant issues for those who responded, to some degree they reflect the two polarised positions in the community around SRE and SEE, and cannot be considered representative of the whole NSW community. Indeed, the Reviewers are aware that some groups were active in encouraging their constituents to contribute, and in some cases suggested wording.

 Confidence in the findings

Overall, the Reviewers are confident that the findings from these methods reflect the broad patterns of implementation of SRE and SEE and provide a sound basis for addressing the Terms of Reference and making suitable recommendations. The methods were implemented effectively and there was a high degree of consistency between the wider findings from the surveys, the interviews and group discussions with significant stakeholders, and the on-ground findings from the local case studies. The data from the online contribution portals are less balanced and have been used with caution, but they are generally not inconsistent with the other methods and have been useful in raising issues.


The Review made fifty-six recommendations, based on the evidence presented in the report.

You can access the full report on the Department of Education website.

 The release of the Review and the NSW Government’s response was reported in The Guardian, The Sydney Morning Herald, The Australian and the Daily Telegraph.

Youth Frontiers evaluation released


April 2017

The findings of our evaluation of Youth Frontiers—a NSW-wide mentoring program—have been released by the Youth Participation & Strategy Unit in the NSW Department of Family and Community Services (FACS). Overall, the evaluation found the program had an impressive reach across NSW and was achieving positive developmental outcomes for young people. However, it is not making a direct contribution to increasing community participation, so there are some opportunities to strengthen the program’s design.

The report is being used to inform implementation of the program in 2017 and strategic decisions about the program mix that will support FACS' priorities for youth development in NSW. These priorities are set out in the NSW Strategic Plan for Children and Young People.

This report follows on from our evaluation of the initial pilot of the program, the findings of which were used to inform the ongoing roll-out of the program by the four funded providers delivering mentoring to students in Years 8 and 9.

ARTD is continuing to support successful implementation of the program through a monitoring system that includes post-program surveys of mentors and mentees, and reporting on the results.

Planning with people with disability

March 2017

Want to get the voice of people with disability in policy planning or program evaluation, but unsure how to go about it?

In NSW, government agencies and local councils need to develop disability inclusion action plans (DIAPs) with people with disability. More broadly, as policy co-design catches on, government agencies need to make sure consultation processes are accessible and inclusive.

A commitment to inclusion and attention to access upfront and along the way will enable people with disability to shape policies that affect their lives and communities.

When our Principal Consultant, Jade Maloney, worked with Mosman Council on their DIAP, both organisations had a clear commitment to making the process and the final product accessible and inclusive. This meant using accessible venues and working with our partner, the Information Access Group, to produce an Easy Read consultation guide, as well as converting the final Mosman DIAP into Easy Read.

Easy Read presents information in a way that is easy to understand. In its simplest form, it uses images to support text, a large font size and plenty of white space to increase readability. It makes documents more accessible for people with intellectual disability, people with literacy issues and people who speak English as a second language.

“We’re committed to ‘nothing about us without us’”, Principal Consultant, Jade Maloney, says. “But we’re also conscious that it can be very difficult for people with disability to even get into community consultations, let alone share their views. So we need to make sure we provide information in accessible formats, use venues and formats that are accessible, give people time and different options to share their views.

“In recent years, we’ve worked with the Information Access Group on a range of policy development and evaluation projects. We’ve found the Easy Read versions have worked well to break down complex concepts and give people with intellectual disability the space to share their views.

“When we ran the public consultation sessions for the National Disability Insurance Scheme (NDIS) Quality and Safeguards Framework, we also developed a process to make sure we’d covered all access requirements: from physical access for people with limited mobility, to hearing loops and closed captioning for people with hearing impairments, and Auslan interpreters for those who use sign language.

“We’re currently putting this experience into practice again working with Fairfield City Council to develop their DIAP."

You can see what actions Mosman Council is taking to increase inclusion and job opportunities for people with disability, and to make it easier for people to get around in their community, in their DIAP.

Devil in the detail: Why ethics processes for evaluation must be revisited

March 2017

Formal ethics approval processes designed to protect study participants are often causing their own unintended harms in the context of program evaluation.

This is the somewhat controversial message Principal Consultants Sue Leahy and Jade Maloney gave to a full Australasian Evaluation Society (NSW) forum, hot on the heels of their presentation at last year’s AES conference in Perth.

They don’t deny that ethical practice is critical to evaluation; they just question whether formal ethics approval processes are always needed, or whether they always result in an ethical evaluation.

“It’s our experience that the rules around ethics approval are often applied differently across government and that the requirement for external approval can mean the voice of the ’vulnerable’ people accessing the service is not heard in the evaluation because of the timeframes or because the consent process is overwhelming,” Jade says.

When it comes to consulting with Aboriginal communities in NSW, it gets even more difficult.

“While the principles around ensuring Aboriginal communities have ownership of the research in their communities are right,” Sue says, “we question the characterisation of all Aboriginal people as vulnerable. We also find the timeframes for approval through the Aboriginal Health and Medical Research Council can mean that consultation with Aboriginal communities is squeezed out of the evaluation timeframe.”

The main problem seems to be that ethics processes were set up to approve clinical trials, but evaluations are not interventions. That’s why ARTD is driving a discussion in the evaluation community about options to improve the process.

AES members at the forum recognised that ethics approval processes bring significant benefits to the evaluation process, but they agreed change is needed. Over the coming months, ARTD hopes to explore options with colleagues, and to work on guidelines to help governments identify when they need formal ethics approval processes.  

Capturing outcomes with Caretakers Cottage

February 2017

Through the Expert Advice Exchange (EAX), ARTD has worked with Caretakers Cottage to capture the outcomes of their transitional accommodation service, Options Youth Housing.

The EAX, an initiative of the NSW Government’s Office of Social Impact Investment, connects non-government organisations and social enterprises with leading legal and professional services firms, financial institutions and other companies to provide pro bono support on growing and sustaining their impact.

When Caretakers Cottage first came to us, they said they needed some support with strategic planning. But after our Managing Principal Consultant Sue Leahy and Senior Consultant Alexandra Ellinson talked with them further and did a detailed SWOT analysis with the team, we realised that what they needed was to capture their outcomes. So we worked with them on a tool to capture client wellbeing in seven domains, from living skills and education to accommodation, family connections, health, safety and wellbeing. By applying the tool at entry, every three months during the service, and after leaving, caseworkers systematically collect information to assess and document changes. Caseworkers can also use the assessment as an opportunity to have supportive conversations with clients about their wellbeing, and to develop a shared understanding of their progress.
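The tool itself isn’t published, but the data-collection pattern described above (a rating in each of seven domains, captured at entry, every three months during the service, and at exit) could be sketched along these lines. The domain keys, the 1–5 scale and the class names are illustrative assumptions, not the actual instrument:

```python
from dataclasses import dataclass, field
from datetime import date

# The seven wellbeing domains named in the article (keys are illustrative).
DOMAINS = [
    "living_skills", "education", "accommodation",
    "family_connections", "health", "safety", "wellbeing",
]

@dataclass
class Assessment:
    """One assessment: a rating per domain (assumed 1 = low, 5 = high)."""
    when: date
    ratings: dict

@dataclass
class ClientRecord:
    """All assessments for one client, in the order they were taken."""
    client_id: str
    assessments: list = field(default_factory=list)

    def add(self, when, ratings):
        # Require every domain to be rated, so change can be tracked.
        missing = set(DOMAINS) - set(ratings)
        if missing:
            raise ValueError(f"missing domains: {sorted(missing)}")
        self.assessments.append(Assessment(when, dict(ratings)))

    def change_by_domain(self):
        """Change in each domain from the entry assessment to the latest."""
        first, last = self.assessments[0], self.assessments[-1]
        return {d: last.ratings[d] - first.ratings[d] for d in DOMAINS}
```

Applying the tool at each timepoint then reduces to calling `add`, and a caseworker’s progress conversation can be grounded in `change_by_domain`.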

Caretakers Cottage was initially concerned about creating more work for its own sake. But they have found that the tool has helped them to better monitor their work and to focus on what the client needs. As well as increasing staff capacity to consistently assess client wellbeing, it has energised the staff, because they are now collecting evidence that shows the impact of their work.

NDIS Quality and Safeguards Framework released

February 2017

The Framework sets out a nationally consistent approach to quality and safeguards for the NDIS to be introduced in 2018–19. This is needed to ensure that capability is built in the new market-based system, the rights of people with disability are upheld, the benefits of the NDIS are realised, and that people have access to the same safeguards no matter where they live.

The Framework is aligned with the NDIS principle of choice and control and takes a risk based approach. It consists of developmental, preventative and corrective measures targeted at individuals, the workforce and providers. Key components are:

  • an NDIS Registrar to manage the NDIS practice standards and certification scheme, establish policy settings for worker screening, register providers, oversee provider compliance, and monitor the effectiveness of the NDIS market
  • an NDIS Complaints Commissioner to facilitate the resolution of complaints about providers of NDIS-funded supports and investigate serious incident reports and potential breaches of the NDIS Code of Conduct
  • a Senior Practitioner to oversee approved behaviour support practitioners and providers, provide best practice advice, review the use of restrictive practices and follow up on serious incidents related to unmet behaviour support needs.

ARTD worked with DSS, State and Territory Governments and the NDIA to develop the framework. The voice of people with disability, their family members and carers, service providers, advocacy groups and representatives of professional organisations captured through public consultations and submissions informed the framework design.

You can access the framework here. You can also see our report on the consultations here.

Governments have indicated there will also be further opportunities to contribute to the Framework in the design and implementation phases.

Broadening your evidence cost-effectively

February 2017

ARTD Director, Andrew Hawkins, is speaking at the Strengthening Evidence-Based Policy: Improving Program Outcomes and Success conference in Canberra on 21 March 2017. He joins other thought leaders in policy, research and evaluation to discuss how sound evidence can be used to improve policy.

Andrew’s presentation will draw implications from the academic research to meet the needs of policy makers for cost-effective monitoring and evaluation of interventions into complex systems. He will challenge the audience to move away from a set of assumptions about measuring outcomes that are useful when evaluating relatively simple things—like fertilisers, drugs and bridges—towards methods that are more appropriate when dealing with new or developing interventions in very complex systems. Cutting through the complexity, he will provide a step-by-step guide for monitoring and evaluation investment decisions that are appropriate to the nature of the intervention, the questions being asked, and the time and resources available.

Early bird registrations for the two-day conference close on 10 February.