Automating NDIS support planning can dehumanise and harm people living with disability
School of Social Sciences
Seven Australians share their journeys in a documentary uncovering the human cost of algorithmic decision-making in social services.
Relying on algorithms to assist support planning can cause significant harm to people living with disability, says Dr Georgia van Toorn from UNSW’s School of Social Sciences. Dr van Toorn co-produced the documentary I Am Not A Number that explores the lived experiences of people whose access to vital funding and support is being shaped by algorithms.
“Australia aims to be a world leader with a whole-of-government approach to digital innovation,” the political sociologist says.
“Currently, people with disability are at the frontline of this transformation, but protections and accountability are lagging behind. Without accurate ways to capture complex needs and lived experiences of disability, algorithms are not ready to guide critical decisions.”
Dr van Toorn is an Associate Investigator at the ARC Centre of Excellence for Automated Decision-Making and Society. She researches the impact of data analytics and algorithmic decision-making in the public sector, with a focus on social justice implications and community responses.
I Am Not A Number, directed by Jeni Lee, explores seven people’s experiences of the National Disability Insurance Scheme (NDIS). The NDIS funds “reasonable and necessary” disability support and services to help people with a permanent or significant disability achieve their goals.
The scheme was intended to deliver tailored, participant-led support, but the reality is a different story, Dr van Toorn says.
“The NDIS uses data to profile and predict people’s needs, but this kind of profiling strips away nuance, over-simplifying and neglecting critical dimensions of people’s lives and their experience of disability.”
Disabled people are a diverse group; living with a disability often involves managing unique and complex needs that do not fit standard systems or templates, she says.
Erin McGrath joined the NDIS after a significant multiple sclerosis attack that brought on secondary narcolepsy. “I am also autistic, I have ADHD, and I have complex post-traumatic stress disorder,” she says.
“The NDIA [National Disability Insurance Agency that runs the scheme] see a diagnosis, and that’s all they see. They don’t see what that diagnosis does to people … Algorithms don’t see it either because AI is not intelligent enough.”
Paris Zarmairian, an NDIS participant and support coordinator, has seen many autistic people receive an almost identical support plan, which she attributes to an algorithm.
The inflexibility of algorithmic decision-making, combined with high workloads and limited training and support, means both the human and machine elements of the NDIS can contribute to a lack of personalisation, says Dr van Toorn.
“The NDIS is founded on personalisation. While standardisation has a role, algorithms should not be trialled in ways that put critical supports at risk. We should have learnt that lesson from Robodebt.”
Dr van Toorn says: “Tech design is not just about coding and getting the engineering right; it's about having meaningful engagement with people who are going to be experiencing these systems.”
I Am Not A Number provides a platform to promote the expertise, lived experience and advocacy work of people living with disability, she says.
Various online networks and advocacy efforts have emerged in response to the government’s use of algorithms, such as the RoboNDIS campaign for a class action to hold the government to account.
Mark Toomey, a global IT governance expert, started the campaign as a Facebook group in 2022. Today it has more than 700 members. His 44-year-old son Geoff Toomey suffered a brain haemorrhage in 2015 and now lives with a severe acquired brain injury.
Their journey with the NDIS has been unbelievably challenging from the start, Mr Toomey says. “The NDIS in reality, what it’s doing is causing more stress, more distress, and more waste of money than you can ever imagine.”
Marie Johnson was the head of the technology authority at the National Disability Insurance Agency, which runs the NDIS, when automation was introduced to support planning.
With increasing numbers of people engaging with the scheme and significant cuts made to its human resources, automation was introduced to manage volume, the global expert in digital governance says.
“Robodebt and RoboNDIS were created at the same time by the same people,” she says.
“These algorithms are used to automate processing or for administrative convenience, without any reference to the underlying lawfulness, ethics and risk of harm.”
“What Robodebt showed us is that often the very people affected by automated systems are living precarious lives with little resources or energy to spare to navigate or fight these systems.”
Kaili Metani is a mother of three; both her daughters have rare neurological disorders. “For many families, the NDIS has been a very complex scheme to negotiate,” she says.
“Every interaction I’ve had with the NDIS, especially in relation to [her younger daughter] Alia, has been negative and it’s just this constant battle.”
While people with different neurological conditions need different strategies and support, “they all get lumped into one category. What it means is that they fall into the too-hard basket,” she says.
For a long time, she just accepted this because she didn’t have the mental or emotional capacity to fight it, she says. “The solidarity of knowing other people have undergone difficult battles gave me the strength to fight.”
Both Robodebt and RoboNDIS were rolled out without proper safeguards; it’s no surprise “these systems can feel alienating, harsh, and dehumanising,” Dr van Toorn says.
“For many, interactions with the NDIA have become experiences of trauma rather than support.
“That’s not to say the intentions aren’t good … but the systems are not operating in ways that are informed by the lived experiences of the people using these services.”
There is very little recourse for people to challenge the decisions made, she says.
“People can seek a review of their plan if they feel it’s inadequate, but they can’t challenge the way their needs are datafied, how information about them and their disability is collected and used to inform their support plan.”
It’s hard for all of us to fully comprehend the ways our lives are shaped by automation and AI, says Dr van Toorn. “These systems operate largely out of sight, and public understanding of these technologies remains limited.”
“People often assume that algorithms and AI are simply neutral tools when in fact technologies are always embedded with societal values.”
“What I’m interested in is what people with disability envision for their future and what role, if any, technology plays in those imagined futures.”
Automation could be used to collect data on successful interventions and support, and to share this more broadly, Ms Zarmairian says. But to do this we need to be collecting the right data.
For some, technology is a “liberator”, Ms Johnson says. “It does have a role in the NDIS, but ethics and co-design are fundamental to this, and both are missing in action.”
Ms McGrath believes “there is definitely space for technology in the future but the technology I’m referring to is assistive technology”.
Assistive technologies can help improve the quality of life for people living with disability, supporting greater independence and opportunities for learning, communication and participation in line with individual goals.
But personalisation remains key. “I’m not a barcode. I’m not a number. I’m an individual with goals and thoughts and feelings,” Ms McGrath says. “At the end of the day, we are all just one bad day away from a disability and I honestly wish that people understood that more.”