Written by Dawn Lo

It's already happening – but there is a big difference between predicting judgement and exercising judgement, and it's clear which one AI is better at.

The Federal Court of Australia ruled this year that an “inventor can be non-human”, meaning that an AI system can be named as the inventor on a patent application. This landmark decision came shortly after South Africa became the first country to grant a patent naming an AI as the inventor in July 2021.

Unlike the UK and the US, which refused to recognise AI systems as inventors, Australia appears comparatively open to AI. So, does this mean that AI can play a bigger role in the courtroom? Are we ready for an AI judge?

Countries such as Estonia have already established an AI judge in a move to streamline government services and clear a backlog of cases for judges.

“The Estonian government used an AI judge to adjudicate small claim disputes such as contract claims under €7,000,” says Professor Michael Legg, who has a long history of research in the impact of technology on litigation and dispute resolution.

Applying an AI system to process small claims is efficient because such claims do not involve an exercise of discretion.

Similarly, in Canada, AI has been used in some areas of the law such as strata property disputes and motor vehicle claims below a certain amount. In British Columbia, the Civil Resolution Tribunal (CRT) uses a form of AI called an ‘expert system’.

“The CRT uses the ‘Solution Explorer’ which is an expert system or advanced decision tree that guides a user through the elements of a claim. It helps a person assess whether they have a claim, brings together the information needed to make a claim, and facilitates online communications to try and resolve the claim,” Prof. Legg says.
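To give a sense of what such an expert system looks like in practice, here is a minimal, hypothetical sketch in Python. It is not the CRT's actual Solution Explorer: the questions, dollar threshold and outcomes below are invented purely for illustration.

```python
# Minimal expert-system sketch: a decision tree that guides a user through the
# elements of a claim. All questions, thresholds and outcomes are hypothetical.

# Each node is either a yes/no question with two branches, or a terminal outcome.
CLAIM_TREE = {
    "question": "Is the dispute about money owed under a contract?",
    "yes": {
        "question": "Is the amount claimed $5,000 or less?",
        "yes": {"outcome": "Eligible: start an online small claims application."},
        "no": {"outcome": "Not eligible for this process: consider legal advice."},
    },
    "no": {"outcome": "Out of scope for this tool: seek general legal information."},
}


def triage(node: dict, answers: list[bool]) -> str:
    """Walk the tree with a sequence of yes/no answers and return the outcome text."""
    for answer in answers:
        if "outcome" in node:
            break
        node = node["yes"] if answer else node["no"]
    return node.get("outcome", "More information needed: " + node["question"])


if __name__ == "__main__":
    # A user with a $3,200 contract debt answers "yes" to both questions.
    print(triage(CLAIM_TREE, [True, True]))
```

The real system is far richer, but the shape is the same: each answer narrows the claim down until the user is told whether, and how, they can proceed.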

“Another form of AI called machine learning can also employ algorithms to review large numbers of judgements or other data to find relationships, such as that a particular injury receives compensation in a certain dollar range, or that in divorce proceedings property is divided in a certain manner.”
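As a rough illustration of the kind of pattern such a system surfaces, the sketch below groups hypothetical past awards by injury type and reports the typical dollar range. The figures are made up; a real system would be trained on large volumes of actual judgments.

```python
# Illustrative only: find the typical compensation range for an injury type from
# (entirely made-up) past awards, the kind of relationship Prof. Legg describes.
from statistics import quantiles

# (injury type, compensation awarded) pairs standing in for historical judgments.
PAST_AWARDS = [
    ("whiplash", 8_500), ("whiplash", 11_200), ("whiplash", 9_800), ("whiplash", 12_400),
    ("fracture", 24_000), ("fracture", 31_500), ("fracture", 27_800), ("fracture", 29_300),
]


def award_range(injury: str) -> tuple[float, float]:
    """Return the interquartile range of past awards for the given injury type."""
    amounts = [amount for kind, amount in PAST_AWARDS if kind == injury]
    q1, _, q3 = quantiles(amounts, n=4)  # 25th, 50th and 75th percentiles
    return q1, q3


if __name__ == "__main__":
    low, high = award_range("whiplash")
    print(f"Whiplash awards typically fall between ${low:,.0f} and ${high:,.0f}")
```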

But despite the increasing use of technology in legal matters, there is still no AI judge operating in Australia.

Can AI be a fairer judge and avoid human biases?

“There are arguments on both sides of the ledger in that AI can overcome or avoid human biases by taking a data-driven approach. But also, that AI itself can produce biased or inappropriate outcomes or outputs, for a host of different reasons,” says Dr Felicity Bell, Research Fellow for the Law Society of NSW's Future of Law and Innovation in the Profession (FLIP) research stream.

“The main thing is that people – lawyers in particular – need to be aware of the limitations and biases of both AI and human decision-making," Dr Felicity Bell says. Image: Shutterstock


One way of overcoming human biases is through predictive analytics.

Prof. Legg says a key benefit of predictive analytics is that it can harness enormous amounts of computing power and data to find correlations that the human brain cannot.

“However, if data is incomplete or biased then the prediction can be inaccurate,” he says.

“While a judge may also have incomplete information and unknown biases, protections – such as open justice including providing reasons for a decision and procedural fairness – allow for transparency in decision-making and therefore, challenge or critique. The human judge can also seek to weigh the incommensurables and exercise compassion.”

What are the risks of AI taking on a bigger role in court?

Legal experts say the main concern with AI playing a more central role in court is that it impinges on fundamental requirements of justice such as open justice, procedural fairness, and impartiality.

“A black box form of AI should never be permitted in the court system because it fails to comply with those core concerns. There is a debate about ethics in AI. But for the justice system, there is no debate,” Prof. Legg says.

“If the AI cannot be tested to make sure it operates in a fair and transparent manner then it cannot be used. No ifs or buts trying to argue for efficiency or cost saving. Reducing cost and delay is important but not at the expense of the core requirements of justice.”

Dr Bell says the problem is less about the use of AI in court than about its use in pre-court or pre-action situations.

“If people must go through some type of AI screening to triage their case prior to coming to court, maybe AI would evaluate their chance of success. Is this an appropriate way of managing court workloads or is it impermissibly cutting people off from accessing justice? Decisions about systems like this will have to be made.”

Can we expect to see an AI judge in the near future?

Small claims matters are obviously only one part of the court process, so the need for human judges remains for more complex cases, says Dr Bell.

“For matters that involve some element of judicial discretion, automation will remain difficult. Judges also remain subject to a wide range of accountability mechanisms which AI lacks.”

It is currently very challenging to automate most litigation, as AI does not have the emotional intelligence to determine what is reasonable and what is misleading.

Another issue is that current legislation is written in English, but some legislation could be drafted in such a way that it can be processed by computers, says Professor Lyria Bennett Moses, Director of the Allens Hub for Technology, Law and Innovation.

“It is much easier for AI systems to give answers to legal questions where the laws are written in a language that computers can understand. If we start with rules written in computer code, then they can be executed by a computer automatically.”
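As a hypothetical sketch of what rules written in computer code might look like, the example below expresses a small-claims eligibility rule directly as executable logic rather than English prose. No real statute is encoded here; the conditions are invented for illustration, and the threshold simply echoes the €7,000 small claims ceiling mentioned above.

```python
# Hypothetical "rules as code" sketch: a small-claims eligibility rule written as
# executable logic. The conditions are invented; no real legislation is encoded.
from dataclasses import dataclass

SMALL_CLAIM_LIMIT = 7_000  # echoes the 7,000 small claims ceiling mentioned above


@dataclass
class Claim:
    amount: float            # value of the claim
    is_contract_dispute: bool
    filed_within_time: bool


def is_small_claim(claim: Claim) -> bool:
    """Apply the encoded rule: every condition must hold for the fast-track process."""
    return (
        claim.amount <= SMALL_CLAIM_LIMIT
        and claim.is_contract_dispute
        and claim.filed_within_time
    )


if __name__ == "__main__":
    claim = Claim(amount=4_500, is_contract_dispute=True, filed_within_time=True)
    print(is_small_claim(claim))  # True: the rule can be applied automatically
```

Because the rule already exists as code, a computer can apply it directly, which is the point Prof. Bennett Moses makes about starting with rules written in a machine-readable form.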

But while AI is not yet viable as a judge, technology can still be used to streamline court processes to increase efficiency and reduce costs.

AI can assist judges by providing a triage function and supporting decision-making. Photo: Shutterstock.


“AI can provide guidance in assessing damages, incorporating predictive models to estimate the costs associated with an injury or the value of a lost opportunity, for example,” Prof. Bennett Moses says.

“Technology can also assist a party in completing court forms. It can suggest negotiation or mediation. It may even evaluate options for resolution or suggest options. The aim of the technology is to assist parties to resolve disputes themselves and reduce the cases that need a judge,” Prof Legg says.

“As AI becomes more advanced, and assuming a business case (someone will pay for the product), we may see AI screening motions and suggesting an outcome or providing advanced legal research that can identify the most similar precedent.”

But it is important to note that any future AI judge will need strict limits around it, Prof. Bennett Moses says.

“At the end of the day, there is a massive difference between judgement and prediction. What AI systems can do is predict how judges might act. They can predict outcomes, but they can't exercise judgement.

“What people want when they go to court and appear before a judge, is they want judgement exercised in their case. They don't want to be given an answer that is automatically based on other similar cases. That's not what they're looking for.”

AI can help with the admin and it can support judicial decision-making, but ultimately no known AI techniques can replace a judge, Prof. Bennett Moses says.


Prof. Legg and Dr Bell are launching a new book about AI and the legal profession on 1 November 2021. They are also speaking at the Law Society’s FLIP conference on 13 October 2021. Prof. Bennett Moses will be presenting the limitations of AI in the legal system at The Australian Academy of Science on 7 October 2021.

Image caption (top): Applying an AI system to process small claims is efficient as they do not involve an exercise of discretion. Image: Shutterstock.