When parents separate in Australia, their futures - and their children's - often rely on the words chosen by judges in the Family Court. But those words aren't always neutral along gender lines, says a team of UNSW researchers.
The language chosen by judges in the Australian family courts reflects and reinforces long-standing gender biases, according to a recent study.
A team of UNSW Sydney data scientists and legal experts used artificial intelligence (AI) to analyse more than 2500 custody and parenting judgments across 4330 documents, handed down in the Australian family courts between 2001 and 2021. They looked at the way male and female judges spoke to, and about, male and female parties in relation to their capacity to care for their children.
"We used AI to uncover the biases we knew existed in the legal system but remained 'hidden'," says the study's lead author Associate Professor Mehera San Roque from the UNSW School of Law, Society & Criminology.
The study applied a method called structural topic modelling - a statistical technique that groups words that tend to appear together into clusters of themes, or topics, and tracks how often those topics appear across documents.
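For readers curious about the mechanics, the sketch below shows the general idea using a plain topic model. It is a minimal illustration rather than the study's actual pipeline: it uses scikit-learn's LDA as a stand-in for structural topic modelling, and the file name, topic count and filtering thresholds are assumptions.

```python
# Minimal illustrative sketch (not the authors' pipeline): topic modelling of
# judgment text with scikit-learn's LDA as a stand-in for structural topic
# modelling. The corpus file and parameters below are assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical corpus: one judgment per line in a plain-text file.
with open("judgments.txt", encoding="utf-8") as f:
    documents = [line.strip() for line in f if line.strip()]

# Convert the judgments into a word-count matrix, dropping very common
# and very rare terms.
vectorizer = CountVectorizer(stop_words="english", max_df=0.95, min_df=5)
counts = vectorizer.fit_transform(documents)

# Fit a topic model that groups co-occurring words into themes.
lda = LatentDirichletAllocation(n_components=20, random_state=0)
lda.fit(counts)

# Print the top words for each discovered topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-10:][::-1]]
    print(f"Topic {k}: {', '.join(top)}")
```

Structural topic modelling extends this basic approach by letting document metadata - such as the gender of the judge or of the party being discussed - influence how prevalent each topic is, which is what allows comparisons of the kind the study draws.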
The results show the judgments reflect gender stereotypes - mothers were seen mainly as caregivers, while fathers were judged on their ability to provide financially. The trends suggest the gap around financial provision is narrowing, but the caregiving gap is not: fathers were praised for even limited involvement in childcare, whereas mothers' caregiving was simply expected.
A/Prof. San Roque says this pattern echoes decades of feminist legal scholarship, which has traditionally analysed the content of legal decisions case by case.
"AI helped us take this beyond a manual analysis of decision-making to identify trends at a system-wide scale," A/Prof. San Roque says.
"We found elements of stereotypical bias, with certain parties being talked about in different ways. For example, we see a conflation between financial capacity and actual capacity, especially when talking about fathers - and that replicates biases identified in earlier feminist work," she says.
"Another example - again, consistent with previous work - was when judges looked at whether a father has capacity to care for his child, he often received the benefit of his new partner's capacity. Where the court was satisfied the father could provide care for his children 'with the assistance of his present wife'."

Systems designed to repeat bias
The researchers focused on family court cases because they usually involve one male-identifying and one female-identifying parent - with both parents parties to the litigation.
This makes family law judgments a useful setting to study gender bias in this comparative way. Criminal cases, by contrast, involve the State, defendants and witnesses, rather than two competing parties.
The data reflected differences between the way presiding judges spoke. Female judges gave greater weight to the risks of harm to children and paid closer attention to the role of fathers in co-parenting. They also spoke more to financial matters than their male counterparts did.
Male judges, meanwhile, focused more on court procedures and processes, especially in relation to female parties. This included compliance with court orders, which was the most significant topic overall.
Across the two decades studied, male judges outnumbered female judges roughly two to one.
That imbalance, says A/Prof. San Roque, reflects the enduring legacy of women's exclusion from the profession. Women were barred from practising law in Australia until the early 20th century - and the effects linger in a legal profession that still leans 'masculinist'.
"The legal system is a precedent-based system, which means it's replicating biases because of its structure," she says. "In common law systems like Australia's, legal texts don't just reflect the law - they actively shape the frameworks for how future cases are decided."
The study notes that other groups - including social and racial minorities - also face a combination of formal and informal barriers to legal careers. While this has an impact on the diversity of the profession, no diversity data beyond gender is currently collected.
"So there's the ongoing question about judicial diversity and whether having a broad and more diverse bench that reflects social demographics makes a difference to decision making," A/Prof. San Roque says.
"What our study shows is that gender diversity does make a difference to how certain topics and ideas are thought of."
For coauthor Professor Yanan Fan, from the UNSW School of Mathematics & Statistics and UNSW Data Science Hub, the project also offers a reflection on the role of AI itself.
"We know there's bias in AI too," Prof. Fan says.
"A lot of AI systems just troll through existing documents and make their recommendations based on what they see," she says.
"And if the existing documents have bias - like you see in this study - then they'll be perpetuating that bias."
An example: ask an image-generating AI model for a picture of a nurse and it will typically give you a woman; ask for a picture of a doctor and it will typically give you a man.
"The perpetuation of bias with AI is similar to how bias is repeated within the legal system," Prof. Fan says.
Troubling trends
A/Prof. San Roque credits Prof. Fan's earlier work for inspiration on how AI can be harnessed to highlight system-wide biases.
Prof. Fan had looked at university student evaluations of their lecturers. Applying AI across a large dataset of student feedback forms, she uncovered patterns that staff had long been flagging anecdotally but which had not yet been demonstrated statistically.
"There were ways in which male lecturers were being evaluated when compared to female lecturers," Prof. Fan says.
"Female lecturers were talked about in terms of their approachability, how caring they were and how nice they were," she says. "Males were talked about in terms of how well they knew their subject and how authoritative they were."
A/Prof. San Roque says there are echoes of that study's findings when looking at the language used in the family courts.
"It's totally the same in terms of what gets valued for each gender," says A/Prof. San Roque.
She says there is more to interrogate from the 20-year dataset.
While this study focused primarily on the capacity for care, A/Prof. San Roque says the same tools could shed light on another critical issue in family law: violence.
She says early findings suggest a pattern in language that minimises domestic violence.
Working alongside linguists, A/Prof. San Roque has found that across many judgments, violence is spoken about in a way that obscures the perpetrator and softens the impact.
"There's a particular use of the passive voice when talking about violence," she says.
Although they've yet to run a large-scale analysis of violence-related decisions, A/Prof. San Roque says doing so could help expose how legal language shapes, and sometimes downplays, the experiences of vulnerable parties.
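As a rough illustration of the kind of linguistic signal A/Prof. San Roque describes, the sketch below flags passive-voice sentences with an off-the-shelf parser. It is not the researchers' method, and the example sentences are invented.

```python
# Rough illustration only (not the researchers' method): flagging passive-voice
# sentences in judgment text with spaCy. The example sentences are invented.
import spacy

# Assumes the small English model has been installed with:
#   python -m spacy download en_core_web_sm
nlp = spacy.load("en_core_web_sm")

def is_passive(sentence):
    """True if the parsed sentence contains a passive subject or passive auxiliary."""
    return any(tok.dep_ in ("nsubjpass", "auxpass") for tok in sentence)

text = (
    "The mother was subjected to repeated violence. "
    "The father assaulted the mother on two occasions."
)
for sent in nlp(text).sents:
    label = "passive" if is_passive(sent) else "active"
    print(f"{label}: {sent.text}")
```

In a passive construction such as the first sentence above, the person responsible for the violence can be omitted entirely - which is precisely the pattern the researchers say deserves closer, larger-scale scrutiny.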
"Because we can't manually read through thousands of judgments, AI lets us uncover system-wide patterns," A/Prof. San Roque says.
"Words matter. So it's important that we do pick up these trends."
This project was funded under the UNSW Science Social Good Seed Fund and by LexisNexis Australia, and was undertaken by Elma Akand, Yanan Fan, Wayne Wobcke, Scott A. Sisson and Mehera San Roque.