Aussie Kids' Data at Risk via Approved School Apps

A UNSW-led audit of nearly 200 school-endorsed apps has found most begin harvesting children's data within seconds - often contradicting their own privacy policies and exposing gaps in oversight by education systems, app developers and regulators.

Educational apps recommended by Australian state education departments are exposing children to significant privacy and security risks - collecting sensitive data, transmitting it to third parties, and hiding behind privacy policies so complex that almost no parent can understand them.

The findings come from one of the most comprehensive audits of Australian educational apps, analysing approximately 200 Android apps sourced from school recommendation lists, state Department of Education websites, and the Google Play Store.

Funded by the UNSW Australian Human Rights Institute, the research describes an ecosystem built on institutional trust that is largely unwarranted.

"Our main purpose is that Australia is moving towards digital education, including from kindergarten," said researcher Rahat Masood, a cyber security expert at UNSW.

Dr Masood collaborated with fellow UNSW researchers Sicheng Jin, Jung-Sook Lee and Hye-Young (Helen) Paik.

"We want to analyse whether Australia, the federal government and education departments are well aware of the security and privacy risks."

Only one in four apps did what their privacy policies said they would. Photo: Adobe Stock

Data collection begins before a child touches the screen

The research found that 89.3 per cent of apps began transmitting data to third parties before a user had interacted with the app at all. Opening an app was enough to send device identifiers, location metadata, and other sensitive information to analytics platforms and advertising networks.

"Even if you are not interacting with the app - you just open it and that's it - it is still transferring lots of data," Dr Masood said.

"Telemetry data mainly refers to tracker-related identifiers used for the automatic collection and transmission of data to remote servers. Despite just opening the app and not using any educational feature, it is still transferring a lot of information that is sensitive and can actually identify your device."

In total, 83.6 per cent of apps were found to transmit persistent identifiers - unique codes that can track a device across sessions and across different apps. A further 67.9 per cent of apps contained at least one embedded tracker or analytics tool, such as Firebase, Facebook SDK, or Unity Analytics. These tools serve no educational purpose.

"None of these are needed to actually run the educational apps," Dr Masood said.
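The mechanics behind that concern can be sketched in a few lines. This is an illustrative model only - the app names, field names and payload shape are invented, not taken from the study - but it shows why a persistent identifier sent on launch lets a third-party tracker link separate sessions, and separate apps, back to one device:

```python
# Hypothetical sketch: a persistent device identifier reused across
# launches and across apps. All names here are illustrative.
import uuid

DEVICE_ID = str(uuid.uuid4())  # generated once, then reused every launch

def launch_payload(app_name: str) -> dict:
    # Sent as soon as the app opens, before any user interaction -
    # the "idle telemetry" pattern the study describes.
    return {"app": app_name, "device_id": DEVICE_ID}

# A tracker embedded in two different apps receives:
a = launch_payload("KidsMathsFun")
b = launch_payload("ABCReadingGame")

# Because the identifier is identical, the two sessions are linkable
# to the same device without the child ever tapping anything.
print(a["device_id"] == b["device_id"])  # True
```

The linkage requires no name, email or login - the stable identifier alone is enough for an analytics platform to build a cross-app profile.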

Privacy policies no one can read - or that simply aren't true

The research also found problems with how apps communicate their data practices to parents.

Only 3 per cent of the privacy policies analysed were written at a level considered "fairly easy" to read. The remaining 97 per cent required university-level literacy or higher to comprehend.

"Nobody will understand these terminologies and jargon," she said.

"Comprehension, readability, understandability - all these metrics that we analysed were all very bad."

When a parent does wade through the legal text, it often doesn't reflect what the app actually does. The study found only about one in four apps were fully consistent between their stated privacy policy and their observed behaviour during testing.

Many apps that explicitly claimed they did not collect personal data were, in practice, transmitting identifiers within seconds of being launched.

"We matched the privacy policy with the dynamic analysis - when the app is running, whether it is collecting the data and whether it is mentioned in the privacy policy or not," Dr Masood said.

"Only one in four were matching. Some of the policies appear to have been generated using AI tools."

Specific cases illustrate the gap. One app listed in its store description as "Data Not Collected" was observed initialising Firebase analytics and transmitting persistent identifiers from the moment it first launched.

Another app stating "no ads, no tracking" was found to be sending data to Unity Analytics and Google before a user had done anything.


Apps branded for children are no safer

Apps marketed to young children - with names or descriptions containing words such as "Kids," "Preschool," or "ABC" - were no safer than general-audience apps, and in some cases showed worse alignment between their stated privacy commitments and actual behaviour.

The research paper described this as "the illusion of safety" - child-centric branding cultivates parental trust without providing genuine protection.

Seventy-six per cent of apps targeted at children showed at least one form of policy distortion, compared with 67 per cent of general educational titles.

The researchers found apps carrying child-friendly names often embedded the same advertising and analytics tools found in commercial entertainment apps - in some cases, the same tools used to track adults across the internet.

Beyond privacy, the research identified significant security concerns.

Almost 80 per cent of apps contained "hard-coded secrets" - API (application programming interface) keys and credentials embedded directly in the app's code in a way that could be accessed by anyone who decompiled the application.

"Hard-coded secrets mean that if you configure an API, you have a password or passphrase and the API key is hard-coded within the code," Dr Masood said.

"Anyone can access it and do whatever they want with the API. It is not a good practice from a development point of view."
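Why this matters can be shown with a small, purely illustrative sketch (the key below is a made-up placeholder, not from any audited app). A secret written as a literal in source code survives compilation as a plain string constant, so an attacker never needs to run the app - they only need to read the compiled artifact, much as a decompiler reads strings out of an Android APK:

```python
def call_api():
    # BAD: the secret is baked into the shipped code.
    # This key is a fabricated placeholder for illustration.
    secret = "sk_live_EXAMPLE_NOT_REAL"
    return f"Authorization: Bearer {secret}"

# An "attacker" never calls the function - they simply inspect the
# compiled constants, analogous to decompiling an app binary.
leaked = [c for c in call_api.__code__.co_consts
          if isinstance(c, str) and c.startswith("sk_")]
print(leaked)  # ['sk_live_EXAMPLE_NOT_REAL']
```

The standard remedy is to keep secrets out of the shipped binary entirely - for example, by exchanging them at runtime with a backend the developer controls, or using the platform's secure credential storage.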

No one is watching the watchers

The research raises concerns about the adequacy of oversight by the bodies that parents and schools trust to vet these products.

Each state's Department of Education maintains a recommended list of apps that schools draw on when making purchasing decisions. Dr Masood said schools are told these apps have been assessed through a quality assurance framework. But the research suggests that assessment falls well short of what is needed.

"They look at very high-level details and they don't download the app - they don't do the dynamic analysis, they don't go through the accessibility and readability of the privacy policies," Dr Masood said.

She said teachers are largely unaware of the risks embedded in the tools they use daily.

"Teachers don't know anything," Dr Masood said. "They are out of resources - first of all - and they don't know about any security issues. They were just given an app to use and that's it."

Parents also delegate their trust to schools and app stores, assuming that if an app has been approved, it is safe.

What needs to change?

Dr Masood and her colleagues are developing a practical tool for parents - a "traffic light" system that would allow them to enter the name of an app and receive a clear, visual summary of its privacy and security profile, without needing to read a word of legal text.
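A system like that might work along the following lines. To be clear, this is a speculative sketch with invented thresholds - the article does not describe the researchers' actual criteria, and every rule below is a placeholder:

```python
# Hypothetical "traffic light" rating for an app's privacy profile.
# The inputs and thresholds are illustrative assumptions, not the
# researchers' methodology.
def traffic_light(trackers: int, idle_telemetry: bool,
                  policy_matches_behaviour: bool) -> str:
    if trackers == 0 and not idle_telemetry and policy_matches_behaviour:
        return "green"   # no trackers, no pre-interaction data, honest policy
    if trackers <= 2 and policy_matches_behaviour:
        return "amber"   # limited tracking, but the policy discloses it
    return "red"         # heavy tracking or a policy that misstates behaviour

print(traffic_light(trackers=0, idle_telemetry=False,
                    policy_matches_behaviour=True))   # green
print(traffic_light(trackers=5, idle_telemetry=True,
                    policy_matches_behaviour=False))  # red
```

The appeal of the format is exactly what the researchers describe: a parent gets one of three colours instead of a policy that demands university-level literacy.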

But beyond that, the researchers argue that systemic change is needed - from developers, from app stores, and from government.

They are calling for stricter oversight of the "child-directed" app category, arguing that labels such as "Kids" or "Educational" should require a verified technical baseline, not just a content descriptor.

The research also calls for regulators to prohibit "idle telemetry" - the practice of transmitting data before a user has done anything - and to mandate privacy policies be written in plain language accessible to the average parent.

The study, "Analysing Privacy Risks in Children's Educational Apps in Australia", was conducted by Sicheng Jin, Rahat Masood, Jung-Sook Lee, and Hye-Young (Helen) Paik at the University of New South Wales, with funding from the UNSW Australian Human Rights Institute. It was presented at the Symposium on Usable Security and Privacy (USEC) in February 2026.

