Facebook’s move to restrict advertising targeted at underage users, after Australian researchers discovered the social media giant profiling teens based on age-inappropriate interests, underscores the need for external oversight and regulation.
Reset Australia revealed in April that Facebook was profiling underage users based on their interest in smoking, gambling, extreme weight loss, and alcohol, then selling access to those profiles to advertisers for direct, targeted advertising. The research has since been replicated internationally.
Overnight, Facebook, which also owns Instagram, announced a suite of new safety measures, including barring advertisers from targeting users aged under 18 on anything other than their age, location, and gender.
“Facebook was making money from allowing advertisers to target teenagers based on age-inappropriate interests, such as alcohol, gambling, extreme weight loss, and smoking,” said Chris Cooper, executive director at Reset Australia, the local affiliate of a global initiative working to counter digital threats to democracy across the world.
“Reset Australia’s research found Facebook’s own system profiled young people and also approved a series of dubious ads, which included targeting teenagers with cocktail recipes, gambling games, vaping, and extreme weight loss content.
“Facebook isn’t saying it will stop profiling kids based on dubious interests, just that it will not let advertisers target them based on them. There is no commitment Facebook itself won’t keep using this profiling for its own purposes.
“This just underscores the need for meaningful public oversight of how these platforms collect and use young people’s data. Big tech needs regulation so that it operates in a way that meets public standards; we shouldn’t keep letting it make its own rules.”
Mr Cooper said that while Facebook appeared to be acting proactively on children’s safety online, by introducing the advertising restrictions, making Instagram accounts private for those under 16, and cracking down on unwanted contact, in reality much of this was a response to incoming international legislation. The UK’s Age Appropriate Design Code and Ireland’s Fundamentals for a Child-Oriented Approach to Data Processing, both of which demand stricter controls on how young people are treated online, come into effect this year.
“In some jurisdictions Instagram profiles will be automatically private until the user is 18, while in others it’s only until they’re 16. What’s the difference between an 18-year-old in the UK and a 16-year-old in Australia that singles them out for such different treatment?
“Facebook isn’t being consistent with these changes; rather, it is picking and choosing which children it will and won’t protect.”
Ireland’s Fundamentals for a Child-Oriented Approach to Data Processing also calls for an end to profiling children for advertising, full stop.
“This means that Irish children won’t be targeted for ads based on their gender, but Australian kids will.”
Reset Australia is advocating for Australia to introduce a similar data code for children as part of the federal government’s privacy review. If adopted, this would compel social media giants and technology companies to design their services with the best interests of children in mind. It would give young people greater control over their data, limit targeted commercial advertising, and ensure that Australia’s children are afforded the same, if not better, protections.
“We shouldn’t rely on Facebook to self-regulate or on other countries to dictate standards – Australia needs a regulatory code governing how children and young people’s data is collected and used.”
For more on a data code for children, see: https://au.reset.tech/campaigns/a-data-code-for-children/