Children in Australia are being pervasively tracked when they go online without proper regard for their privacy or the risks of collecting their data, according to technology and privacy experts.

Driven by a multibillion-dollar children’s advertising market, private companies are harvesting personal information to build psychographic profiles and data sets — usually for use in targeted messaging — in an opaque online advertising industry.

In some jurisdictions, data protection laws and regulators have attempted to rein in the practice with new standards and large fines for breaches. But Australia, which critics say lacks robust privacy protections generally, is falling further behind because of an absence of enforceable protections for children’s privacy or effective avenues for recourse when breaches occur.

“Our research estimates that by the time a child reaches the age of 12, more than 72 million pieces of personal data have been collected about them by advertising technology designed for adults,” says Dylan Collins, CEO of SuperAwesome, a “kidtech” advertising company.

“This data is being captured on children who have no understanding about what they’re giving up or even that the practice is happening.” 

Collins tells Which-50 the data is typically being collected for advertising and product recommendations — an exchange most adults do not fully understand.

But it is also increasingly being used for automated decisions, according to Collins.

“That can have a life-changing impact. For example, [it could] affect a credit score, school admission or job eligibility.” 

Governments and regulators around the world appear to agree the risks easily outweigh most of the benefits children receive when they exchange their information for the use of online services like social media.


Early this year the UK privacy watchdog unveiled a strict new code of practice for online service providers like Google and Facebook. It requires them to provide a “built-in baseline of data protection” for younger users, including turning off tracking and targeted ads by default and allowing only the data collection essential to the services children are actively and knowingly engaged in.

The code builds on Europe’s existing data regulations (known as GDPR) and their specific provisions for kids, commonly known as GDPR-K, which already limit data collection on young people and otherwise require informed, unambiguous consent from parents.

The platform giants and digital advertising groups fought to water down the new UK code with limited success, arguing that stopping data collection could prevent them from delivering the services altogether.

But UK regulators were clear: online services must be designed and operated with the best interests of children in mind.

“In a generation from now, we will look back and find it astonishing that online services weren’t always designed with children in mind,” UK Information Commissioner Elizabeth Denham said in January.

“There is no doubt that change is needed. The code is an important and significant part of that change.”

There are similar specific restrictions on data collection on children in the US and China already, with comparable initiatives underway in India, Brazil, South Korea, and Argentina, according to Collins.

He says these laws have helped form a global standard for the explicit protection of children online. This includes preventing or minimising profiling, third-party tracking, and geolocation sharing, and requiring publishers to obtain verifiable, informed and explicit consent before any information is collected.

The Australian government is now reviewing its online safety legislation but is “looking for industry” to apply high default privacy settings in children’s online services. However, in its latest discussion paper, it notes that it is also considering new powers for the eSafety Commissioner to enforce the default privacy settings.

Further privacy reforms should also come from the government’s ongoing response to the digital platforms inquiry. But the proposed changes are subject to legislative reviews and inquiries, a process expected to take years.

“We believe Australia is falling behind world powers in the protection of kids’ data privacy,” Collins says.

“Whilst we applaud the government’s comprehensive proposals to improve the safety and wellbeing of children online, we believe the current Online Safety Act does not fully address harms occurring on more recently developed services and platforms amongst children under the age of 16 years.”

The kidtech CEO says the reforms should be widened to consider legally requiring service providers to act in the best interests of children in designing their services, implementing a higher standard of consent, and eliminating the “leakage” of personal data through better technology.   


Without such rules digital platforms and data firms may have little incentive to clean up the practice — the children’s advertising market is already worth $1.7 billion and growing 20 per cent each year as more kids come online, according to research from PwC.

And even when the rules are broken, the consequences for the biggest players are modest.

YouTube, for example, had to pay a $US170 million fine last year as part of its settlement with US regulators over the collection and use of children’s data in behavioural advertising — something explicitly prohibited under US law. 

Critics described the fine as “paltry” — YouTube pulls in over $US15 billion in ad revenue a year — and denounced the regulator’s redaction of figures showing how much YouTube made from the practice. 

David Vaile, Chair of the Australian Privacy Foundation, says companies that have built entire business models on data collection — like Google and fellow digital advertising giant Facebook — have shown little regard for users’ privacy, including children.


“The big data firms — the commercial surveillance businesses — want the more data the better. [They are] absolutely addicted to growth. They’re almost at the end of it because they’ve almost got everybody. So kids are sort of like the next frontier. 

“Their algorithms are so omnivorous or out of control that they don’t mind that [children] couldn’t buy a car or something like that because they might still have some influence, or maybe they get them in the future.”

Vaile argues the bargain struck between children and these companies — personal information in exchange for the use of online services or content — is fundamentally unfair because younger people aren’t able to identify potential consequences as easily. 

Indeed, Vaile says, even adults struggle to understand the harms and risks that can manifest from data collection. 

“One of the concerns with children is by the time they get to the stage where they can enter [agreements] as adults they’ve got these full psychographic profiling data sets probably held by two or three or four different entities about them.

“So they’re ripe for the manipulation and they have been already.”

Dr Joanne Orlando, a specialist in digital literacy at Western Sydney University, agrees few people understand the bargain they strike when they surrender personal information online — much less children.


“We all don’t really know the implications of the collection of so much data. We know we get more advertising in our feed and things like that. But that’s all we kind of see, we don’t really know what’s happening behind the scenes.”

Orlando tells Which-50 that even when it is clear how data is being used, the practice can be concerning.

“We’ve heard of things like [children] will be using an app and location data will be collected so when a child walks past that particular shop they’ll get some kind of notification saying 25 per cent off food or something in the shop. So that kind of nudge strategies to get them to buy or get them to respond in some way. 

“So it’s manipulation really, and we lose a bit of control over decision making.”

In 2017, leaked internal documents revealed just how far Facebook was taking the practice.

The social media giant was targeting and exploiting some of its youngest users, offering Australia’s biggest advertisers an ability to target teenagers who were feeling “worthless” and “insecure”.

Facebook claimed to know these vulnerable moments based on the data it had collected on users, according to reports.

When revealed by The Australian, Facebook apologised — but refused to say if the practice was in place in other markets. It also insisted the data was collected “consistent with applicable privacy and legal protections, including the removal of any personally identifiable information, our internal process sets a standard higher than required by law.”

Alarmingly, the social media giant may be right.


Australia’s legal and regulatory framework for privacy is notoriously weak, relying on implied consent, vague definitions of personal information, and lacking effective recourse avenues when privacy is breached, according to Vaile.

“We don’t have a right to sue for breach of privacy,” he says.

“In Australia, unlike in most other [comparable] countries including the US, trying to sue for breach of privacy just goes nowhere because it doesn’t exist. It’s been recommended in five separate Law Reform Commission reports over the last 30 years.”

Vaile says establishing that legal right would be the simplest and most effective way to address many of the problems with online privacy. He would also welcome more obligations on digital platforms to protect privacy, as recommended by the consumer regulator, along with stronger consent requirements.

However, he notes that any reforms to protect children must be well considered, as they are vulnerable to manipulation and overreach by governments.

Dr Orlando, who also runs TechClever, an organisation that helps parents, teachers, and schools navigate technology, adds that regulation is only part of the solution, and all stakeholders need to work together on an approach that can be adapted as technology changes.

“If it’s the companies, the people regulating this or the parents, I think the idea is that it’s not a set-and-forget kind of approach. 

“The online space changes so quickly that whatever strategies we put into place — whether it’s strategies in their home with their family [or] what the government’s doing or that kind of thing — it needs to be ongoing and regularly revisited: modified, changed or completely revamped because that is just the space.

“If we want to be protective in a meaningful way, a real way, that is just the state of play with the online world.”

Collins’ own company suggests there may also be a market for an alternative to the highly targeted advertising offered by the digital giants.

SuperAwesome offers kids brands a safe, compliant digital ecosystem that works in a “zero data” internet. The company has raised $US58 million, including backing from Microsoft, and claims to have most major children’s brands on board.

“There is simply no need to capture personal data on children,” Collins says.

“The kidtech sector has emerged as a new category of technology built specifically with privacy and responsibility in mind, enabling completely safe engagement with children. Kids should be able to enjoy games and videos without their personal data being harvested and shared with hundreds of companies.”
