While the proposed social media delay aims to protect children, the panel made clear that the issue is far more complex. From AI risks and privacy concerns to mental health, digital footprints and youth activism, the conversation at UNSW Law & Justice’s Legal Hour underscored the need for nuanced, informed and child-centred policy.

At a recent Legal Hour panel hosted by UNSW Law & Justice, leading experts in law, policing and child protection gathered to discuss the federal government’s proposed legislation to delay children’s access to social media. The event, titled “Playground or Hunting Ground: Kids on Social Media,” explored the multifaceted implications of the proposed ban, including emerging threats from artificial intelligence, privacy concerns, addictive design and broader societal issues.

Professor Michael Salter, Director of Childlight UNSW, opened the session by framing the issue as one of the defining questions of our time. “We have a generation of people growing up who are digital natives who have been online and in a world that's online since they were born. I think the social media ban has really brought to light a whole set of questions both about the benefits of social media and the internet and its possibilities and also its potential harms.”

Offenders will adapt quickly

Venessa Ninovic, Senior Intelligence Analyst with NSW Police, warned that offenders are highly adaptive. “A lot of cyber criminals start off their engagement on social media and straight away divert that conversation to encrypted platforms. Why is that? Because it's a lot harder for law enforcement to detect them and their activity, and it's difficult for their actions to be stopped on encrypted messaging apps.”

She predicted that a ban would push offenders toward encrypted platforms, phishing via email and even AI-driven deepfake audio. “Emails are still accessible by children. So, sending a phishing link, sending a video or making a request to a child via email while pretending to be their schoolteacher – they get really creative, and there are other avenues that they will use.”

Mental health and domestic violence concerns

Ninovic also raised concerns about unintended consequences. “New South Wales Police Force data shows that over the last five years, there's been a 42% increase in mental health incidents involving children under 18. And I feel like with this ban, when you add addiction, the peer pressure to be on it, and puberty, that may spike.”

She highlighted a troubling rise in child-to-parent domestic violence, often triggered by parents taking away phones. “It brings a lot of shame and guilt for parents to call the police. At what point do you call the police and have this matter kind of addressed?”

Legal and developmental complexities

Carolyn Jones, Principal Solicitor (Harm Practice) at Youth Law Australia, spoke about the legal challenges children face when engaging in online behaviour. “Children are sending nudes to somebody who's of a similar age when both are over the age of consent and it's something they want to do. The law says that they can't do it and so then they're coming into contact with laws and processes.”

Jones emphasised the importance of Youth Law Australia’s early, anonymous legal support. “Our live chat service lets kids ask questions without fear. Many are both experiencing and using violence, especially when bullying and group chats are involved.”

Chatbots and deepfakes

Both Jones and Ninovic highlighted the emerging role of AI in online harm. “We’re seeing kids use generative AI to create nudes, sometimes of themselves, sometimes of others,” Jones said. “And they’re having sexualised conversations with bots that pretend to be adults. The law hasn’t caught up.”

Deepfakes are also becoming a tool for cyberbullying. “Adults struggle to identify deepfakes—how can a 10-year-old?” Ninovic asked. “It skews their perception of reality and can lead to serious emotional harm.”

AI chatbots pose another risk. “Children may turn to chatbots instead of parents for support,” Ninovic warned. “These bots agree with everything, never challenge them and never ground them. That’s dangerous.”

‘Legislative whac-a-mole’

Associate Professor Katharine Kemp addressed the growing market for age verification technologies. “These systems may use biometric data, facial recognition and content analysis. That affects not just children’s privacy, but everyone’s.”

She also noted the risk of inaccurate age estimation and data misuse. “Even with legislation, companies often fall foul of data protection. Sometimes it’s error, sometimes it’s deliberate.”

Kemp emphasised the need for balance. “Privacy doesn’t trump all. But we need to ask whether the harms from this ban are out of proportion to the benefits. By rushing the ban, the government ignored warnings from psychologists, suicide prevention networks, privacy experts and children.”

She also warned we may see a spike in workarounds to dodge restrictions. “It's relevant that recently in the UK, where we started to see age verification come in, we saw this big spike in downloads of VPNs – virtual private networks – of 1,800% suddenly in response to that,” she said. “Then you are in danger of this legislative whac-a-mole, where you're constantly having to move your target as the same harm sprouts up somewhere else.”

Content regret and deepfakes

Children are increasingly aware of how past posts can affect their future. “We’ve had kids ask for help removing content they posted at 13 that now, at 16, they realise was offensive,” Jones said. “It doesn’t meet the threshold for eSafety intervention, but it matters to them.”

Ninovic added that it only takes five good images to create high-quality deepfakes. “When you have people like Mumfluencers or just parents uploading photos of their children, uploading photos of them in their school uniform, at the soccer oval every Sunday, in front of their home, predators and criminals can piece all this information together. And I think it's important for parents to set an example for children as to what being safe online is.”

Youth activism and missed voices

The panel emphasised that social media is also a tool for youth activism. “Kids have used it to organise protests and movements and have their generation’s voice heard,” Salter said. 

Jones agreed. “They could use social media in many ways, but that's one of the advantageous ways that they would miss out on, and that as a society we also miss out on with their voices absent.”
