
JULY–SEPT 2025 – AQ: AUSTRALIAN QUARTERLY

Hands on the Wheel: Social Media and Youth Mental Health

By: David G Baker and Dr Louise La Sala

From December, young people in Australia will have to be 16 years or older to use some social media platforms. This world-first legislation has been the subject of wide-ranging debate in parliament and the media, with public interest and advocacy from youth organisations.

There will be challenges in implementing and enforcing the legislated social media age restrictions. Yet beyond this technical challenge there remains the far greater – and largely ignored – issue of preparing young people, both before they turn 16, and after, to safely use social media as they navigate a largely unregulated environment.

Much of the conversation lately has focused on when and how young people should be permitted to access social media, often fueled by correlational data showing that heavy users of social media tend to report poorer mental health outcomes. However, decades of research have shown that several complex factors contribute to rising rates of mental ill-health, self-harm, and suicide for young people. Blaming social media alone is far too simplistic. To best support youth mental health, the environments in which young people communicate and learn need to be age-appropriate and safe.

For this reason, it is online safety that must be at the centre of policy responses to social media access and the content to which young people are exposed, not just one-size-fits-all age restrictions.

Young people’s use of social media is an emotionally charged issue. So, it can be helpful to compare it with another area of public safety subject to regulation of both users and technology providers – road safety.

In 2024, there were 1,306 road deaths in Australia. Transport accidents were the third-leading cause of injury hospitalizations (2022–23) and the fourth-leading cause of injury deaths (2021–22). Young people aged 15–24 years are consistently the group most likely to be hospitalized for a transport-related injury and the second-most likely to die (per capita).

In comparison, young adults (18–24 years) are the second-largest group (per capita) for mental health-related emergency department presentations; and death by suicide is the leading cause of death among 15–24-year-olds. Based on the difficult reality of these data, road safety, mental health, and suicide are comparable public health issues for young people, yet each has received a very different policy and regulatory response.

The Australian government has determined that 16 years of age is when a young person can access social media. This is the same age at which a young person can obtain a learner driving permit via a test in most parts of Australia. There are guidebooks and online courses to support young people to prepare for the test. And once they have their learner permit, they need to log up to 120 hours of supervised driving (depending on the jurisdiction), often with a parent or trained professional, to gain experience.

Then, when they graduate to driving unaccompanied, there is a probationary period as they gain further, unsupervised experience. Probationary licenses can include limits on the age or number of passengers they can carry and the power of the vehicle they drive.

The impending age restrictions on social media would see young people of the same age going from no access to social media platforms to unsupervised, unrestricted access to largely unregulated content overnight. This is the equivalent of simply giving them the keys to the family car on their 16th birthday and hoping for the best.

Under the banner of free speech, a worrying trend has emerged across many popular social media platforms: content moderation teams and functions are being reduced or removed entirely. This means that all users could be exposed to a range of potentially harmful content, with fewer reporting mechanisms and fewer built-in supports or controls available.

To transition from no access to unrestricted access, young people would benefit from a similarly experience-based and supported entry into social media. This preparation could include digital literacy training regarding safe online interactions, social media algorithms and content recommender systems, managing online connections, safety and privacy, and how to get support.

However, before any driver picks up the keys, the vehicle they drive has also undergone rigorous testing and been internationally certified as safe and fit-for-purpose. Even second-hand cars need a roadworthy certificate. In addition to regulating the technology, there are also a range of road laws to minimize the risk of injury or death. We’re not yet doing this with social media platforms and digital tools.

Few people today would buy a car without airbags or advanced braking, yet we routinely download social media apps without knowing what protections are in place for our data or our young people. Within a matter of minutes you can download an app, open an account, and have an endless stream of content on your phone. The Australian eSafety Commissioner regularly points out that it is time big tech had its “seatbelt moment”.

Vehicle safety is tested in Australia through the independent Australasian New Car Assessment Program. There are four key assessment areas: adult occupant protection, child occupant protection, vulnerable road user protection, and safety assist technology. Testing criteria are revised, and safety requirements raised, every three years. The 5-star rating helps people compare the safety of vehicles when buying a car. With these standards checked and approved, people are empowered to make informed choices.

There is no equivalent, independent safety system for social media platforms, limited guidance on what constitutes safe content, and no evidence-based mechanisms for responding to harmful content and protecting users. Indeed, young people are often exposed to graphic and distressing content. At times, they actively seek out this content, but more often it appears on their newsfeeds unsolicited. Reports suggest young people are exposed to mental health-related information every 39 seconds, content about suicide every 2–3 minutes, and information about disordered eating every 8 minutes. When harmful content is reported, social media companies are, at best, slow to remove it from the platform, and more often leave it online. This inaction towards content that users find distressing means that young people do not trust the companies to keep them safe.

Further amplifying this issue are tensions between parents and their children when it comes to social media. Parents in Australia almost unanimously believe that the internet and social media expose young people to danger or risk and that technology has made parenting more difficult now than it was in previous decades. This, coupled with the fact that young people are less likely to turn to a parent for help if confronted with a negative online experience, suggests that neither parents nor young people are equipped to have meaningful conversations about digital literacy and online safety.

Considering the risks and potential harms experienced by young people, a star rating system would provide a basic but accessible safety guide. Within tech, this is often referred to as implementing Safety by Design principles – companies taking responsibility for preventing and addressing harms, users being given tools to manage their own safety, and clear procedures for how safety is managed and evaluated. Similar criteria to those used for vehicle safety would be applicable, assessing: user knowledge and awareness, built-in proactive protection and detection mechanisms, peer and bystander protection, and effective user reporting tools and built-in safety features.

Road vehicles manufactured in, or imported into, Australia must comply with Australian Design Rules. These rules include safety features intended to protect people both inside and around a moving vehicle. The Australian Government aims to harmonise these standards with international regulations. In the same way, standards are also needed for social media platforms that wish to operate in Australia. Online platforms need to be regulated to protect young people using social media and their communities who are indirectly impacted by social media content.

In Australia, online content and social media providers are subject to the Online Safety Act. Under this legislation the eSafety Commissioner regulates illegal, restricted, and age-inappropriate content. The Act uses industry-agreed codes as a first step, with the option of imposing standards if agreement is not reached. A similar approach is being implemented in the United Kingdom, under which social media platforms must show they have processes in place to meet codes of practice. The regulator will monitor how effective these processes are, and the legislation includes financial sanctions and criminal action if platforms do not meet legislative requirements.

History shows that without enforcement there is often insufficient incentive to change behaviours. Rates of voluntary fitting of seatbelts by vehicle manufacturers were low until design rules requiring seatbelts to be fitted were introduced. Similarly, the wearing of seatbelts by vehicle occupants was also low until it was enforced.

Advanced safety technologies are often introduced as a safety package on top-end vehicle models as a marketing strategy, ‘de-specified’ from cheaper models, or available only at extra cost. Safety features on social media platforms should not be optional extras in the same way: an additional, easily-accessed reporting mechanism for unsafe content should be available to all users. This self-reporting should inform the platform algorithm in determining the future content an individual young person sees, and should be independently monitored by the eSafety Commissioner.

Supporting young people in their online lives requires more than telling them to “be safe”. It means recognizing that for many young people social media is an integral part of their social lives. For some young people, removing social media also means taking away their primary source of communication, belonging, and help-seeking. The unintended consequences of this ban cannot be overlooked or dismissed, particularly for those from marginalized groups who rely on these platforms for connection, community, or income.

Age-appropriate and engaging digital literacy education needs to start well before young people turn 16. This includes learning about how algorithms work, how online content is shared, how to think critically about influencers and the information they are consuming, and how to spot red flags within online interactions.

Empowering young people with the skills and tools they need to stay safe online is the approach we have taken at Orygen with the #chatsafe guidelines (www.orygen.org.au/chatsafe). #chatsafe supports young people to communicate safely online about self-harm and suicide. It provides practical guidance on how to reach out for help, support a peer, and navigate distressing content, as well as resources for the key adults in a young person’s life (e.g. parents, carers, and educators) who are responsible for helping young people learn to navigate social media environments. This sort of digital literacy education can be developmentally staged, context- or topic-specific, and embedded within school curricula. Importantly, young people need trusted adults they can turn to — people who won’t panic, shame, or dismiss their online experiences. Teachers, youth workers, sports coaches, and even GPs are all well-positioned to offer early support if they are trained to have these conversations and show interest in a young person’s digital life.

Asking questions and understanding who young people are socializing with online, the spaces in which they are interacting, and how they are spending their time will go a long way towards ensuring online safety. And when things go off track, as they inevitably will, there must be easy, fast, and youth-friendly ways to get help. This includes strengthening services like eSafety, Kids Helpline, and school-based wellbeing teams, ensuring young people know they are not alone when they experience harm online.

Conclusion

All eyes are on Australia as it implements this world-first legislation. However, if we are truly committed to supporting young people and their mental health, what we do next matters more. Regulation must do more than police the age of access.

Digital literacy training must empower young people to thrive online and equip parents to support their children’s preparation to access social media and engage safely online. And social media platforms must be held accountable for the environments they create and the content they allow. This moment could be a turning point, but only if we take seriously the responsibility of supporting young people to be safe, critical, and reflective users of social media, just as we do when we permit them to drive.

AUTHORS:

David G Baker has more than 17 years’ experience in policy analysis and development, with impact in youth mental health, youth justice, and social policy. David also works with researchers to develop their translational capacity to engage with policy audiences to support their contribution to evidence-informed policy.

Dr Louise La Sala is a Research Fellow at Orygen, Centre for Youth Mental Health at the University of Melbourne. Her research is focused on youth self-harm and suicide prevention, with a specific interest in the impact of social media on the mental health and wellbeing of young people. Her work has informed online safety and suicide prevention policy, and she has unique expertise in co-developing effective strategies to promote online safety and prevent self-harm and suicide among young people.