Why online safety matters to the mental health and wellbeing of young people in Australia

08 Feb 2022

The internet and social media can have a very positive impact on young people's lives. For many young people, being online reduces isolation, increases connection, provides an outlet for creativity, and supports mental health and wellbeing. However, learning to navigate the online world can be challenging, and it's important that young people have the information, education and support they need to use the internet safely and independently while caring for their mental health and wellbeing.

At ReachOut we believe that all young people deserve to feel safe online. That's why we provide young people, parents and carers, and schools with the tools and information they need to navigate social media platforms in ways that promote safety, connection and wellbeing. And while the internet can be a positive influence, it can also have serious negative impacts on young people's lives.

In Australia, teens spend 14.4 hours online each week and, encouragingly, 9 in 10 teens engage in at least one type of positive online behaviour, such as posting positive comments, being inclusive and supporting friends. However, social media can also harm young people's mental health and wellbeing. Negative behaviours and experiences online include bullying, exposure to body image and self-harm content, unwanted contact from strangers, receiving inappropriate material, and reputational damage. In 2020, the Australian Institute of Health and Welfare found that 44% of young people aged 12–17 had had at least one negative online experience in the previous six months. Additionally, when asked about their experiences online, 90% of young people reported being a victim of bad behaviour online at some point, with nearly 60% reporting emotional or psychological impacts associated with encountering risks online.

Among the risks of being online, the personalised algorithms and recommender systems that control what content young people see can be – like social media itself – harmful and unsafe. Users' data can be used to increase the frequency of specific content and, as a result, algorithms have at times been known to amplify a number of risks, including cyberbullying, content that damages young people's body image (such as recommendations of pro-anorexia material), and self-harm content.

For example, a recent experiment in Australia found that it took TikTok's recommender algorithm only 7 hours and 42 minutes to 'learn' that a child was interested in content promoting harmful gender stereotypes and to begin recommending that content at such a frequency that, after only 5–6 days of regular use, their social media feed would be completely filled with it.

At the same time, personalised algorithms and recommender systems can also be used for good. They allow services like ReachOut to be more nuanced in the way we reach young people, providing them with relevant information and support and connecting them to help. Algorithms help us to efficiently reach users based on their interests or issues of concern, by recommending our content to people who may be interested in us or require support with certain issues.

To make the internet and social media safer for young people in Australia, it is important that support, education and information are thoughtfully disseminated to young people to minimise potential risks. It is equally important that parents, carers and schools are equipped and empowered to have effective conversations with young people about online safety. For instance, ReachOut Schools offers teachers a range of resources to help improve students' understanding of online safety, and we have worked in partnership with Instagram to produce a 'Parents Guide to Instagram' that gives parents and carers information about the platform's safety features. Ultimately, however, social media companies need to do more: it is their job to ensure that their platforms are inherently designed to be safe for the young people who use them, and that they are effectively regulated.

When it comes to data privacy and the online safety of young people, operating on a 'best interests principle' would ensure that the positive aspects of social media practices and features are retained while the negative aspects are curtailed. This would mean that, when a young person's information is collected, used or disclosed, their best interests would be the primary consideration in determining what is fair and reasonable. Ultimately, it would also mean that platform features such as algorithms, recommender systems, commercial profiling and sticky features are leveraged only when they are likely to benefit the young person.

As members of the Facebook Online Safety Roundtable and the Twitter Trust and Safety Council, we believe that internet providers and social media companies have a duty of care to ensure any platform they operate is safe for its users. At ReachOut we will continue to advocate for a safer internet so that young people are able to navigate the internet safely and independently, and are supported to look after their mental health and wellbeing. #SaferInternetDay #PlayitFairOnline #SID2022

For more information about ReachOut, visit ReachOut.com, ReachOut.com/Parents and ReachOut.com/Schools. You can also check out the eSafety Commissioner for more information on safe and positive online practices.
