Australian Tech Regulations May Hinder Teens' Access to Online Sexual Health Info

Organisations from Australia’s online industries last week submitted a final draft of new industry codes to the eSafety commissioner. The codes are aimed at protecting children from “age-inappropriate content” and, if approved, will be implemented under the Online Safety Act. They chiefly target young people’s access to pornography, high-impact violence, and content about self-harm, suicide, and disordered eating.
However, the draft codes risk serious unintended consequences. They could further restrict access to crucial material on sex education, sexual health, harm reduction, and health promotion. This matters because social media can be a powerful medium for educating teenagers and young people about sexual health: various social media campaigns, some government-funded, already work to counter rising rates of sexual violence and to share vital sexual health information.
The eSafety commissioner is currently introducing comprehensive codes of practice for the online industry, designed “to protect Australians from illegal and restricted online content”. Phase 1 of these codes, which targeted illegal content such as child sexual exploitation material, came into effect last year. The current focus is Phase 2, which aims to prevent young people from accessing content deemed “inappropriate” but not illegal. This will be done through age-assurance technologies, as well as by filtering, de-prioritising, downranking, and suppressing content. The codes will apply broadly across operating systems, various internet services, search engines, and hardware such as smartphones and tablets. Tech companies will consequently gain greater power, and greater responsibility, to remove content and suspend users, and will face fines of up to US$49.5 million (approximately A$77 million) for non-compliance.
The concept of using technology to restrict online content by age is fraught with problems. The Australian government itself has indicated that age-assurance technologies are not yet mature enough for widespread use, with state-of-the-art software showing racial and gendered biases. Digital platforms also have a documented poor track record in governing sexual media. International human rights organisations, including the United Nations, have warned that automated content moderation is increasingly used to censor sex education and consensual sexual expression. Research shows many platforms remove or suppress content related to drag queens, trans rights, sexual racism, body positivity, and sex worker safety, while permitting health misinformation and hate speech directed at LGBTQ+ people. As a result, sexual health organisations and educators already struggle to use social media to reach key audiences, including LGBTQ+ communities, and often face “shadowbanning” (reduced visibility) or outright removal of their content.
Content moderation policies are already highly restrictive. To enforce them, platforms use nudity and pornography detection software that is frequently biased towards heteronormative standards. For instance, Google’s computer vision software has historically relied on word databases that inaccurately linked terms like “bisexuality” with “pornography,” “sodomy” with “bestiality,” and “masturbation” with “self-abuse.” In response, many users have adopted “algospeak”: language designed to bypass the algorithms that flag content as inappropriate, such as substituting emojis or spellings like “seggs” or “s&x” for “sex.”
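To see why this kind of keyword-based flagging both over-blocks and under-blocks, consider a minimal, purely illustrative sketch of a denylist filter. The term list, function, and example posts below are hypothetical; they do not represent Google’s software or any platform’s actual policy.

```python
# Illustrative sketch only: a naive keyword "denylist" moderator, loosely modelled on the
# word-database approach described above. The flagged terms and example posts are hypothetical.

FLAGGED_TERMS = {"sex", "pornography", "masturbation", "bisexuality"}  # hypothetical denylist

def is_flagged(post: str) -> bool:
    """Flag a post if any denylisted term appears as a whole word."""
    words = {w.strip(".,!?:;").lower() for w in post.split()}
    return bool(words & FLAGGED_TERMS)

posts = [
    "Safer sex advice for teens: how to use condoms correctly.",  # health promotion: flagged
    "Bisexuality is a normal, healthy orientation.",              # education: flagged
    "Let's talk about seggs and s&x tonight ;)",                  # algospeak: slips through
]

for post in posts:
    print(is_flagged(post), "-", post)
```

Run as written, the sketch flags both the health-promotion and education posts while the algospeak post passes straight through, which is precisely the dynamic sexual health educators describe.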
The proposed industry codes may also undermine the Australian government's own health promotion efforts. The government has committed more than A$100 million to Our Watch, a leading organisation working to prevent violence against women, and its teen-focused social media initiative, The Line. A further A$3.5 million has gone to Teach Us Consent, which creates social media content for teens and young people on consent, healthy relationships, pornography, and sex. As with the looming youth social media ban, the codes risk working against government initiatives to reduce gender-based violence.
Social media platforms do attempt to distinguish health information from sexual content more broadly, for example by permitting nudity in contexts such as childbirth, breastfeeding, medical care, or protest. However, evidence suggests that moderating these exceptions accurately is currently near impossible, because it relies on a distinction between sex education and sexual media that is blurry at best. In reality, sexuality education extends beyond purely technical information about infections, sexual dysfunction, or medical care. Sexual imagery plays a vital role in sexual health promotion, because young people respond well to visual methods of communication and learning. And the importance of pleasure has long been recognised in HIV prevention, safer sex practices, and violence prevention.
Therefore, industry codes should acknowledge sexual media as a potential and valuable medium for conducting sex education and promoting sexual and reproductive rights. This perspective is especially critical as governments in many countries are increasingly moving to restrict sexual information and health services, including efforts to criminalise abortion, limit access to trans health care, and prevent comprehensive sex education. In such a restrictive global context, access to online health promotion and sex education content becomes even more vital.
Although the industry codes are intended to protect, they risk cutting off essential information for Australians, particularly young people who may have no access to comprehensive sexuality and reproductive health information at home or school. To uphold sexual rights to information, privacy, and expression, the codes must move beyond simply incentivising platforms to detect and suppress all sexual content. They should instead ensure non-discriminatory access and require platforms to promote material that actively supports sexual health, rights, and justice. That requires considering content in its specific context, a task that may seem time-consuming and resource-heavy for regulators and platforms but is essential given the dire consequences of widespread content suppression. The codes should be paused until they can balance protection with the fundamental right to information.