Howard and Harradine’s vision for the Internet is finally realised


When Senator Brian Harradine handed Prime Minister John Howard a box of pornographic tapes in 2000, he sparked censorship reforms that excluded fetishes, kink, and non-mainstream sex acts from the definition of X-rated pornography — not based on evidence, but on the personal discomfort of two deeply conservative men.

Twenty-five years later, that same moral framework is being embedded in the digital ecosystem.

Phase 2 of Australia’s online safety codes, taking effect from 27 December 2025, will enforce age verification for an extraordinarily broad range of content: Class 1C material (pornography depicting fetishes and BDSM) and Class 2 material, which includes not only pornography but any “high-impact” depictions of nudity, simulated sexual activity, violence, drug use, and “themes” such as suicide, racism, family breakdown, death, serious illness and drug dependency.

Backed by penalties of up to $50 million, these codes mandate age verification and filtering across search engines, social media and virtually all internet services. The architecture remains unchanged from 2000: moral judgments about acceptable sexuality and expression, with algorithms now required to identify and hide the content those judgments condemn.

When regulatory frameworks embed vague, ideologically loaded definitions without adequate safeguards, the collateral damage falls on sex workers, LGBTQIA+ communities, health educators and young people seeking information about sexuality, mental health, and social justice.


The ideological network behind the codes

Behind the public-facing online safety agenda operates a tightly interconnected anti-rights network spanning Australia and the United States. Groups like Collective Shout, founded by conservative Christian “pro-life feminist” Melinda Tankard Reist, Harradine’s former political advisor, now advise the government on age verification. While Collective Shout describes itself as non-sectarian, it works in formal partnership with the US-based National Center on Sexual Exploitation (NCOSE), a fundamentalist Christian organisation formerly known as “Morality in Media” with explicit ties to anti-LGBTQIA+ and anti-abortion advocacy.

Together, these groups pressure payment processors to financially deplatform adult content creators. In 2025, tens of thousands of games, including LGBTQIA+ titles, were removed from Steam and Itch.io following their campaigns.

Former Collective Shout educator Daniel Principe now gives school talks on healthy masculinities and the harms of pornography. He has appeared on a panel hosted by Women’s Forum Australia — an organisation co-founded by Tankard Reist that actively campaigns against trans rights, abortion, pornography and sex work — alongside Exodus Cry, a US evangelical organisation whose co-founder campaigns against abortion and LGBTQIA+ rights. Consent educator Chanel Contos, whose work on sexual assault has been valuable, has adopted an increasingly abolitionist anti-porn stance, invoking Andrea Dworkin to argue that pornography is inherently dangerous.

Both Contos and Collective Shout have promoted UN Special Rapporteur Reem Alsalem, whose opposition to trans and sex worker rights has been condemned by over 550 feminist and LGBTQIA+ organisations worldwide.

Even eSafety Commissioner Julie Inman Grant, who describes pornography as “lawful but awful”, appeared on an NCOSE podcast in 2021, with her office later admitting inadequate due diligence.

What binds these actors is not simply distaste for pornography, but a conviction that sexuality and identity require moral regulation. Through advisory roles and institutional influence, they’re shaping a digital environment where sexual expression is suspect and censorship is framed as moral duty.


The overcapture problem

The codes’ definitions guarantee overcapture of educational content. Class 2B material includes “realistically simulated sexual activity”, “high-impact nudity”, and — critically — “high-impact themes”. These are defined to include “social issues such as crime, suicide, drug and alcohol dependency, death, serious illness, family breakdown and racism”.

Scarlet Alliance’s submission to the draft codes warned these definitions will:

overcapture advertising for in-person sex work services, sex worker organising and the sharing of health and safety education, or other sexuality and harm-reduction educational materials.

Educational content addressing LGBTQIA+ sexuality, suicide prevention, racism, drug harm reduction or family violence — all “high-impact themes” — will be caught in filters designed to block pornography.

The recent NSW Parliamentary inquiry revealed the real problem:

current sexuality education is not meeting the needs of a significant number of young people, who turn to pornography to fill gaps in their sexuality education. This gap is particularly pronounced for LGBTQIA+ young people and young people with disability.

Yet the Department of Communications states it is “ultimately a matter for industry” how the codes are implemented and overcapture of lawful material is addressed. The regulator sets broad categories for restriction but provides no accountability mechanisms for protecting legitimate content incorrectly flagged as prohibited by big tech.

In the UK, where similar systems became operational in July 2025, BBC Verify found content about Palestine, Ukraine, and parliamentary debates caught in censorship filters, alongside trans support websites, STI information services and abortion resources.

UNESCO warns that content moderation systems “pathologise and automatically sexualise legitimate educational content”. The result is a global chilling effect in which educators and health professionals self-censor to avoid being removed or suppressed by the algorithm. Sexual health educators report writing “seggs” instead of “sex”, or censoring anatomical terms such as “vulva” and “vagina”. This is not algorithmic error: it is a structural feature of content governance designed this way.


The policy alternative

If Australian policy-makers genuinely want to safeguard young people, the evidence points elsewhere: comprehensive, non-stigmatising sexuality education that acknowledges pleasure, consent and diversity. This in turn requires funding health services and educational content that addresses racism, mental health, family breakdown and social issues without shame or censorship. It requires consulting affected communities in policy design, not excluding them.

Most critically, it requires distinguishing between protecting children and policing expression. The current framework does the latter while failing at the former.

Australia’s Phase 2 online safety codes revive 2000s-era moral censorship through modern technology, using definitions so broad they will inevitably silence educational content about the very issues young people most need to understand. This choice reflects not evidence, but ideology. It points to whose voices are heard in regulatory spaces, and whose are silenced. Unless the government fundamentally reconsiders this framework and ensures sex workers, LGBTQIA+ communities and affected stakeholders are central to policy design, the next generation of censorship will look disturbingly like the last — just with better branding.


Image: Danielle-Claude Bélanger

Mish Pony

Pony is CEO of Scarlet Alliance, Australian Sex Workers Association.

Overland is a not-for-profit magazine with a proud history of supporting writers, and publishing ideas and voices often excluded from other places.

If you like this piece, or support Overland’s work in general, please subscribe or donate.


Related articles & Essays