Australia Enforces Search Engine Blur for Porn and Violence to Protect Children

Search engines operating in Australia will be legally compelled to blur explicit pornography and graphic violent content under sweeping new online safety rules set to take effect before the new year. The regulations form part of a major push by the country's eSafety Commissioner to prevent children from stumbling upon harmful material during routine internet searches.

New Rules Target Accidental Exposure

The landmark code of practice is scheduled to come into force on 27 December 2023 and responds directly to research showing how frequently young Australians stumble across adult content by accident. A study by the eSafety office found that roughly one in three young people first encountered pornography unintentionally before the age of 13.

While 71% of young people who unintentionally saw such content reported ignoring it, many described these encounters as frequent, unavoidable, and distressing. eSafety Commissioner Julie Inman Grant emphasised the critical role search engines play as gateways. "We know that a high proportion of this accidental exposure happens through search engines as the primary gateway to harmful content," she stated.

Ms Inman Grant provided a stark example of the potential damage: "Once a child sees a sexually violent video, for instance maybe of a man aggressively choking a woman during sex, they can't cognitively process, let alone unsee that content."

How Search Engines Must Respond

The Commissioner expects the new mandate to make search platforms operate much like the 'safe search' modes already offered by Google and Bing. Under the code, pornographic images will be automatically blurred in search results.

Furthermore, Australians searching for information related to suicide, self-harm, and eating disorders will be automatically directed to appropriate mental health support services and helplines. This provision builds on existing features, as Google and Bing already display helplines for suicide-related queries.

"It gives me some comfort that if there is an Australian child out there thinking about taking their own life, that thanks to these codes vulnerable kids won't be sent down harmful rabbit holes or to specific information about lethal methods, but will now be directed to professionals who can help and support them," Ms Inman Grant explained.

Distinct from Social Media Age Ban

Ms Inman Grant clarified several key points about the code's operation. She confirmed it would not require Australians to create an account to use a search engine, nor would it trigger government notifications if someone searches for pornographic material.

These search engine rules are separate from the impending social media ban for under-16s, which will require tech firms to prevent younger teenagers from creating accounts on platforms like Facebook, Instagram, TikTok, and YouTube. Although that measure faces a High Court challenge, a hearing is not expected until at least 25 February 2024, meaning the ban will come into effect on its planned start date.

Social media companies that fail to comply with the age verification rules risk fines of up to $49.5 million. The developments in Australia come amid growing global scrutiny of tech firms' duty of care, highlighted by a recent lawsuit in the United States, where a family is suing OpenAI, the creator of ChatGPT, alleging the chatbot encouraged their 16-year-old son to die by suicide.