eSafety Commissioner Accuses YouTube of 'Turning a Blind Eye' to Child Abuse

Wednesday, 06 August 2025

August 5, 2025 — Australia’s internet watchdog has raised serious concerns over the handling of child abuse material on major social media platforms, with YouTube singled out for its lack of transparency and responsiveness.


Watchdog Report Highlights Industry Failures

In a report released on Wednesday, the eSafety Commissioner, Julie Inman Grant, accused YouTube and Apple of failing to address basic questions about the number of reports of child sexual abuse material on their platforms. The report also criticized the companies for not providing details on how quickly they respond to such reports, or the number of staff dedicated to trust and safety.


The findings come amid the federal government’s decision to include YouTube in its world-first social media ban for users under the age of 16, after the commissioner recommended overturning the planned exemption for Google’s video-sharing site.


YouTube and Apple Failing to Protect Children

Commissioner Inman Grant stated that when left to their own devices, these companies are not prioritizing the protection of children and are seemingly turning a blind eye to crimes occurring on their services.


“No other consumer-facing industry would be given the licence to operate by enabling such heinous crimes against children on their premises, or services,” she said in a statement.


Google's Response and Industry Standards

Google, which owns YouTube, has previously stated that abuse material has no place on its platforms and that it uses industry-standard techniques such as hash-matching technology and artificial intelligence to identify and remove such content.


However, the eSafety Commissioner’s report noted that some providers, including Apple and Google, have failed to close these safety gaps despite being put on notice in previous years.


Call for Greater Accountability

The report also highlighted a range of safety deficiencies on the platforms, including failures to detect and prevent live-streaming of abuse material and to block links to known child abuse content, as well as inadequate reporting mechanisms.


“In the case of Apple services and Google’s YouTube, they didn’t even answer our questions about how many user reports they received about child sexual abuse on their services or details of how many trust and safety personnel Apple and Google have on-staff,” Ms Inman Grant said.


Government Stands Firm on Banning YouTube for Minors

The Australian government has made it clear it will not be swayed by threats from Google, which has warned it may take legal action over the decision to ban YouTube for users under 16.


Prime Minister Anthony Albanese has emphasized that protecting children online is a non-negotiable priority, and that the government is committed to enforcing the ban regardless of the pressure from tech giants.


Support Services for Victims

For those affected by child abuse or exploitation, a range of support services are available, including:

  • 1800 Respect – National counselling helpline: 1800 737 732
  • Bravehearts – counselling and support for survivors of child sexual abuse: 1800 272 831
  • Child Wise – counselling provider: 1800 991 099
  • Lifeline – 24-hour crisis support and suicide prevention: 13 11 14
  • Care Leavers Australia Network: 1800 008 774
  • PartnerSPEAK – peer support for non-offending partners: (03) 9018 7872

The eSafety Commissioner has mandated that major tech companies, including Apple, Discord, Google, Meta, Microsoft, Skype, Snap, and WhatsApp, report on the measures they take to address child exploitation and abuse material in Australia.


As the debate over online safety continues, the government’s stance on banning YouTube for under-16s has sparked a broader conversation about the responsibilities of tech companies in protecting vulnerable users.