Digital Duty of Care: Tackling Online Child Exploitation

The Australian Centre to Counter Child Exploitation reported a significant increase in online child sexual abuse, prompting calls for stricter regulations. eSafety Commissioner Julie Inman Grant emphasizes the need for a legally mandated Digital Duty of Care. Transparency reports reveal gaps in tech firms' responses, highlighting the necessity for improved safety measures.


Devdiscourse News Desk | Hobart | Updated: 05-02-2026 09:06 IST | Created: 05-02-2026 09:06 IST
Country: Australia

The Australian Centre to Counter Child Exploitation recorded nearly 83,000 reports of online child sexual abuse in the 2024–25 financial year, a 41% increase from the previous year. To combat this, eSafety Commissioner Julie Inman Grant is enforcing transparency notices from major tech companies like Google, Apple, and Meta to improve reporting and safety measures.

The latest report shows progress in detecting online abuse but highlights persistent gaps that jeopardize user safety. Positive strides include Snapchat reducing moderation time and Microsoft's enhanced detection. However, the lack of proactive measures by Apple and Discord, especially in encrypted environments, remains a concern.

A proposed Digital Duty of Care could legally oblige tech firms to ensure their systems are safe by design, focusing on preventive measures rather than reactive reporting. Safety initiatives could deter offenders and link users to support services; the Commissioner stresses that safety should be integral to platform design, not an optional feature.
