Digital Duty of Care: Tackling Online Child Exploitation
The Australian Centre to Counter Child Exploitation reported a significant increase in online child sexual abuse, prompting calls for stricter regulations. eSafety Commissioner Julie Inman Grant emphasizes the need for a legally mandated Digital Duty of Care. Transparency reports reveal gaps in tech firms' responses, highlighting the necessity for improved safety measures.
The Australian Centre to Counter Child Exploitation recorded nearly 83,000 reports of online child sexual abuse in the 2024–25 financial year, a 41% increase on the previous year. In response, eSafety Commissioner Julie Inman Grant has issued transparency notices to major tech companies, including Google, Apple, and Meta, requiring them to account for their reporting and safety measures.
The latest report shows progress in detecting online abuse but highlights persistent gaps that jeopardize user safety. Positive strides include Snapchat reducing its moderation times and Microsoft enhancing its detection capabilities. However, the lack of proactive measures from Apple and Discord, particularly in encrypted environments, remains a concern.
A proposed Digital Duty of Care would legally oblige tech firms to make their systems safe by design, shifting the focus from reactive reporting to prevention. Such safety initiatives could deter offenders and connect users to support services, underscoring that safety should be integral to platform design rather than an optional feature.
(With inputs from agencies.)