Britain Demands Tougher Social Media Protections for Children
The UK is urging social media platforms to strengthen child protection by enforcing age restrictions and deploying modern age-assurance technology. Ofcom and the ICO are concerned about children's algorithmic exposure to harmful content. The government is considering measures similar to Australia's, with hefty fines possible for non-compliance.
Britain's media and privacy regulators have issued a stern demand to major social media platforms to intensify efforts in safeguarding children by upholding age restrictions. They warn that these companies are failing to enforce their own minimum age rules.
Amid growing concerns about children's exposure to potentially harmful algorithmic content on social media, the UK government is contemplating stricter measures akin to Australia's, which could bar under-16s from these platforms. Ofcom and the Information Commissioner's Office highlighted these issues and urged tech giants to prioritize child safety.
In the latest phase of implementing the UK's Online Safety Act, Ofcom has directed platforms like Facebook, Instagram, TikTok, and YouTube to demonstrate improvements by April 30, or risk hefty fines. Separately, the ICO emphasized the use of modern age-assurance tools to block access for those under 13.
(With inputs from agencies.)