UK Pushes for Stricter Social Media Age Checks to Protect Children
Britain's regulators urge social media giants to enforce stricter age checks to protect children online. With growing concerns over harmful content, platforms like Facebook, YouTube, and TikTok face pressure to demonstrate improved safety measures. Failure to comply could result in substantial fines from Ofcom and the ICO.
Britain's media and privacy regulators have renewed calls for major social media platforms to strengthen measures that keep minors off their services. The push comes amid growing concern that companies are not adequately enforcing minimum age rules, despite knowing the risks children face when exposed to harmful content online.
The UK is considering introducing stricter regulations akin to those in Australia, which would block under-16 users from social media platforms. Both Ofcom and the Information Commissioner's Office (ICO) have voiced worries about the exposure of minors to potentially addictive content via algorithmic feeds and are demanding that companies make children's safety a priority.
Ahead of the next phase of implementing Britain's Online Safety Act, Ofcom has told companies including Meta-owned Facebook and Instagram, Roblox, TikTok, YouTube, and Snapchat to show by April 30 how they plan to enforce age checks and safeguard young users. The ICO has also urged these platforms to adopt modern age-verification technology, saying there is no excuse not to do so given the tools available.
(With inputs from agencies.)