UK Regulators Demand Stricter Age Checks on Social Media Platforms to Protect Children
UK regulators are urging social media companies to implement stronger age-verification measures to prevent children from accessing potentially harmful content. Platforms like Facebook, Instagram, and TikTok have been instructed to demonstrate improvements by April 30. Ofcom and the ICO emphasize using modern technology to ensure children’s online safety.
The UK's media and privacy regulators have issued a stern warning to social media giants, urging them to enhance age-verification processes to protect children. The regulators, Ofcom and the Information Commissioner's Office (ICO), argue that current age checks are insufficient.
Social media platforms including Facebook, Instagram, TikTok, and YouTube have been given until April 30 to show how they plan to tighten these measures. The regulators emphasize the importance of using modern technology to ensure the safety of young users.
The UK's Online Safety Act, currently being implemented, is intended to reinforce these requirements, and failure to comply could result in significant fines for social media companies. The ICO recently fined Reddit for inadequate age-verification practices, underscoring the seriousness of these concerns.
(With inputs from agencies.)