UK Regulators Demand Stricter Age Checks on Social Media Platforms to Protect Children

UK regulators are urging social media companies to implement stronger age-verification measures to prevent children from accessing potentially harmful content. Platforms like Facebook, Instagram, and TikTok have been instructed to demonstrate improvements by April 30. Ofcom and the ICO emphasize using modern technology to ensure children’s online safety.


Devdiscourse News Desk | Updated: 12-03-2026 13:15 IST | Created: 12-03-2026 13:15 IST

The UK's media and privacy regulators have issued a stern warning to social media giants, urging them to enhance age-verification processes to protect children. The regulators, Ofcom and the Information Commissioner's Office (ICO), argue that current age checks are insufficient.

Under the proposed changes, social media platforms such as Facebook, Instagram, TikTok, and YouTube have until April 30 to show how they plan to tighten these measures. The regulators emphasize the importance of using modern technology to ensure the safety of young users.

The UK's Online Safety Act, currently being implemented, underpins these requirements, and failure to comply could result in significant fines for social media companies. The ICO recently fined Reddit over inadequate age-verification practices, underscoring the seriousness of these concerns.

(With inputs from agencies.)
