Facebook and Instagram face EU scrutiny over harmful content

The European Union is investigating Facebook and Instagram for potential violations of the EU's Digital Services Act (DSA). The investigation focuses on the protection of children online, specifically the use of algorithms that may exploit children's vulnerabilities and promote addictive behavior. The EU also questions the effectiveness of Meta's age verification tools and compliance with DSA rules regarding minors' privacy and safety. Meta acknowledges the challenge and highlights its efforts to protect children online. The DSA investigations prioritize child protection, with earlier inquiries into TikTok and ongoing probes into other platforms like X and AliExpress. Violations could result in substantial fines.


PTI | London | Updated: 16-05-2024 17:17 IST | Created: 16-05-2024 17:17 IST
Country: United Kingdom

The European Union opened fresh investigations Thursday into Facebook and Instagram over suspicions that they're failing to protect children online, in violation of the bloc's strict digital regulations for social media platforms.

It's the latest round of scrutiny for parent company Meta Platforms under the 27-nation EU's Digital Services Act, a sweeping set of regulations that took effect last year with the goal of cleaning up online platforms and protecting internet users.

The European Commission, the bloc's executive arm, said it's concerned that the algorithmic systems used by Facebook and Instagram to recommend content like videos and posts could "exploit the weaknesses and inexperience" of children and stimulate "addictive behaviour". It's worried that these systems could reinforce the so-called "rabbit hole" effect that leads users to increasingly disturbing content.

The commission is also looking into Meta's use of age verification tools to prevent children from accessing Facebook or Instagram, or from being shown inappropriate content. Only children aged 13 and older are allowed to use the platforms. It's also looking into whether the company is complying with DSA rules requiring a high level of privacy, safety and security for minors.

"We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,'' Meta said in a prepared statement. ''This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission." They're the latest DSA cases to focus on child protection under the DSA, which requires platforms to put in place stringent measures to protect minors. The commission opened two separate investigations earlier this year into TikTok over concerns about risks to kids.

"We are not convinced that Meta has done enough to comply with the DSA obligations — to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram," European Commissioner Thierry Breton said in a social media post.

The cases announced Thursday aren't the first for Facebook and Instagram. They are already being investigated under the DSA over concerns they're not doing enough to stop foreign disinformation ahead of EU elections next month.

Social media platform X and e-commerce site AliExpress are also being investigated over their compliance with the EU rules.

There's no deadline for the investigations to wrap up. Violations could result in fines of up to 6% of a company's annual worldwide revenue.

(This story has not been edited by Devdiscourse staff and is auto-generated from a syndicated feed.)
