European Union (EU) regulators have confirmed another investigation into Meta over concerns the social media giant may have breached online content rules on child safety.
Under the Digital Services Act (DSA), which took effect in the European bloc last year, companies are compelled to act on harmful content or face potentially substantial fines.
Specifically, Facebook and Instagram are being probed to determine if they are having “negative effects” on the “physical and mental health” of children.
On Thursday (May 16), the European Commission confirmed it had opened formal proceedings, with the EU executive body also concerned that Meta is not doing enough on age assurance and verification methods.
“The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’,” the Commission’s statement said.
🚨 Today we open formal #DSA investigation against #Meta.
We are not convinced that Meta has done enough to comply with the DSA obligations — to mitigate the risks of negative effects to the physical and mental health of young Europeans on its platforms Facebook and Instagram.
— Thierry Breton (@ThierryBreton) May 16, 2024
EU challenges tech industry to comply with DSA
Several big tech firms have been targeted by the EU for potential breaches of the DSA, which threatens fines of up to 6% of annual global turnover.
Meta, which also owns WhatsApp and Threads, insists it has “spent a decade developing more than 50 tools and policies” to protect children. “This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission,” added a company spokesperson.
The ‘rabbit-hole effect’ alluded to above refers to how recommendation algorithms on modern social media apps work: a user viewing one piece of content is led on to another of a similar nature, a pattern that can build up over an extended scrolling session or through repeated suggestions of content to watch.
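To make that mechanism concrete, here is a minimal sketch of a similarity-driven suggestion loop. It is a hypothetical illustration only: Meta’s actual ranking systems are not public, and the Item, similarity and next_suggestion names are invented for this example.

```python
# Hypothetical sketch of a greedy, similarity-driven feed loop (not Meta's
# actual system). Each item carries a topic vector; the feed repeatedly
# suggests whatever is closest to the last thing viewed, so the suggestions
# narrow onto one topic cluster — the "rabbit hole".

from dataclasses import dataclass
import math

@dataclass
class Item:
    title: str
    topics: dict[str, float]  # topic -> weight

def similarity(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity over sparse topic vectors."""
    dot = sum(a[t] * b[t] for t in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def next_suggestion(last_viewed: Item, catalogue: list[Item]) -> Item:
    # Greedily pick the item most similar to the one just viewed; repeating
    # this step is what keeps the feed inside the same topic cluster.
    return max((i for i in catalogue if i is not last_viewed),
               key=lambda i: similarity(last_viewed.topics, i.topics))

catalogue = [
    Item("diet tips", {"fitness": 0.4, "diet": 0.9}),
    Item("extreme fasting", {"diet": 1.0, "risky": 0.6}),
    Item("football highlights", {"sport": 1.0}),
]

current = catalogue[0]
for _ in range(2):
    current = next_suggestion(current, catalogue)
    print("suggested:", current.title)  # hops stay within the diet cluster
```

Starting from “diet tips”, the loop never surfaces the unrelated “football highlights” item, because nothing pushes the suggestions away from the cluster the user began in. Real systems add many more signals, but the reinforcing pattern regulators describe follows the same basic shape.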
In the UK, regulators are also closely monitoring how the technology works, with communications watchdog Ofcom warning that algorithms pushing harmful content are a cause for concern.
The body is preparing to enforce the Online Safety Act, having revealed that many young children are using social media accounts, sometimes with parental knowledge, despite the minimum user age being set at 13.