EU investigates Facebook and Instagram over addictive behaviour in children, citing concerns about DSA compliance and the effects of the platforms' algorithms on mental health
The European Commission on Thursday opened a probe into Facebook and Instagram on suspicion the platforms owned by Meta are causing addictive behaviour in children.
“We are not convinced that it has done enough to comply with the DSA (Digital Services Act) obligations to mitigate the risks of negative effects to the physical and mental health of young Europeans,” the EU’s internal market commissioner, Thierry Breton, said of Meta. The EU said it is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’.
We’ve opened formal proceedings against Meta regarding the protection of minors on Facebook and Instagram. The systems of both platforms, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called ‘rabbit-hole effects’.
— European Commission (@EU_Commission) May 16, 2024
‘Safety for young online users’
“Today we are taking another step to ensure safety for young online users. With the Digital Services Act we established rules that can protect minors when they interact online,” said Margrethe Vestager, Executive Vice-President for a Europe Fit for the Digital Age. “We have concerns that Facebook and Instagram may stimulate behavioural addiction and that the methods of age verification that Meta has put in place on their services are not adequate, and will now carry out an in-depth investigation. We want to protect young people’s mental and physical health,” she added.
The EU’s opening of proceedings is based on a preliminary analysis of the risk assessment report sent by Meta in September 2023, Meta’s replies to the Commission’s formal requests for information (on the protection of minors and the methodology of the risk assessment), publicly available reports, and the Commission’s own analysis.
The current proceedings address the following areas:
- Meta’s compliance with DSA obligations on the assessment and mitigation of risks caused by the design of Facebook’s and Instagram’s online interfaces, which may exploit the weaknesses and inexperience of minors, cause addictive behaviour, and/or reinforce the so-called ‘rabbit-hole’ effect. Such an assessment is required to counter potential risks to children’s fundamental right to physical and mental well-being, as well as to respect for their rights.
- Meta’s compliance with DSA requirements in relation to the mitigation measures to prevent access by minors to inappropriate content, notably age-verification tools used by Meta, which may not be reasonable, proportionate and effective.
- Meta’s compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, particularly with regard to default privacy settings for minors as part of the design and functioning of their recommender systems.
Next Steps
The Commission will now carry out an in-depth investigation as a matter of priority and will continue to gather evidence, for example by sending additional requests for information, conducting interviews or inspections. The opening of formal proceedings empowers the Commission to take further enforcement steps, such as adopting interim measures and non-compliance decisions.