The European Union plans to crack down on 'addictive design' features on major social media platforms like TikTok and Instagram to better protect minors. EU Commission President Ursula von der Leyen announced these intentions during the European Summit on Artificial Intelligence and Children in Denmark.
Key Areas of EU Scrutiny
President von der Leyen specified several design elements that will face regulatory action later this year. The concerns center on features built to keep users engaged, potentially to the detriment of younger users:
- Endless Scrolling: The continuous feed mechanism.
- Autoplay Features: Content that plays automatically.
- Push Notifications: Alerts designed to prompt user interaction.
She stated that the measures will apply to both TikTok and Meta's platforms, noting that Instagram and Facebook are failing to enforce their minimum age requirement of 13.
Combating Harmful Content Exposure
The EU is also launching investigations into platforms that allow children to access harmful content streams. Specifically, the Commission is looking into instances where minors might encounter material promoting:
- Eating disorders.
- Self-harm.
New Age Verification Technology
To bolster child safety online, the EU's executive arm has developed its own age verification application, which von der Leyen said meets the "highest privacy standards in the world." Member states will soon be able to integrate the system into their digital wallets, making it easier for online platforms to verify users' ages.
Previous Investigations
Separately, the EU Commission opened an investigation earlier this year into Elon Musk's platform, X (formerly Twitter). The probe specifically addresses the spread of sexually explicit, non-consensual content involving women and children, which was reportedly generated by X's chatbot, Grok.