The European Commission has taken a firm stance against Meta, issuing a preliminary ruling that the company’s platforms—Facebook and Instagram—are not doing enough to prevent children under 13 from accessing them. This decision, announced on Wednesday, follows a nearly two-year investigation into Meta’s compliance with the Digital Services Act (DSA), a set of regulations designed to protect users across the EU.
Why the EU says Meta’s age verification falls short
The Commission’s investigation revealed that Meta lacks robust systems to either block underage users or to identify and remove existing ones from its platforms. A key issue is the ease with which minors can bypass age restrictions by entering a false birth date during sign-up. Since the minimum age for using Facebook and Instagram is 13, this loophole allows younger users to create accounts without proper oversight. The EU argues that these gaps not only violate the DSA but also expose children to potential harms, including exposure to age-inappropriate content and privacy risks.
Meta has defended its approach, stating that it employs a combination of automated tools and user reporting to detect and remove underage accounts. However, the Commission’s findings suggest these measures are insufficient. The preliminary ruling indicates that the current systems do not meet the DSA’s requirements for age verification and child protection.
Potential fines and the road ahead
If Meta fails to address the Commission’s concerns, the company could face fines of up to $12 billion, or roughly 6% of its global annual revenue. This figure underscores the severity of the EU’s stance on enforcing digital regulations. The preliminary decision is not yet final, but Meta has the opportunity to respond and propose corrective measures within the coming months.
The ruling also highlights broader challenges for tech platforms operating in the EU. The DSA aims to create a safer digital environment, particularly for minors, by holding companies accountable for their policies and enforcement. Meta’s case sets a precedent for how other platforms may need to adjust their age verification and content moderation systems to comply with these regulations.
What this means for users and parents
For parents and guardians, this development serves as a reminder to review privacy settings and account controls on platforms like Facebook and Instagram. While Meta’s systems are under scrutiny, families can take proactive steps, such as enabling parental controls and monitoring account activity, to better protect younger users. The EU’s findings may also prompt Meta to introduce stricter age verification methods, such as requiring government-issued IDs or AI-based age estimation tools.
As the situation evolves, Meta is expected to engage with regulators to address the Commission’s concerns. The outcome of these discussions could shape the future of social media policies in Europe, influencing how other platforms approach child safety and digital compliance. For now, users and parents should stay informed about potential changes to the platforms they rely on.
AI summary
Meta, found in breach of the European Union’s new digital services regulation, the DSA, is failing to prevent children under 13 from accessing its platforms. According to the Commission’s decision, the company could face a fine of $12 billion.