European Union lawmakers are calling for personal liability for tech executives whose platforms fail to protect children online. The proposal, set for a vote in the European Parliament’s internal market committee, targets leaders such as Mark Zuckerberg amid ongoing probes of companies including Meta and TikTok under the Digital Services Act.
What the Proposal Means for Tech Leaders
The draft report suggests holding executives personally accountable for serious and repeated violations of child protection rules. It focuses on addictive platform designs and exposure to harmful content that could affect kids’ mental health.
Lawmakers argue this step would push companies to take stronger action. For instance, it could mean fines or legal consequences for executives if their platforms ignore age verification requirements or fail to block violent material.
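To make the compliance stakes concrete, here is a minimal sketch of the kind of age gate regulators expect platforms to enforce. The threshold, function names, and the assumption of an already verified birthdate are hypothetical illustrations, not drawn from any platform’s actual code.

```python
from datetime import date

# Hypothetical minimum age for an unrestricted account; real thresholds
# vary by country and by platform policy.
MINIMUM_AGE = 16

def age_in_years(birthdate: date, today: date) -> int:
    """Whole years between a verified birthdate and today."""
    years = today.year - birthdate.year
    # Subtract a year if this year's birthday has not happened yet.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def needs_minor_safeguards(birthdate: date) -> bool:
    """True if the account must get the restricted, minor-safe experience."""
    return age_in_years(birthdate, date.today()) < MINIMUM_AGE

# A user born in May 2012 is 13 in October 2025, so safeguards apply.
print(needs_minor_safeguards(date(2012, 5, 1)))  # True
```

The hard part in practice is not this check but verifying the birthdate in the first place, which is exactly what the probes described below examine.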
The idea comes from Hungarian lawmaker Dóra Dávid, a former Meta employee, who argues that current rules lack teeth without direct accountability for top leaders.
The vote is expected soon, but the report itself is non-binding. Its aim is to press the European Commission to pair the Digital Services Act with tougher enforcement.
Experts say this could set a new standard, making executives think twice about lax safety measures.
Ongoing Investigations Under the Digital Services Act
The European Commission is investigating several platforms for possible breaches of its rules on protecting minors. The probes opened over the past two years and remain active in 2025.
Meta faces scrutiny over whether its algorithms push addictive content to young users. TikTok is under fire over its advertising repository, which regulators say could leave users, including minors, exposed to scam or otherwise harmful ads.
Other platforms like YouTube and Snapchat have received requests for information on their age checks and recommender systems. The Commission wants proof they prevent risks from illegal products or violent videos.
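For context on what “recommender systems” means here, the sketch below shows one simplified way a feed could exclude risky items for accounts flagged as minors. The labels, data structures, and function names are illustrative assumptions, not any platform’s real pipeline.

```python
from dataclasses import dataclass

# Hypothetical risk labels; real platforms use far richer content taxonomies.
UNSAFE_FOR_MINORS = {"violence", "illegal_product", "gambling"}

@dataclass
class Candidate:
    item_id: str
    labels: set[str]
    score: float  # engagement score from the upstream ranking model

def rank_feed(candidates: list[Candidate], is_minor: bool) -> list[Candidate]:
    """Drop items carrying unsafe labels for minors, then rank by score."""
    if is_minor:
        candidates = [c for c in candidates
                      if not (c.labels & UNSAFE_FOR_MINORS)]
    return sorted(candidates, key=lambda c: c.score, reverse=True)

feed = rank_feed(
    [Candidate("music_clip", {"music"}, 0.90),
     Candidate("fight_clip", {"violence"}, 0.95)],
    is_minor=True,
)
print([c.item_id for c in feed])  # ['music_clip']: the risky item is
# dropped even though it scored higher on engagement
```

Regulators’ concern is the opposite pattern: ranking purely on engagement, which tends to surface exactly the items such a filter would remove.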
Here is a quick look at key ongoing cases:
- Meta: Investigated for child safety on Facebook and Instagram, focusing on addictive features.
- TikTok: Issued preliminary findings over ad transparency breaches, risking fines of up to 6 percent of global turnover.
- YouTube: Probed on measures to protect minors from harmful content recommendations.
- Snapchat: Asked to detail safeguards that keep minors away from illegal products and harmful content.
These probes reflect the EU’s push for compliance since the Digital Services Act began applying to large platforms in 2023.
Regulators have already issued preliminary findings against TikTok over shortcomings in its ad transparency tools, and a separate probe covers election-related risks. Fines could reach billions of euros if the violations are confirmed.
Reactions from Tech Companies and Experts
Tech giants have offered mixed responses to the liability push. Meta has rolled out tools such as stricter parental controls and age-appropriate content filters in Europe.
TikTok says it complies through measures such as ad labeling and restrictions on personalized ads for minors. Critics counter that these steps fall short without real accountability.
Industry experts warn that personal liability could chill innovation. One analyst noted it might lead to overcautious designs that limit free expression.
Child advocacy groups, on the other hand, praise the move, pointing to studies that link social media use to rising anxiety among teens.
A recent report from a European think tank highlighted how platforms profit from engagement patterns that harm kids. It called for executives to face consequences comparable to those imposed on finance executives for compliance failures.
Potential Impacts on the Tech Industry
If adopted, the proposal could reshape how tech firms operate in Europe. Companies might invest more heavily in AI-driven safety tools to shield executives from personal risk.
It could also inspire similar laws elsewhere. For example, the United States is debating bills like the Kids Online Safety Act, which echoes EU concerns.
Economically, enforcement under the Digital Services Act already carries heavy stakes: TikTok’s parent company faces a potential penalty of up to 6 percent of global turnover following the Commission’s 2025 preliminary findings.
| Platform | Alleged Violation | Maximum Fine | Status as of October 2025 |
|---|---|---|---|
| Meta | Addictive algorithms affecting minors | Up to 6% of global turnover | Ongoing investigation |
| TikTok | Ad transparency breaches | Up to 6% of global turnover | Preliminary findings issued |
| YouTube | Inadequate child protections | Up to 6% of global turnover | Information requests sent |
| Snapchat | Inadequate safeguards for minors | Up to 6% of global turnover | Under scrutiny |
This table outlines the stakes, based on recent Commission actions.
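Because every row caps out at the same 6 percent of global turnover, the arithmetic behind the “billions” headlines is simple. The sketch below uses a purely hypothetical turnover figure, since the companies’ actual revenues are not part of the Commission’s announcements.

```python
# The DSA caps fines at 6 percent of global annual turnover. The turnover
# figure below is hypothetical, chosen only to show the scale involved.
DSA_FINE_CAP = 0.06
hypothetical_turnover_eur = 100_000_000_000  # assume 100 billion euros

max_fine_eur = DSA_FINE_CAP * hypothetical_turnover_eur
print(f"Maximum fine: {max_fine_eur:,.0f} euros")  # 6,000,000,000 euros
```

At that assumed scale, even the ceiling on a single violation lands in the billions, which is why the cap is expressed as a share of turnover rather than a fixed amount.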
Smaller firms worry they lack resources to meet these standards, potentially leading to market consolidation.
Broader Context and Future Outlook
Europe’s focus on child safety fits a global trend. Australia’s law setting a minimum age for social media accounts, passed in late 2024, takes effect in December 2025, while the UK has stepped up enforcement of its Online Safety Act.
The EU’s actions build on a 2024 court ruling that favored TikTok and Meta in a fee dispute but upheld core Digital Services Act rules.
Looking ahead, the Commission plans more guidelines on protecting minors from online dangers. This includes better detection of hybrid threats like disinformation aimed at youth.
Parents and educators hope these changes will curb the harms of heavy screen time. Recent data shows European teens spend over six hours a day online, a level of use researchers have linked to higher rates of depression.
As the debate continues, share your thoughts on tech accountability in the comments below. Should executives be held personally liable when their platforms fail children?