Technology Secretary Liz Kendall has put major digital platforms on notice: they must step up efforts to protect women and girls from online abuse or risk stronger government action. In a fresh open letter, she gave platforms until the end of the year to fully implement new Ofcom guidance aimed at making the internet safer.
This move forms part of the Labour government’s wider push to treat violence against women and girls as a national emergency. The goal is to halve these offences within a decade.
Labour Government Makes VAWG a Top Priority
The current Labour administration has placed tackling violence against women and girls at the heart of its plans. This includes using technology laws to fight misogyny that spills from online spaces into real life.
The Department for Science, Innovation and Technology is leading coordination across government. Kendall's March 9 roundtable with executives from Meta, TikTok, YouTube and Snapchat made the message clear: platforms cannot sit back while abuse thrives on their services.
In December 2025 the government published its cross-government strategy called Freedom from Violence and Abuse. It backs up the manifesto promise with concrete actions. These include new rules on deepfakes and stronger enforcement of the Online Safety Act.
Ofcom Guidance Targets Specific Harms Facing Women
Ofcom released its detailed guidance titled A Safer Life Online for Women and Girls in November 2025. The regulator identified four key areas where harms hit women and girls hardest.
These are:
- Misogynistic abuse and sexual violence
- Pile-ons and coordinated harassment
- Stalking and coercive control
- Image-based sexual abuse

The guidance goes beyond basic legal duties. It pushes platforms to use safety by design principles from the start.
Key recommended steps include:
- Adding prompts that ask users to think again before posting harmful content
- Introducing limits on pile-ons to stop coordinated attacks
- Setting stronger default privacy options for accounts
- Using hash-matching technology to block known intimate images
- Making reporting tools simpler and allowing collective reports of abuse
- Removing money-making options from content that promotes sexual violence
Ofcom expects platforms to test their services properly and improve support for users who report problems. The regulator plans to publish reports on how well companies are doing so users can choose safer places online.
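One of the recommended steps above, hash-matching, works by comparing a digital fingerprint of each upload against a database of fingerprints of known abusive images, so the image itself never needs to be shared. The sketch below illustrates the idea in Python using a plain SHA-256 digest; the blocklist values and function names are hypothetical, and real deployments (such as the industry tools built around StopNCII) use perceptual hashes that survive resizing and re-encoding, which a cryptographic hash does not.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known non-consensual
# intimate images. In practice such hashes are held by a central service;
# the value below is illustrative only (it is the digest of b"hello world").
KNOWN_IMAGE_HASHES = {
    "b94d27b9934d3e08a52e52d7da7dabfac484efe37a5380ee9088f7ace2efcde9",
}

def matches_known_image(image_bytes: bytes) -> bool:
    """Return True if the upload's SHA-256 digest is on the blocklist."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_IMAGE_HASHES

# Example: screen an upload before it is published.
upload = b"hello world"  # stands in for the raw image bytes
if matches_known_image(upload):
    print("blocked")  # this digest is on the illustrative blocklist
else:
    print("allowed")
```

The appeal of this design for platforms is that matching happens on fingerprints alone: no copy of the abusive image circulates during the check, and a match can trigger automatic blocking before the content is ever visible.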
Deepfakes and AI Tools Face Tough New Rules
The rise of AI-generated explicit images has made the problem worse. The UK has already criminalised the creation of sexually explicit deepfakes without consent. This includes tools that create fake nude images, often called nudification apps.
These measures sit alongside the Online Safety Act and upcoming Crime and Policing Bill changes. Platforms now carry clear responsibility to stop this content spreading.
Kendall told platforms directly in her letter that they have the tools to block and delete online misogyny. Failure to act makes them part of the problem, she warned.
Young people face particular risks. Research shows many boys encounter misogynistic content early, while girls often change their behaviour or avoid certain apps to stay safe.
The Human Cost Hits Hard Every Day
Real stories show why this pressure matters. Women in public life, journalists, and everyday users report constant harassment that affects their mental health and careers.
One in five women in the UK has faced online abuse or harassment, much of it carrying sexist or misogynistic overtones. Among younger women the numbers climb higher, and Gen Z women report significant mental health impacts from repeated exposure.
Girls as young as 11 sometimes miss school to avoid sexual harassment linked to online behaviour. Many limit what they post or avoid speaking out because they fear pile-ons.
The guidance also recognises harm to boys. Algorithmic feeds can push toxic content that shapes harmful ideas about masculinity. This creates a cycle that hurts everyone.
Platforms have huge power over what millions see each day. Their algorithms can amplify hate or help reduce it. Kendall wants them to choose the latter and go above and beyond minimum legal requirements.
What Comes Next for Platforms and Users
The end-of-year deadline gives platforms time to make changes but sends a clear signal that delays will not be tolerated. Ofcom will monitor progress closely and share findings publicly.
This fits into broader work including better age checks, safer recommendation systems, and faster removal of illegal content. Government also plans further support for victims and education in schools about healthy relationships.
Experts welcome the focus but stress the need for real enforcement. Women’s groups have long called for platforms to take these issues more seriously instead of treating them as inevitable side effects of free speech.
Smaller changes like better privacy controls and easier reporting can make a big difference quickly. Larger shifts in how algorithms work will take more time but could prevent harm before it starts.
Women and girls deserve to use the internet without fear. The digital world should open doors, not close them through constant abuse.
This latest push from Kendall shows the government means business. Platforms now have clear expectations and a firm timeline. How they respond in the coming months will shape whether the online space becomes safer for everyone.
The coming year will test whether tech companies are ready to match the government’s ambition. For millions of women and girls checking their phones every day, the stakes could not be higher.
What do you think about these new demands on social media platforms? Share your views in the comments below. Have you or someone you know faced online misogyny? Your experiences matter in this important conversation.