The UK communications regulator, Ofcom, has unveiled new draft guidance designed to help firms meet their legal obligations under the Online Safety Act (OSA) to protect women and girls from online harms. The guidance, published recently, aims to tackle issues such as harassment, bullying, misogyny, and intimate image abuse, which disproportionately affect women and girls on digital platforms. The initiative emphasizes a "safety by design" approach, compelling platforms and services to build in safety measures from the outset.
The draft guidance outlines strategies across four key areas: online misogyny, pile-ons and harassment, domestic abuse, and intimate image abuse. It recommends practical steps such as removing geolocation features by default, conducting "abusability" testing, enhancing account security, and using hash matching technology to curb intimate image abuse. There is also a focus on addressing the rapid rise of deepfake technology, which has fueled a significant increase in non-consensual intimate image abuse.
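The guidance does not prescribe a particular implementation, but the general shape of hash matching is simple to sketch. In the illustrative Python below (all names are hypothetical, not taken from the guidance), each uploaded image is hashed and checked against a set of hashes of known non-consensual intimate images. Real deployments typically use perceptual hashes such as PDQ or PhotoDNA, often shared through schemes like StopNCII, so that re-encoded or lightly edited copies still match; a cryptographic hash is used here only for brevity.

```python
import hashlib

# Hypothetical store of hashes for known non-consensual intimate images.
# In practice this would be populated from a trusted hash-sharing scheme,
# not hard-coded; the entry below is only a placeholder.
KNOWN_ABUSE_HASHES: set[str] = {
    "0" * 64,
}

def image_fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest for an uploaded image (cryptographic hash, for brevity)."""
    return hashlib.sha256(image_bytes).hexdigest()

def should_block_upload(image_bytes: bytes) -> bool:
    """Flag the upload for blocking or review if it matches a known abusive image."""
    return image_fingerprint(image_bytes) in KNOWN_ABUSE_HASHES
```

The design point is that the check happens at upload time, before the content is distributed, rather than relying on user reports after the harm has already occurred.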
Jessica Smith, a spokesperson for Ofcom, underscored the importance of compliance with UK law for platforms operating in the country. She stated:
“Platforms that are operating in the UK will have to comply with the UK law.”
The Online Safety Act imposes duties on services to prevent illegal harms and protect children, and non-compliance can result in penalties of up to 10% of a company's global annual turnover. The first set of duties under the OSA is set to come into force next month, with Ofcom prepared to enforce those core duties even before the guidance takes full effect.
In developing the draft guidance, Ofcom collaborated with victims, survivors, women's advocacy groups, and safety experts. This collaboration aimed to ensure the guidance is comprehensive and practical. Jessica Smith highlighted the proactive approach expected from service providers:
“We think that there are sensible things that services could do at the design phase which would help to address the risk of some of those harms.”
The draft includes specific recommendations such as:
- Removing geolocation by default, to shrink privacy and stalking risks (one possible shape of this is sketched below).
- Conducting ‘abusability’ testing to identify how a service could be weaponized or misused.
- Designing in user prompts intended to make posters think twice before posting abusive content.
- Offering accessible reporting tools that let users report issues.
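As one illustration of what "off by default" could look like in practice, here is a minimal sketch (Python, with hypothetical setting names that are not drawn from the guidance) in which a new account starts with geolocation sharing disabled and the user must opt in explicitly.

```python
from dataclasses import dataclass

@dataclass
class NewAccountSettings:
    """Hypothetical safety-by-design defaults for a newly created account."""
    share_geolocation: bool = False       # location sharing is opt-in, never on by default
    public_profile: bool = False          # new profiles start private
    dms_from_non_followers: bool = False  # strangers cannot message the user unprompted

def create_account(username: str) -> NewAccountSettings:
    """Every new account starts from the safest defaults; users can loosen them later."""
    return NewAccountSettings()

if __name__ == "__main__":
    settings = create_account("example_user")
    print(settings)
```

The mechanism here is trivial; what matters is the default itself, since users who never touch their settings end up with the most protective configuration.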
The new guidance comes at a critical time as Ofcom has gathered substantial evidence on the effectiveness of hash matching tools and the troubling rise of AI-generated deepfake images. These advancements necessitate urgent measures to safeguard vulnerable groups online.
Ofcom plans to update its earlier codes with these new guidelines shortly. The current draft will be open for consultation until May 23, 2025, after which final guidance will be released by the end of the year. Smith emphasized the ongoing nature of this initiative:
“Once we finalize the guidance, we will produce a [market] report … about who is using the guidance, who is following what steps, what kind of outcomes they’re achieving for their users who are women and girls, and really shine a light on what protections are in place on different platforms so that users can make informed choices about where they spend their time online.”
The goal is not only to enhance user safety but also to help companies avoid reputational damage by providing actionable steps to improve user experiences. Smith noted:
“There’s still a lot of work to do across the industry.”
Ofcom's proactive stance aims to encourage immediate action from platforms even before the finalization of the guidelines. Transparency powers will play a crucial role in ensuring accountability:
“I think this is where our transparency powers also come in — if the industry is changing direction and harms are increasing, this is where we will be able to shine a light and share relevant information with UK users, with media, with parliamentarians.”