- The Online Safety Act imposes new legal obligations to protect minors and adults online.
- Ofcom is the regulatory body with the power to impose sanctions and monitor compliance.
- Mandatory age controls are being introduced on websites with sensitive content, along with rapid reporting measures.

The way we use the Internet is undergoing a radical change in the United Kingdom thanks to the entry into force of a new law: the Online Safety Act. This groundbreaking regulation, which places a special focus on the protection of minors, requires platforms, social networks, and search engines to implement technical, legal, and organizational measures to safeguard users from illegal and harmful content.
If you're wondering what exactly this law entails, how it will affect the online user experience, what changes it introduces, and what risks or benefits it brings, here is a comprehensive analysis. The Online Safety Act marks a turning point in the British digital ecosystem, and its approach is already being replicated in other countries.
What is the Online Safety Act and why is it so important?
The Online Safety Act was born from a desire to make the internet safer, especially for younger people, but it affects all users and platforms operating in the United Kingdom. Essentially, it is a legislative package that imposes a range of obligations on websites, apps, and online services that allow users to share or consume content.
Its main objective is to force technology companies, forums, social networks, video sites, search engines, and instant messaging services to remove illegal or harmful content, and to prevent it from appearing in the first place. The law also aims to ensure that minors' online experience is healthier, more transparent, and less exposed to psychological harm, harassment, pornography, or hate speech.
The body responsible for supervising compliance and imposing sanctions is Ofcom, the British media regulator, which now has enhanced powers to investigate, audit, and even block access to problematic services. And the law does not only affect companies based in the UK: any website or app that is accessible and relevant to British users falls within the scope of the regulation.
Who is affected by the Online Safety Act?
The scope of the Online Safety Act is much broader than it might seem: it covers any platform or service where users can share, upload, or interact with content, including:
- Social networks (Facebook, X, Instagram, TikTok and similar)
- Video and streaming portals such as YouTube or Twitch
- Forums, instant messaging apps, and group chats
- Dating sites and matchmaking apps
- Cloud file storage and sharing systems
- Search engines and content aggregators (such as Google, Bing, or DuckDuckGo)
- Multiplayer online gaming platforms
- Pornography and adult content sites
- Even blogs and small sites that allow comments or interaction between users
It doesn't matter if the company is based in another country: if the service has users in the UK, can be used from there, or is considered by Ofcom to pose a tangible risk to British users, it must comply with the obligations. Furthermore, all terms of service, legal notices, and reporting or complaint procedures must be clearly accessible and, where necessary, adapted to minors.
Main obligations for online platforms and services
Tech companies, large and small, have new duties that must be fulfilled according to the size, risk, and nature of their service:
- Assess the risk that users (especially children) may be exposed to illegal or harmful content.
- Prevent the appearance of illegal content (e.g., child sexual abuse material, hate speech, extreme violence, promotion of suicide, or the sale of weapons and drugs), and quickly remove it if detected.
- Establish effective mechanisms for users to report illegal content, harassment, abuse, or failures in protection or moderation, and act upon complaints.
- Implement procedures to address complaints and provide redress in the event of inappropriate actions, such as the erroneous removal of legitimate content.
- Design websites and apps with safety in mind, opting for safer default settings for minors and for systems that make it harder for problematic material to go viral.
- Transparently publish the strategies, technologies, and processes used to comply with legal obligations, as well as codes of good practice and proactive measures.
- In certain cases, provide tools that let adults personalize their experience, such as choosing to hide content from anonymous users or to avoid certain categories of messages, even legal ones.
- Record and retain all documentation related to compliance procedures and the safety decisions taken.
Child Protection: Shielding Against Harmful Content
The Online Safety Act devotes its highest priority to children's safety online. Platforms, apps, and websites that may be used by minors must implement systems that effectively prevent access to content such as:
- Pornography and sexually explicit material
- Content that encourages suicide, self-harm, or eating disorders
- Violent, humiliating, misogynistic material, dangerous challenges and bullying
- Incitement to hatred based on race, religion, sexual orientation, gender identity, or disability
- Bullying, hate campaigns and any other form of digital abuse
- Content that encourages minors to ingest, inhale, or expose themselves to harmful substances
From July 25, 2025, truly effective age assurance systems are mandatory: checkbox controls or questions without real verification are no longer valid. Methods accepted by Ofcom may include biometric checks, online document verification (ID, passport, or driver's license), bank or mobile phone validation, facial age estimation, or "digital identity wallets" for adults, among other approved systems. These controls must also be inclusive and must not exclude more vulnerable groups.
Platforms are also required to inform parents and minors in a simple and clear manner about the risks, available protection tools, website policies, and ways to report problems.
New criminal offenses and sanctioning regime
The Online Safety Act creates new, specific criminal offenses and toughens prosecutions for online threats and hate speech. Some notable examples:
- “Cyberflashing”: non-consensual sending of sexual photos (genitals), including via instant messaging apps.
- Spread of pornographic deepfakes: Creating or sharing fake, realistic-looking images or videos to humiliate, harass, or damage another person's reputation.
- Sending false information with the intention of causing psychological or physical harm (beyond jokes or irony, intent or gross negligence must be demonstrated).
- Threats: Sending messages that include threats of death, sexual violence or serious injury, whether by text, voice or images.
- Trolling people with epilepsy: intentionally sending flashing images or sequences to trigger seizures.
- Encouraging or assisting self-harm or suicide.
Penalties range from fines and the blocking of the websites and apps involved to imprisonment for executives and managers who fail to comply with specific requirements or cover up incidents. Ofcom can order banks, advertisers, or ISPs to stop providing services to websites that violate the law, cutting off their revenue and reach. Users can also take legal action if they feel their rights have been violated or their complaints ignored.
How does the Online Safety Act affect businesses, administrators, and moderators?
The most radical change is the leap from goodwill self-regulation to direct legal liability: if you run a forum, a site with comments, or an online community relevant to British users, you are now responsible for ensuring that your space does not become a source of foreseeable harm.
You must document your procedures, allocate resources to handling complaints, and modify your website or app architecture to comply with Ofcom requirements. This means:
- Build and keep up to date rapid removal systems for prohibited content
- Monitor the spread of suspicious materials (including through artificial intelligence)
- Strengthen access controls and offer parental control tools
- Provide communication and support channels for parents and those affected
- Designate internal managers identifiable to Ofcom and users
- Record all relevant decisions and changes
What are the penalties and consequences of breaking the law?
Fines can reach £18 million or 10% of the company's global turnover, whichever is greater. Furthermore, executives can be prosecuted if they withhold information from Ofcom or obstruct inspections. In serious cases, a judge can order the service to be blocked entirely in the UK and its relationships with banks, advertisers, and internet providers to be severed.
Websites should refrain from encouraging users to use VPNs or other methods to bypass age controls, as this is considered an aggravating factor. Following the introduction of mandatory verification on porn sites, thousands of Britons began downloading VPNs to circumvent these barriers, prompting active scrutiny from the regulator.
Online Safety Act: Criticism, controversy, and public debate
Not everyone agrees with this law. Some parents' and victims' associations believe the regulations should be even stricter and want minors under 16 to be banned from social media. Meanwhile, groups specializing in digital privacy and freedom of expression warn of serious risks:
- Age checks can be overly intrusive and increase exposure to identity theft or security breaches.
- There are fears that the requirement to monitor messages and files will lead to an erosion of end-to-end encryption, opening the door to mass surveillance.
- The high cost of compliance can force small forums or independent websites to close, leaving the space solely in the hands of large multinationals.
- Over-blocking can occur, with adults restricted from legitimate content (e.g., alcohol support forums or mental health discussions) because platforms moderate too aggressively for fear of sanctions.
There is also criticism from international organizations, which warn of the danger of granting the government excessive powers over the regulation of online content, with few mechanisms for parliamentary oversight.
Editor specialized in technology and internet issues with more than ten years of experience in different digital media. I have worked as an editor and content creator for e-commerce, communication, online marketing, and advertising companies. I have also written for websites on economics, finance, and other sectors. My work is also my passion. Now, through my articles in Tecnobits, I try to explore all the news and new opportunities that the world of technology offers us every day to improve our lives.