LOS ANGELES (December 16, 2024) — Stating that it is “time for tech firms to act,” Ofcom, the UK’s online safety regulator, has published its first codes of practice and guidance for platforms that publish pornographic material and host user-generated content.
Firing the starting gun on tech firms’ new duties under the Online Safety Act, Ofcom has issued new rules for tackling illegal harms such as terror, hate, fraud, child sexual abuse, and assisting or encouraging suicide. The codes arrive four months ahead of the statutory deadline, bringing the UK’s online safety regulations into force.
Providers now have three months to complete their illegal harms risk assessments and set out how they will comply with the more than 40 safety measures platforms will be required to introduce by the March deadline. These measures impose new safety duties on social media firms, search engines, messaging, gaming and dating apps, and pornography and file-sharing sites.
According to Ofcom, people in the UK will now be better protected from illegal harms online, as tech firms are legally required to take action to tackle criminal activity on their platforms and make them safer by design.
The Online Safety Act lists more than 130 “priority offenses,” and tech firms must assess and mitigate the risk of these offenses occurring on their platforms.
Among its regulations, Ofcom expects each provider to name a senior manager who will be held personally accountable for the platform’s compliance. It also requires platforms to implement better moderation, easier reporting, and built-in safety checks so that illegal material is removed as quickly as it is found.
High-risk providers are required to use automated tools, such as hash-matching and URL detection, to identify child sexual abuse material (CSAM). The scope of this requirement has been expanded to capture smaller file-hosting and file-storage services, which pose an exceptionally high risk of being used to distribute CSAM.
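In practical terms, hash-matching compares a digital fingerprint (“hash”) of an uploaded file against a database of hashes of known illegal material supplied by bodies such as the Internet Watch Foundation. The sketch below illustrates the basic idea using an exact cryptographic hash; the hash list and function names are hypothetical stand-ins for illustration, not Ofcom-specified or vendor tooling.

```python
# Minimal sketch of exact hash-matching against a known-hash list.
# KNOWN_HASHES is a hypothetical stand-in for a curated database of
# SHA-256 digests of known illegal files.
import hashlib
from pathlib import Path

KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large uploads never sit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(path: Path) -> bool:
    """Flag an upload whose digest appears in the known-hash database."""
    return sha256_of(path) in KNOWN_HASHES
```

An exact match like this only catches byte-identical copies; production systems typically rely on perceptual hashes, which survive re-encoding and cropping.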
Ofcom notes that while it will support providers in complying with these new duties, it is simultaneously gearing up to take early enforcement action against any platforms that fail to do so.
“For too long, sites and apps have been unregulated, unaccountable, and unwilling to prioritize people’s safety over profits. That changes from today,” declared Ofcom’s Chief Executive, Dame Melanie Dawes. “The safety spotlight is firmly on tech firms, and it’s time for them to act. We’ll be watching the industry closely to ensure firms [meet] the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.”
“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them,” Dawes added.
Those powers include fining companies up to £18m ($18,916,200) or 10 percent of their qualifying worldwide revenue, whichever is greater, and, in severe cases, obtaining a court order to block access to a site in the UK. For a firm with £1bn in qualifying revenue, for example, the 10 percent figure would put the ceiling at £100m.
The regulator also notes that “2025 will be a year of change,” as this is just the beginning of setting up an enforceable regime as a firm foundation on which to build. Further requirements are expected in the spring as Ofcom evaluates additional proposals, including the use of AI and more advanced forms of hash-matching to prevent the sharing of non-consensual intimate imagery and terrorist content.
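As a rough illustration of what more advanced hash-matching involves, perceptual hashes are compared by similarity rather than strict equality, so visually near-identical images still match. The sketch below assumes 256-bit hashes in the style of the open-source PDQ algorithm; the threshold shown is a commonly cited default, not a figure Ofcom has specified.

```python
# Sketch of fuzzy (perceptual) hash comparison. Unlike the exact SHA-256
# match above, near-duplicates such as re-encodes or crops can still match.

def hamming_distance(a: int, b: int) -> int:
    """Count the differing bits between two 256-bit perceptual hashes."""
    return (a ^ b).bit_count()  # int.bit_count() requires Python 3.10+

def is_near_match(upload_hash: int, known_hash: int, threshold: int = 31) -> bool:
    """PDQ deployments commonly treat a distance of roughly 31 of 256 bits as a match."""
    return hamming_distance(upload_hash, known_hash) <= threshold
```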
Any technology Ofcom requires a provider to use will need to be accredited by the regulator or its appointee to meet the minimum standards of accuracy established by the government. These standards are under review, with public comments welcome before March 10, 2025.
The complete statement from Ofcom can be found here.
Tim Henning, executive director of the Association of Sites Advocating Child Protection (ASACP), who continues to consult with Ofcom on behalf of industry stakeholders, describes the UK regulator’s initiatives as “a reasonable and thoughtful approach to online child protection.”
“Unlike some ‘child protection’ proposals in the U.S. and elsewhere that are thinly veiled attempts at the outright prohibition of pornography, the UK is leaving the door open for responsible platforms and providers to serve the needs of consenting adults,” Henning explained. “By taking an active role in the process, the association can offer guidance on the protocols and technologies that are most suited to this noble task and clarify their limitations and opportunities for improvement.”
“We hope that implementing an economically feasible and successful approach to online child protection by the UK will inspire other jurisdictions to find more balanced ways of meeting this universal need,” Henning concluded. “Enabled by the support of our sponsors, ASACP will advocate for the industry to produce a workable outcome.”
To learn more about how your business can help protect itself by protecting children, email tim@asacp.org.
About ASACP
Founded in 1996, ASACP is a nonprofit organization dedicated to online child protection. ASACP comprises the Association of Sites Advocating Child Protection and the ASACP Foundation. ASACP is a 501(c)(4) social welfare organization that manages a membership program providing resources to help companies protect minors online. The ASACP Foundation is a 501(c)(3) charitable organization responsible for the CP Reporting Tipline and the RTA (Restricted To Adults) website meta-labeling system.
ASACP has invested 28 years in developing progressive programs to protect minors, and its assistance to the digital media industry’s child protection efforts is unparalleled. For more information, visit ASACP.org.