Ofcom Dismisses OnlyFans’ Complaint Over BBC Investigation into Child Abuse Images
The UK’s communications regulator, Ofcom, has dismissed a complaint lodged by OnlyFans over a BBC investigation that reported allegations that images of child abuse had originated from the platform. OnlyFans, known for hosting adult content, argued that the BBC’s reporting was unfair. Ofcom rejected the complaint, finding that the coverage had treated the platform fairly.
The controversy began with a 2022 BBC report featuring allegations from a senior US investigator, who claimed to have discovered, on other platforms, ten child abuse images bearing watermarks that indicated they had originated from OnlyFans. These images, reportedly created within the preceding six months, depicted pre-pubescent children being directed to produce abusive material.
The allegations were aired on the BBC program Newsnight and included in the documentary “OnlyFans Uncovered” on BBC iPlayer. To protect ongoing investigations, the investigator’s identity and specific account details were not disclosed. This withheld detail formed the crux of OnlyFans’ complaint: the platform said that without it, it could neither verify nor refute the allegations.
OnlyFans contended that the absence of specific data, such as account handles or URLs, prevented it from determining whether the images had been posted on its platform and, if so, how quickly they had been removed. The platform argued that this lack of information denied it a meaningful right of reply and left viewers with a misleading impression of its safety measures.
In its written decision, Ofcom concluded that OnlyFans had been given adequate information to understand the nature of the allegations and had been afforded a reasonable opportunity to respond. Ofcom also dismissed a related complaint by OnlyFans concerning an article on the BBC News website, deeming the article impartial and fair.
This ruling comes amid increasing scrutiny of OnlyFans’ content moderation practices. Previous BBC investigations in 2021 revealed that minors had sold and appeared in videos on the platform and that its moderators sometimes issued multiple warnings before shutting down accounts posting illegal content. Following these revelations, OnlyFans claimed to have implemented stringent measures to improve age and identity verification, asserting that it had transformed into the “safest social media site in the world.”
Keily Blair, then chief strategy officer and now CEO of OnlyFans, emphasized the company’s substantial investment in enhancing its safety protocols. Despite these efforts, concerns persist. Earlier this month, Ofcom launched an investigation into whether OnlyFans is doing enough to prevent underage users from accessing pornographic material. This probe will also examine whether OnlyFans provided complete and accurate information in response to the regulator’s requests.
In response to Ofcom’s investigation, OnlyFans cited a technical problem, saying it had experienced a “coding configuration issue” affecting some age thresholds on the site. A spokeswoman for the company insisted that these thresholds had always been set above 18 and that the company had proactively corrected the error in its reports to Ofcom.
Ofcom’s ongoing oversight includes monitoring compliance with UK laws and regulations aimed at protecting children from harmful content. Since November 2020, the regulator has been responsible for overseeing video-sharing platforms with a UK presence, including OnlyFans, Twitch, TikTok, Snapchat, and Vimeo, and for ensuring that they implement robust measures to prevent minors from accessing inappropriate content.
As the investigation proceeds, the outcome could significantly impact OnlyFans’ operational policies and broader industry practices regarding online safety and content moderation. The findings may set precedents for how digital platforms handle allegations of illegal content and their responsibilities in ensuring user safety.
The case illustrates the difficulty content platforms face in balancing user-generated material with robust safety measures while meeting regulatory requirements and maintaining public trust.