Stephen Kinsella

Our submission to Ofcom’s call for evidence on its “third phase of online safety regulation”

Updated: May 28

Clean Up The Internet last week submitted further evidence to Ofcom, setting out our recommendations for the content of its Guidance for Category One services on how to comply with the user identity verification duty.


Section 64 of the Online Safety Act introduces the user identity verification duty, requiring Category One platforms to offer all adult users an option to verify their identity. Section 15(9) requires the same platforms to allow users to filter out non-verified users. Section 65 requires Ofcom to produce Guidance on how to comply.


We have reservations about applying these requirements only to Category One platforms, because this will mean that some sites associated with significant levels of misuse of fake and anonymous accounts are not covered. For example, under Ofcom’s current proposed criteria for designating a platform as Category One, it seems unlikely that any standalone dating platform would be designated, even though the ease of creating fake profiles on dating sites is an important enabler of romance fraud. We have set out in previous submissions to Ofcom why the logic of the Act, and its ambition to make the UK online environment safe, means that user identity verification measures should be recommended in the Illegal Content Codes of Practice. This would ensure that they can be applied on any site where there are relevant risks of misuse of anonymous and fake accounts.


Nonetheless, the user identity verification duty and the accompanying filters for Category One services still have the potential to be among the most significant, and popular, changes driven by the Online Safety Act. Properly applied and enforced, these duties would make a substantial contribution to tackling the misuse of fake and anonymous accounts on some of the largest platforms, driving change on sites where such accounts are currently a factor in a very wide range of harms.


For this considerable potential to be realised, Ofcom’s guidance will need to set the right framework for platforms to follow. In the absence of regulatory supervision, platforms have an extremely poor record of mitigating harms from fake and anonymous accounts, or of implementing robust and effective verification schemes. Without sufficiently clear and robust guidance, there’s every reason to fear more of the same delay, obfuscation, platitudes and half-measures which have characterised platforms’ (in)action in this area to date.


Our paper therefore sets out five recommendations for Ofcom:


  1. The guidance must address the full range of relevant harms where fake and anonymous accounts are a risk factor

  2. The guidance must seek to raise the bar, and must not simply collate existing “best practice” which sadly is better described as “worst practice”

  3. The guidance should align with other relevant OSA codes, and other relevant regulation, standards, and guidance

  4. The guidance must set out criteria which platforms must satisfy for an identity verification scheme to fulfil the duty

  5. The guidance should offer examples of methods and processes which will not be considered to satisfy the duty, and some which could be


Our full submission can be viewed here:


Clean Up The Internet submission to phase 3 call for evidence - recommendations for s65 gu (download, 153KB)

