David Babbs

Parliamentary discussion of Online Harms, anonymity and disinformation continues during lockdown

No one would expect the Online Harms agenda to be the top priority for government or parliament during a global health emergency. Ministerial and Civil Service capacity has been refocused on the pandemic, and the parliamentary timetable has been disrupted. It was inevitable that this would delay the government's full response to the Online Harms White Paper consultation, which in February we had been told to expect around now.

On the other hand, the pandemic has also reinforced the importance and urgency of tackling Online Harms. Social distancing has driven more and more of our lives online – including parliament and democratic discussion. If there was any debate still to be had as to whether digital platforms are really that important to democratic debate, or to what extent large social media platforms such as Twitter or Facebook constitute part of the “public square”, surely the pandemic has settled the question. Simultaneously, the spread of disinformation across social media – undermining public health messages, and apparently fuelling arson attacks on mobile phone masts – demonstrates clearly how a dysfunctional digital public square poses risks to society as well as to individuals.

Caroline Dinenage, Minister of State for Digital and Culture in the Department for Digital, Culture, Media and Sport, gave oral evidence to the Lords Committee on Democracy and Digital Technologies on 12 May. She acknowledged that the “experience of covid has brought into sharp focus why this is important”. She was, however, not able to give firm dates as to when the delayed follow-up to the consultation, or the legislation, would be forthcoming. She repeated this position when she appeared before the Home Affairs Select Committee a day later, on 13 May, willing to say only that “our ambition is for it to happen in this Session”, and that there remains an “intention” to include a stage of pre-legislative scrutiny.

Despite this unwillingness to commit to a firm timetable, Caroline Dinenage did appear keen to convey a commitment to delivering the legislation. She emphasised to the Lords committee that the government remains "really committed to the Online Harms bill", that it’s something she is "personally really passionate about", and that she’s "absolutely determined to introduce legislation as soon as possible". She mentioned the need for a “duty of care” and a “regulator with a set of sanctions”, and repeated that the government is minded to appoint Ofcom. She struck a similar tone with the Home Affairs Select Committee, stating that, “it is such an urgent piece of legislation for us. Both the DCMS and the Home Office are really keen to accelerate it as much as we can.”

Dinenage’s recent statements contrasted somewhat in tone with her Secretary of State’s comments to the DCMS Select Committee on 22 April, and were perhaps intended to provide reassurance. Oliver Dowden’s description of an Online Harms bill merely as “legislation that we are looking at bringing forward” did not suggest a great sense of urgency. His description of the function of a regulator as being to tell the tech companies “Just stick by your terms and conditions”, and his repeated references to the risks of over-regulation, suggested a distinct lack of enthusiasm for getting tough with tech companies. The committee chair, Julian Knight MP, observed to Dowden that “the impression that this Committee has is effectively that the Government are starting to row back on this”. Dowden’s comments sparked fears, voiced in the Daily Telegraph, that the legislation could be delayed until 2023 and that “tech giants will exploit the delay to lobby ministers to water down the plans”. The extent to which there are differences of substance, as well as of tone, between Dowden and Dinenage remains to be seen.

Clean Up The Internet is taking a keen interest in the Online Harms agenda because we see it as one of the most promising routes for tackling the misuse of anonymity on social media – alongside other platform design flaws which fuel incivility, abuse and misinformation. It would be possible for social media companies to implement such changes to the design of their platforms voluntarily, but they have failed to do so. We see the idea of a “duty of care” as potentially very helpful in this regard – if correctly designed, it could require social media companies to demonstrate to a regulator how they have mitigated risks of harm, such as those posed by anonymous users, through the design of their platforms. We set out our thinking about this in much more detail in a report published last month.

From this perspective, we have been encouraged by the extent to which the issue of anonymity has now been taken up by a range of MPs and Peers, and by how clearly it has emerged as an issue of cross-party concern. Parliamentarians on all the committees engaged with the Online Harms agenda raised the issue of anonymity, including with references to Clean Up The Internet’s report and proposal. John Nicolson MP asked the Secretary of State directly about our research, and secured an acknowledgement that “if people feel that they can act anonymously, they will act in a more aggressive fashion. I certainly see it in respect of correspondence I receive and engagement I receive on social media”. Despite his more lukewarm tone towards regulation, Dowden did appear to accept that this issue falls within the scope of the Online Harms agenda, saying “it is exactly those sorts of issues that we seek to work through, and in respect of the online harms we are examining those kinds of issues”.

Dinenage also accepted, when questioned by Diane Abbott MP and Tim Loughton MP, that “in many cases [anonymity] can be harmful”. She indicated more broadly that she saw the need for a regulator to operate at the systemic level, and interrogate the design choices made by the platforms, of which their approach to anonymity would be an obvious example. She explained that she thought a regulator should be able to “look at the design choices that some companies have made and be able to call those into question”.

Both Dowden and Dinenage indicated that anonymity was a subject of live discussion, with Dinenage describing herself as “wrestling” with it. Both highlighted that anonymity can, in other circumstances, be benign and important for freedom of expression. This is obviously correct, although it was unfortunate that neither minister has yet recognised the potential for solutions which restrict abuse of anonymity while preserving its availability for legitimate uses. In practice, there is considerable scope for action in this area which does not threaten freedom of expression or whistleblowing.

It does at least seem clear that very few of the parliamentarians on these various committees are happy with the laissez-faire approach to anonymity which the social media platforms currently follow. When the social media companies appeared before the DCMS Select Committee, this point was put to them very strongly. John Nicolson MP challenged Katy Minshall of Twitter very directly on the issue of verification, asking, “There is a lot of venom and poison on Twitter and one of the big problems is the lack of verification. Why do you not verify people’s identities?” Philip Davies MP challenged her to “tell me how many of the problems stem from bots and anonymised accounts on Twitter.”

The committee chair, Julian Knight MP, was not at all convinced by Ms Minshall’s answers on behalf of Twitter, or those of her colleagues from Google and Facebook. He issued a statement afterwards, complaining that “The defensive position demonstrated by the representatives sent by Twitter, Facebook and Google was deeply unhelpful and failed in clarifying what they are doing to tackle the threat posed by record levels of misinformation and disinformation online about Covid-19, some of it deadly. The lack of information and detail in the answers we were given showed a disregard for the important process of scrutiny.” He has subsequently written to them requesting more factual responses.

It was perhaps a little unfair that Twitter bore the brunt of the questioning around unverified and inauthentic profiles. Facebook, who appeared next, have repeatedly been shown to host similar levels of deception notwithstanding their so-called “real name policy”. It is important, in making the case for systemic regulation, to recognise that the problems associated with the abuse of anonymity and pseudonymity are found on all the major social media platforms. Different approaches to managing anonymity and verification on social media have the potential to reduce levels of abuse and misinformation across the board, not just on Twitter.
