David Babbs

Parliamentary discussion of online harms, misinformation and abuse – a summer recess 2020 round-up

The UK parliament has now broken for summer recess, so it’s a good time for another round-up of the latest developments.


From the government itself, there has been very little visible progress since the last of these updates in May. At Oral Questions on 4 June, MPs from all the major parties expressed concern and frustration about the slow pace of progress. The Secretary of State, Oliver Dowden, sought to reassure MPs that work is taking place behind the scenes, claiming that he is “taking decisions on pretty much a daily basis”. However, he again refused to be drawn on an exact timetable, saying only:

“I can see that the House is trying to nail me down to an exact date for a Bill that will be introduced in this Session. I can assure him that it will be introduced within the year.”


On 29 June, the House of Lords Select Committee on Democracy and Digital Technologies issued their new report, “Digital Technology and the Resurrection of Trust”. They argue that democracy faces a “daunting new challenge” from the digital media landscape, which they describe as “dominated by two behemoths - Facebook and Google”.


The committee draws on the long-established Nolan Principles for Standards in Public Life to set out six principles they identify as essential to a healthy democracy: informed citizens; accountability; transparency; free and fair elections; inclusive public debate; and an active citizenry. They argue that the current state of the digital media landscape undermines each of these in a variety of ways, and they structure their 45 recommendations around how digital technology could instead start reinforcing these principles.


They see the Online Harms Programme, together with a significant updating of electoral law, as essential to “protect democracy” from being seriously undermined. They are therefore extremely concerned about the slow progress, and the risk that delay might mean the legislation gets watered down in the face of lobbying from Big Tech:

“It [Online Harms legislation] needs to happen; it needs to happen fast; and the necessary draft legislation must be laid before Parliament for scrutiny without delay. The Government must not flinch in the face of the inevitable and powerful lobbying of Big Tech and others that benefit from the current situation.”

The report includes a section carefully exploring the role of anonymity, identity concealment and identity deception on social media:

"A substantial amount of content on digital platforms is posted by anonymous users who may not be genuine users at all. Indeed, it is sometimes posted by bad actors using sophisticated techniques to spread misinformation and abuse, and to undermine democratic debate. In general, we believe there should be a presumption against anonymity on digital media however we recognise that for many this is not possible. Anonymity can be important to protect freedoms, for example where people from ethnic minority groups want to have a voice in debates but are afraid of retaliation or abuse, where LGBT+ people may not be ready to come out or live in jurisdictions where homosexuality is criminalised, or where journalists and citizens are living in autocratic regimes. However, there is a significant proportion of those who use anonymity who use it to abuse, to troll, to silence alternative views, or to spread hate."

The committee makes recommendations which are very much in line with Clean up the Internet’s own proposals - to restrict abuse of anonymity, whilst safeguarding legitimate and benign uses:

“Users should be empowered to verify themselves; those who wish to be anonymous should remain so, but other users should be able to filter anonymous users out. Ofcom should work with platforms and the Government’s Verify service, or its replacement, to enable platforms to allow users to verify their identities in a way that protects their privacy. Ofcom should encourage platforms to empower users with tools to remove unverified users from their conversations and more easily identify genuine users.”


On 21 July, the DCMS Select Committee’s disinformation subcommittee published their latest report, “Misinformation in the COVID-19 Infodemic”. This report has a much narrower focus than the Lords’ report discussed above, but reaches very similar conclusions about the need for regulation.


The Committee Chair, Conservative MP Julian Knight, argues that the tech platforms have failed to respond adequately to Covid-19 misinformation, whether through measures taken voluntarily or as a result of informal discussions with government. The pandemic has therefore added even more urgency to the case for regulation:

“We are calling on the Government to name the Regulator now and get on with the ‘world-leading’ legislation on social media that we’ve long been promised.
“The leaders of social media companies have failed to tackle the infodemic of misinformation. Evidence that tech companies were able to benefit from the monetisation of false information and allowed others to do so is shocking. We need robust regulation to hold these companies to account.
“The coronavirus crisis has demonstrated that without due weight of the law, social media companies have no incentive to consider a duty of care to those who use their services.”

The committee identifies a range of different actors pushing misinformation, from foreign states and extremists pursuing political agendas, to commercial scammers, to ordinary but misguided members of the public. In their view, the social media companies have failed to tackle any of these sources adequately.


They identify a range of serious societal harms which stem from this failure: threats to public health, such as people taking false treatments or failing to seek treatment; 5G conspiracy theorists attacking phone masts and telecommunications engineers; and harassment of ethnic minorities who have been accused of “spreading the virus”. They conclude not only that tech companies have failed to take sufficient action, but also that it is often against their commercial interests to do so:

“The need to tackle online harms often runs at odds with the financial incentives underpinned by the business model of tech companies.”

A crucial argument made in this report is that the scope of Online Harms legislation must include “legal but harmful” material, such as the coronavirus misinformation so prevalent during the pandemic. In his own evidence to the committee, Oliver Dowden appeared to suggest he favours a narrower focus on illegal content, combined with regulatory oversight of how the platforms enforce their own terms of service. The Committee is clear that “legal but harmful” content must be within the scope of regulation, as originally proposed, and that the legislation should provide for a process to decide which “harms” the regulator should look at.

"We strongly recommend that the Government bring forward a detailed process for deciding which harms are in scope for legislation. This process must always be evidence-led and subject to democratic oversight, rather than delegated entirely to the regulator. Legislation should also establish clearly the differentiated expectations of tech companies for illegal content and ‘harmful but legal’.
These technologies, media and usage trends are fast-changing in nature. Whatever harms are specified in legislation, we welcome the inclusion alongside them of the wider duty of care, which will allow the regulator to consider issues outside the specified list (and allow for recourse through the courts). The Committee rejects the notion that an appropriate definition of the anti-online harms measures that operators should be subject to are simply those stated in their own terms and conditions

On the specific question of anonymity, the committee recognised the contribution of Clean up the Internet’s own research, and was critical of Twitter’s reluctance to engage with the problems posed by anonymity.

“In oral evidence, we suggested that, given it amounts simply to a validation of identity, Twitter could offer verification to all users to prevent the blue tick being considered as an endorsement, as well as to tackle anonymous abuse. Research from Clean Up the Internet conducted during lockdown has demonstrated a clear link between anonymous Twitter accounts and the spread of 5G conspiracy theories about COVID-19.”

The DCMS Committee concluded, therefore, that questions of anonymity and identity verification should be within the scope of future regulation:

“The new regulator should be empowered to examine the role of user verification in the spread of misinformation and other online harms, and should look closely at the implications of how policies are applied to some accounts relative to others.”


The "Russia Report", by the Intelligence and Security Committee, finally released on 21 July, made the headlines for a number of other reasons, but also offered some relevant insights for the debate about social media regulation. The redacted report clearly identifies social media as a major tool used by Russian-backed "influence campaigns" and disinformation, and is critical of social media companies for “failing to play their part” in tackling this problem.

"We note that – as with so many other issues currently – it is the social media companies which hold the key and yet are failing to play their part; DCMS informed us that [redacted]."

The report also highlights the role which identity concealment - enabled by a laissez-faire approach to anonymity and a lack of identity verification - plays in such influence and disinformation campaigns.


“In this instance, employees of the Russian state and Russian-controlled bots may masquerade as ordinary British citizens on social media and give the UK’s politicians, journalists and other people who may have power and influence the impression – simply via the sheer quantity of posts – that the views espoused are genuinely those of a majority of their country’s public.”


On 26 July, the official Labour opposition issued a new statement criticising the government over the slow progress of Online Harms legislation. Labour’s Shadow Secretary of State for DCMS, Jo Stevens, noted the recent reports from both the House of Lords Select Committee on Democracy and Digital Technologies and the DCMS Select Committee. She said:

“Social media companies have had repeated opportunities to show they can police their sites effectively. But when high profile individuals are allowed to keep their platforms after spreading vile anti-Semitic abuse – and then doubling down when challenged – it’s clear that self-regulation isn’t working.
“The government promised this bill more than a year ago – it’s high time they showed they take the safety of those who use the internet as seriously as the needs and influence of the big tech firms.”

More detail on Labour’s policy position is expected to follow, but it is encouraging that the party is clearly backing tougher regulation and adding to the pressure for legislation to be brought forward.

