Stephen Kinsella

What should a new Labour government do about anonymous and fake social media accounts?

 

The new government of Keir Starmer has a host of new MPs and an impressive parliamentary majority. It also faces a daunting list of challenges, and Starmer has acknowledged that “delivery” will be key if he is to retain public support. Here are some initial reflections on how online safety in general, and fake and anonymous accounts in particular, should feature in the new government’s to-do list.

 

The Labour Party came to power placing a strong emphasis on delivering its five key “missions” and the six “First Steps” it will take in power. These missions and first steps are reflected in its first King’s Speech this week, which groups planned legislation under those headings. Nobody would have expected anonymous or fake social media accounts to feature as a headline “mission” or “first step”. But it is clear that requiring social media platforms to act to reduce harm from fake and anonymous accounts would be in service of several of the missions. Conversely, a failure to address the problems with fake and anonymous accounts has the potential to create obstacles to delivery.

 

Labour’s number one mission of kickstarting economic growth will certainly be helped by a reduction in fraud, which is currently the UK’s most commonly experienced crime. Fraud harms the wider economy by denting consumer trust and confidence and diverting billions away from legitimate economic activity. The last government, which was prone to understating the impact of fraud because it had grown so significantly on its watch, estimated in 2019-20 that the annual cost to society of fraud in England and Wales was £6.8 billion.

 

Fake social media accounts are a key enabler of fraud - our research failed to identify a single form of fraud on social media which didn’t make use of fake accounts. Reducing fraudsters’ ability to use fake accounts to target and deceive victims would have clear economic benefits - as well as reducing the number of victims, the huge amount of distress caused to them, and the burden on overstretched law enforcement.

 

Our proposed solution to the misuse of fake accounts is to give every social media user the option to verify their identity, to make that verification status easily visible, and to offer users options to filter out non-verified accounts. Implemented correctly, such measures to promote user identity verification on social media also have the potential to drive take-up of digital identity products and to accelerate the development of a digital identity verification ecosystem in the UK. This would be good for growth, and could also act as what the Tony Blair Institute describes as a “great enabler” of public service reform. The King’s Speech recognises the pro-growth potential of enabling greater use of digital identity products: the briefing paper accompanying the plans for a Digital Information and Smart Data Bill notes that “The economic benefits of secure digital identities being in widespread use around the UK were estimated to be around £600 million per year.” That figure obviously doesn’t include the crime reduction benefits that would follow if such identity products were used on social media.

 

Being able to generate networks of fake accounts easily and at scale is a key platform functionality exploited by bad actors to seed and spread disinformation. Labour’s second mission - making Britain a “clean energy superpower” and thus removing reliance on fossil fuel imports - is a major potential target for such disinformation. Decarbonisation will require new infrastructure to be built across the country, and Labour is likely to find that a lot more challenging if hostile states such as Russia (as well as unscrupulous fossil fuel lobbies) are able to use networks of fake accounts to spread conspiracy theories and whip up doubt and division about net zero.

 

Crime reduction is mission number three - and whilst this mission was given the offline heading of “take back our streets”, the detail of the election manifesto made clear that it included crime with an online element. The manifesto specifically mentioned action to stop social media “platforms being exploited by fraudsters”. But anonymous and fake accounts are enablers of a far wider range of serious crimes than just fraud, including child abuse, human trafficking, stalking, harassment and hate offences. Restricting the ability of bad actors to exploit fake and anonymous accounts is one way to tackle illegal activity further upstream, which reduces the number of victims and the burden on law enforcement.

 

The links between online safety and delivering on Labour’s missions were partly recognised in this week’s King’s Speech, which included plans for several relevant pieces of legislation. The Digital Information and Smart Data Bill not only includes the already-mentioned measures to support the development of digital identity products and services, but also includes new powers for Coroners to access information from online services which may be relevant to an investigation into a child’s death. A Product Safety and Metrology Bill would introduce some new regulation of online marketplaces. A Cyber Security and Resilience Bill recognises the links between online crime and hostile state actors.

 

However, there was no headline “Online Safety Act part 2”, as the then Shadow Secretary of State Lucy Powell had suggested would be necessary when the Online Safety Act finally passed, with reduced ambition, in 2023. We were not surprised by this. Whilst the OSA is far from perfect, it is also widely acknowledged that Ofcom is being too cautious in enforcing the Act. It therefore makes sense for the immediate priority to be encouraging Ofcom to make fuller use of the powers it has so recently been given.


We would suggest that the new DSIT Secretary of State issue a clear strategic steer to Ofcom to do more, more quickly, to tackle fake and anonymous accounts. Ofcom itself has described its initial proposals as first “iterations”, and accepted that it will need to “iterate up”. The strategic steer from a new Labour administration should be to make that first iteration as strong as possible, and then, if still necessary, to “iterate up” at speed.

 

Later in this parliament, Labour may well need to consider further legislation. This may include tightening up the wording of the Online Safety Act’s user identity verification provisions - to set minimum standards, to require interoperability and promote the development of a market in verification providers, and to ensure users’ verification status is visible to other users. But it could also include bolder measures, such as incentivising platforms to act against scammers exploiting fake accounts by introducing a degree of platform liability for fraud perpetrated by accounts for which the platform is unable to provide a link to a real-world identity.

 

The new DSIT team has a lot on its plate. As well as leading on two pieces of legislation set out in the King’s Speech, the Department is being expanded to include the Government Digital Service, the Central Digital and Data Office, and the AI Incubator. The stated aim is for DSIT to be the digital centre of government, driving collaboration and innovation in the digital delivery of services across the civil service. Online safety will be key to the success of much of this new agenda - because if the government wants the public to embrace new, digital ways of doing things, then the public will need to feel safe online. In improving online safety, DSIT has the advantage of a regulator which has already been established and already been given new (albeit perhaps ultimately insufficient) powers. It makes sense for DSIT’s first move to be to steer the regulator to make the most of what it has got. We look forward to engaging with the new team.

