EU Commission’s enforcement action against X, and latest Democracy Shield plans, both indicate growing recognition of importance of action against inauthentic and deceptive accounts

  • Writer: Stephen Kinsella

The European Commission this month announced its first enforcement fine under the Digital Services Act (DSA). X was fined €120 million for non-compliance with its DSA obligations on a number of fronts, including its monetised blue tick scheme. An EU spokesperson was very clear about the problems with the scheme: “It deceives users. Anyone can pay to obtain the verified status, and X does not meaningfully verify who is behind it.”


The Commission took some time before moving to enforcement – it is two years since it announced its investigation, and over a year since it announced preliminary findings that X was in breach. Other areas of the Commission’s investigation into X, regarding X’s failure to take adequate steps to tackle the dissemination of illegal content and to combat information manipulation, remain ongoing. On the same day as announcing the fine for X, the Commission announced that a parallel investigation into TikTok had been concluded, with the Commission accepting TikTok’s binding commitments to bring itself into compliance. The EU spokesperson explained the approach: “Our objective is not a fine. If you engage constructively with the Commission, we settle cases. If you do not, we take action.”


Clean Up The Internet welcomes the Commission’s decision to proceed with enforcement in the case of X, and to demonstrate that it is willing, as a last resort, to levy substantial fines. It’s understandable that the Commission displayed a certain caution in doing so, particularly given the Trump administration’s stance towards regulatory action against US big tech companies. But ultimately regulations like the DSA and the UK’s Online Safety Act (OSA) were enacted because of a recognition that we could not expect platforms to clean up their acts voluntarily. It would be naive now to expect the mere existence of regulation to deliver change – regulators need to be willing to show they have the stomach for enforcement action, including against the largest and most powerful companies.


In the UK, Ofcom could learn something from the Commission’s approach. The UK government claimed that one tangible benefit of Brexit would be the ability to move faster; instead, the UK has fallen behind. Ofcom also likes to talk up its desire for constructive engagement with the platforms, but it has yet to demonstrate its willingness to move to enforcement against the bigger platforms where this fails. Ofcom’s enforcement actions have so far targeted smaller platforms and pornographic sites – which is all very well, but is unlikely to convince the largest platforms that it’s also willing to get tough with them.


Clean Up The Internet also welcomes the Commission’s specific findings regarding X’s blue check scheme. We warned three years ago, when X (newly acquired by Musk, but still called Twitter at that stage) first introduced paid-for blue ticks, that “Musk’s Twitter isn’t actually offering its users a genuine option to verify their identity” and that “conflating a symbol which indicated authenticity and notability with merely reflecting payment of an $8 subscription [has] opened up opportunities for bad actors”. We argued that, whilst robust and accessible user identity verification could deliver significant benefits, X’s scheme would not only fail to deliver those benefits but would actually make the situation worse. The EU Commission has essentially agreed with this analysis.


Alongside its welcome moves to tackle X’s deceptive and dangerous fake verification scheme, the Commission has also recently taken some encouraging steps towards promoting genuine user verification schemes as a potential measure to tackle the use of fake accounts for foreign information manipulation and interference (FIMI) and disinformation.


We’ve previously suggested to the Commission – both through submissions to consultations and in meetings with numerous officials in Brussels – that user verification measures could be included within its Electoral Integrity Guidelines or within the Code of Practice on Disinformation. In its latest communication on the Democracy Shield initiative, the Commission acknowledges the significant threat to the integrity of the EU information space posed by fake accounts, and suggests that verification measures along the lines we have previously suggested could be introduced under the Code of Practice on Disinformation:

“The Commission will also explore possible further measures with the Code’s signatories. These could include ways to improve the detection and labelling of AI-generated and manipulated content circulating on social media services and voluntary user verification tools. Such measures would be complementary to the AI Act and other relevant EU rules. The EU Digital Identity Wallets, which will be available for EU citizens and residents by the end of 2026, could facilitate such measures and promote trust and security in online interactions by enabling secure identification and authentication.”

This is very encouraging. We have recently joined the Code of Practice on Disinformation as a civil society signatory, and are looking forward to participating in the process of exploring how such measures could be added to the Code.


Clean Up The Internet will be urging the UK government to draw some lessons from this. The UK’s “strategy for modern and secure elections”, published in July, identifies similar threats to those which the EU’s “Democracy Shield” seeks to address, acknowledging that “our hard-won democracy is increasingly threatened by dirty money and digital threats” and that “our firewall to protect against their interference has not kept pace”. Yet the strategy is astonishingly thin on measures to address digital threats, limiting itself to tightening up the rules on imprints on digital content. We’ll be encouraging the government to tighten up rules on platforms to protect electoral integrity, and suggesting the upcoming Elections Bill would be an obvious place to include such rules - for example by adding an Electoral Integrity Code of Practice into the OSA.
