David Babbs

Our submission to the European Commission's DSA consultation

Clean Up the Internet recently made a submission to the European Commission’s consultation on the planned Digital Services Act (DSA). The full submission can be downloaded as a PDF here:

Clean up the Internet DSA submission
Download PDF • 149KB

Here is a summary of some of the key points from our submission:

We believe there is an urgent need for systemic regulation of the large tech platforms, and welcome the EU’s proposals to act.

The primary focus of regulation should be the companies and the platforms they build and operate - not individual users or pieces of content. Effective regulation can drive systemic changes to the design and operation of social media platforms, by requiring platforms to address the flaws which create an environment in which abuse is so prevalent and so difficult to avoid, and which provides such a fertile context for false information to be created, spread, and believed.

We believe that systemic regulation, with appropriate safeguards, has the potential to be beneficial to freedom of expression. This is because, firstly, the status quo is very bad for freedom of expression – there is overwhelming evidence that online abuse disproportionately impacts women and minorities, and harms their ability to participate fully in the digital public sphere. Secondly, a focus on companies, and the design of the platforms they operate, can correct an over-reliance on content take-down and moderation measures. An undue reliance on such content-focused measures places too much power over freedom of expression in the hands of a small number of corporations and their opaque algorithms. Thirdly, systemic regulation can also introduce greater transparency and accountability into platforms' moderation policies and processes.

Regulation should be systemic, requiring platforms to design and operate their platforms with a view to reducing levels of harmful and/or risky behaviour (such as abuse, threats, or disinformation) and promoting positive societal outcomes (such as informed debate, diverse and inclusive participation, and free expression). Platforms should be held to account for the decisions they take and the impact those decisions have, whilst retaining freedom to operate and innovate in distinct ways. Key principles for regulation should include:

  • Risk management and harm reduction by design

  • User choice and user empowerment

  • Diversity of users and inclusivity of platform environment – especially inclusion of already marginalised and/or disadvantaged groups

  • Transparency and independent scrutiny

  • Protection of fundamental rights including freedom of expression and assembly

A sectoral regulator should be empowered to consider a wide range of harms, arising from both illegal and "legal-but-harmful" content and behaviour. This should include:

  • Harms towards individuals

  • Harms towards vulnerable groups

  • Harms to public health

  • Harms to the integrity of the democratic process

  • Disinformation and misinformation

A key EU policy objective should be safeguarding a healthy online "public sphere". Social media platforms are now where millions of citizens access information, exchange views, and engage in debate with each other and with journalists and politicians. The health, or otherwise, of such spaces has profound implications for the health of EU democracy. Individuals or vulnerable groups excluded from such spaces are excluded from democratic participation. Harmful content, such as hate speech or Covid-19 misinformation, can threaten the whole of society.

Another, related, policy objective should be tackling identity deception and misuse of anonymity, and promoting authenticity and trust online. There is a clear body of evidence linking anonymity and identity concealment to the production and spread of disinformation, and to higher levels of abuse. However, anonymity can also be important to safeguarding freedom of expression, for example in the case of whistle-blowing. Anonymity is therefore a risk factor which requires proactive, careful management. Regulatory supervision of how tech platforms mitigate the risks associated with anonymity would reduce harm and create more healthy, positive virtual spaces.

Clean Up the Internet intends to continue to engage with the DSA as it progresses.
