EU Digital Services Act

What do the guidelines under the EU Digital Services Act (DSA) contain?

According to information obtained by Tomorrow's Affairs, the guidelines for the implementation of the EU Digital Services Act, which are currently in the discussion phase, will be adopted by the end of June.

This is a crucial moment for the digital space in Europe. Once the guidelines are published, online platforms will have to change how they manage content and protect users, because the act itself introduced obligations whose deadlines and practices have not yet been precisely defined.

The Digital Services Act came into force on 17 February 2024. Since then, platforms have adapted their internal policies to remove illegal content and protect users.

However, in the absence of detailed guidelines, there has been uncertainty about how the national regulatory frameworks should work.

The guidelines, which are due to be adopted by the end of June, describe the steps that platforms must take. What exactly do these guidelines say?

Protecting minors

First of all, it has been established that in cases of child pornography, terrorism or hate speech against vulnerable groups, there is an obligation to remove the content within 24 hours of it being reported. For all other types of illegal content, the deadline is seven days.

These categories have the highest priority, as they directly threaten the lives and fundamental values of society.

The removal of content must be based on a report that is clearly justified. A platform is not obliged to remove content if the report lacks a legal basis or does not provide accurate information about where the content is located online.

It is precisely for this reason that the guidelines adopted yesterday clarify that a report must contain the URL of the content, the type of infringement and an explanation of its nature.
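
Purely as an illustration, and not as a format prescribed by the guidelines, a platform's intake logic for such reports could check those three elements roughly as follows; the field names, category labels and deadline mapping below are assumptions made for this sketch.

```python
from dataclasses import dataclass
from urllib.parse import urlparse

# Hypothetical labels for the 24-hour priority cases named in the guidelines;
# all other illegal content falls under the seven-day deadline.
PRIORITY_CATEGORIES = {"child_sexual_abuse", "terrorism", "hate_speech"}

@dataclass
class IllegalContentReport:
    url: str                # where the content is located online
    infringement_type: str  # category of the alleged infringement
    justification: str      # explanation of the nature of the infringement

    def is_actionable(self) -> bool:
        """A report without a usable URL or a justification gives the
        platform no obligation to act."""
        parsed = urlparse(self.url)
        return bool(parsed.scheme and parsed.netloc and self.justification.strip())

    def removal_deadline_hours(self) -> int:
        """24 hours for priority categories, seven days otherwise."""
        return 24 if self.infringement_type in PRIORITY_CATEGORIES else 7 * 24
```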

Platforms that fail to remove content within the prescribed deadlines face fines of up to two per cent of their annual global turnover. The aim is to create a uniform standard for all member states.

A significant part of these guidelines is dedicated to the protection of minors.

Until now, platforms have been under general pressure to restrict children's access to inappropriate content, but the choice of implementation method was left to each of them.

It is now stipulated that content explicitly related to the sexual exploitation of children must be blocked within four hours of the competent authority receiving the report.

This means that any video clip or photo containing child pornography must be removed automatically and immediately. In addition, the recommendation algorithms must be adapted so that they do not display violent or gambling content to minors.

To fulfil these requirements, platforms should develop or further improve their mechanisms for detecting such content.

In practice, this means combining automated detection systems, based on image recognition and metadata analysis, with human moderators who review each reported case within a reasonable time.

This reduces the risk of errors that automatic filters can make while still ensuring a rapid response. Platforms must implement these measures by the start of the school year, in mid-September 2025.
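
A minimal sketch of how such a two-stage pipeline could be wired together, assuming a hypothetical database of known illegal-content hashes and a simple human review queue; none of these names or values come from the guidelines.

```python
from dataclasses import dataclass, field
from queue import Queue

@dataclass
class ReportedItem:
    item_id: str
    media_hash: str   # e.g. a perceptual hash of the image or video
    metadata: dict = field(default_factory=dict)

# Hypothetical set of hashes of material already confirmed as illegal.
KNOWN_ILLEGAL_HASHES = {"a3f9c1...", "b71d02..."}

human_review_queue = Queue()

def triage(item: ReportedItem) -> str:
    """Automated first pass: confirmed matches are removed immediately,
    everything else is queued for a human moderator to review."""
    if item.media_hash in KNOWN_ILLEGAL_HASHES:
        return "removed_immediately"
    human_review_queue.put(item)
    return "queued_for_human_review"

# Example: a newly reported clip with an unknown hash goes to a human moderator.
print(triage(ReportedItem("clip-42", "f00d...", {"reported_category": "violence"})))
```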

Extending the systemic risk assessment obligation

Until now, the systemic risk assessment was only mandatory for the largest platforms, which fall into the very large online platform (VLOP) category according to the DSA criteria.

The guidelines adopted yesterday extend the assessment obligation to all platforms with a significant number of users in the EU.

The platforms must submit the first risk assessment reports by 1 September 2025 at the latest.

These reports will include an analysis of the impact of algorithmic decision-making on social processes and democracy, particularly with regard to the spread of disinformation. The aim is to determine how to limit the use of powerful content filters to genuinely extreme content and to prevent social polarisation.

Independent auditors certified in the ethics of digital technologies will carry out the assessment, according to the guidelines.

The document requires both a quantitative and a qualitative approach. Quantitatively, the ratio between the number of content-removal requests and the number of items actually removed is assessed.

Qualitatively, it examines whether platforms have developed internal procedures for responding to community feedback and whether they regularly analyse their potential impact on electoral processes. All of this must be completed by the end of 2025.
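
For the quantitative part, the ratio described above can be computed per category in a few lines; the categories and counts below are invented purely for illustration.

```python
# Hypothetical counts of removal requests and items actually removed.
removal_requests = {"hate_speech": 1200, "counterfeit_goods": 800, "disinformation": 450}
items_removed    = {"hate_speech": 1080, "counterfeit_goods": 640, "disinformation": 270}

for category, requested in removal_requests.items():
    removed = items_removed.get(category, 0)
    ratio = removed / requested if requested else 0.0
    print(f"{category}: {removed}/{requested} removed ({ratio:.0%})")
```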

Political content and paid advertising will be subject to entirely new regulations. Platforms have until 31 October 2025 to set up a standardised registry for all ads related to political campaigns.

The registry must show the advertiser's identity, the campaign's total budget, and the targeted users' demographic parameters.

This is to significantly increase transparency and reduce the possibility of manipulation during the election process. The question is whether the platforms will meet this deadline, as it is technically challenging to consolidate the data from several member states.
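
As a sketch only, one entry in such a registry could hold the three categories of information the guidelines require; the field names and example values below are assumptions, not a prescribed schema.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class PoliticalAdRecord:
    advertiser_identity: str           # who paid for the advertisement
    campaign_total_budget_eur: float   # total budget of the campaign
    # Demographic parameters used for targeting, e.g. age range and member state.
    targeting_parameters: dict = field(default_factory=dict)
    registered_on: date = field(default_factory=date.today)

# Example entry with invented values.
record = PoliticalAdRecord(
    advertiser_identity="Example Party",
    campaign_total_budget_eur=250_000.0,
    targeting_parameters={"age_range": "18-34", "member_state": "DE"},
)
```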

The structure of the national digital services coordinators is now clearly defined. Each member state is obliged to set up a rapid response mechanism.

If there is an urgent report of content with terrorist or paedophile elements, the coordinator can order its removal without further delay. This means that in extreme cases, a court order is no longer required.

The platforms must use an electronic data exchange system, known as DSP IMS, to enable a uniform and secure exchange of information between the various regulatory authorities.

Stricter procedures

In addition, stricter procedures now apply for the out-of-court settlement of disputes. Platforms are required to establish an out-of-court dispute resolution body and use a standardised form to receive complaints.

The user must explain why they believe the algorithmic moderation decision was wrong and demonstrate how it has affected their rights or freedom of expression.

The arbitration body has seven days to review the platform's decision and recommend a correction. This is intended to achieve a balance between effective protection and the right to appeal.

Over the next six months, the platforms will publish quarterly reports on the implementation of the guidelines and make them available to the public.

The data that the public will be able to see includes the number of requests to remove illegal content, broken down by category, the number of items removed, and the frequency of unsuccessful complaints.

This kind of transparency will allow analysts and human rights experts to assess how effective the guidelines really are.

At the same time, the media will have access to the data to check for problems in certain countries where regulators are unwilling to carry out constant monitoring.

One of the greatest challenges will be the implementation of the guidelines for the protection of minors. Platforms will need to strengthen moderation and algorithmic filters. This will require the development of new software tools to recognise such content.

Platforms are now expected to implement solutions that were previously only available as part of well-funded projects. Small platforms will struggle to meet the deadlines, but a new industry of certified companies offering compliance services will emerge.

A new era for digital platforms in Europe

The EU's relationship with the American technology giants is another dimension. The large companies have already expressed not only their dissatisfaction with the requirements but also their desire to fulfil their commitments because the European market is of strategic importance to them.

Informal talks between representatives of the EU and the US took place in May. It was agreed to harmonise some basic definitions by the end of the year, such as the criteria for hate speech and violence.

On this occasion, only an agreement in principle on better co-operation was reached; no date was set for the formal recognition of certificates that would facilitate the work of companies.

For small and medium-sized businesses, harmonisation entails additional costs. They will have to engage legal advisers who are well versed in the guidelines, but on the other hand, they will gain more credibility with users.

Complying with the legal framework means gaining the trust of consumers and investors. E-commerce platforms will be given clear guidelines on how to remove fake reviews and pirated content.

In the member states, the regulatory authorities must communicate effectively with the platforms. It is expected that the first round of inspections will be organised in the second half of 2025 to verify the implementation of the guidelines adopted yesterday. The inspections will cover the five largest member states.

The regulators will spend days examining technological solutions, interviewing staff responsible for moderation, and analysing reports. The aim is to identify gaps and propose corrective mechanisms for the coming year.

The guidelines that came into force yesterday mark the transition from the phase of defining the rules to the phase of actually applying them. Digital services around the world are following the example set by the EU.

There are already initiatives in the US Congress to pass a law on the transparency of algorithms. Something similar is being announced in Asia, particularly in India.

While the EU is taking a firm stance on the protection of users and minors, the question is whether other major powers will follow suit or opt for a less restrictive approach.

The guidelines introduce a series of precise rules that will place digital platforms in Europe under strict supervision.

It's not just about removing content. It's about establishing accountability and transparency in a way that hasn't existed before.

Soon, the practical application of these principles and the resulting changes to the digital market will become evident. Everyone will be watching to see whether judicial authorities and constant monitoring are enough to stop the spread of disinformation or protect the most vulnerable groups.

What is certain is that the digital space in Europe has entered a new era in which the responsibility of platforms will be paramount.

Source TA, Photo: Shutterstock