January 18, 2024 – Article

Update IP, Media & Technology No. 89

Transparency and protection obligations for providers of online platforms under the Digital Services Act (DSA)

The Digital Services Act (DSA) entered into force on November 16, 2022, and some of its rules and obligations already apply. From February 17, 2024, the DSA will apply in full and directly in all EU member states.

We have already provided you with an overview of the regulation and the first obligations, the special features in relation to online marketing and the implementation at national level.

From mid-February of this year at the latest, the transparency and protection obligations of Art. 23, 27 and 28 DSA will apply to providers of online platforms as additional requirements; we discuss these in more detail in this article.

I. Scope of application

The provisions of Art. 19-28 DSA apply to online platforms.

Art. 3 lit. i) DSA defines an online platform as follows:

"hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation".

However, this does not apply if the service merely provides infrastructure; so-called cloud computing and web hosting services are therefore not covered. These are classified as intermediary services, for which the liability provisions of Art. 4 et seq. DSA apply.

Providers of online platforms that qualify as micro or small enterprises within the meaning of Recommendation 2003/361/EC are also excluded from the scope of application pursuant to Art. 19 DSA.

According to this EU definition of SMEs, companies are therefore exempt if they have

  • fewer than 50 employees
  • and an annual turnover or annual balance sheet total of no more than EUR 10 million.

However, this exemption only applies as long as the above thresholds are not exceeded. Companies should therefore check at least once a year whether the conditions for an exemption under Art. 19 DSA are still met or whether the extended transparency and protection obligations of the DSA, in particular Art. 19-28 DSA, now apply to them.
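Purely for illustration, the annual self-check described above boils down to testing the two thresholds in conjunction. The following Python sketch is a hypothetical, simplified example: the function and variable names are our own, and it ignores the rules on partner and linked enterprises that Recommendation 2003/361/EC also takes into account.

```python
def is_exempt_small_enterprise(headcount: int,
                               annual_turnover_eur: float,
                               balance_sheet_total_eur: float) -> bool:
    """Simplified check of the exemption under Art. 19 DSA in conjunction
    with Recommendation 2003/361/EC: fewer than 50 employees AND an annual
    turnover or annual balance sheet total of no more than EUR 10 million.
    Hypothetical sketch; partner/linked enterprises are not considered."""
    financials_ok = (annual_turnover_eur <= 10_000_000
                     or balance_sheet_total_eur <= 10_000_000)
    return headcount < 50 and financials_ok

# Example: 45 employees, EUR 12m turnover, EUR 9m balance sheet total
print(is_exempt_small_enterprise(45, 12_000_000, 9_000_000))  # True
```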

The scope of application covers not only company websites but also apps, online platforms and interfaces to third-party providers.

II. Measures and protection against misuse (Art. 23 DSA)

While dealing with users who misuse platform services was previously left to the companies themselves and often caused uncertainty, especially when accounts were blocked, the DSA now provides a set of measures that providers can take to protect their platforms against misuse.

A special feature is that the provision covers not only users who "frequently provide manifestly illegal content" (Art. 23(1) DSA) but also those who misuse the reporting and complaint functions and "frequently submit notices or complaints that are manifestly unfounded" (Art. 23(2) DSA).

If users frequently post manifestly illegal content on a platform, for example in the form of postings, their accounts may only be blocked for a reasonable period of time after they have first been warned of the intended blocking. The blocking must therefore be preceded by a warning. However, the DSA does not specify how far in advance of an intended blocking the warning must be issued.

Specific time limits were previously contained in the Network Enforcement Act (Netzwerkdurchsetzungsgesetz, NetzDG): manifestly illegal content had to be blocked within 24 hours (Section 3(2) No. 2 NetzDG), while for other illegal content a period of seven days applied (Section 3(2) No. 3 NetzDG).

The German implementation of the DSA, the Digitale-Dienste-Gesetz (DDG), which is currently available as a draft bill (we reported), will repeal the NetzDG and the German Telemedia Act (Telemediengesetz, TMG). As a result, there will no longer be any specific deletion deadlines: whereas providers previously had to act within 24 hours, in future they will only have to decide "in a timely, diligent, non-arbitrary and objective manner" (Art. 16(6) DSA).

Until now, the Federal Office of Justice (Bundesamt für Justiz, BfJ) has been responsible for monitoring compliance with the NetzDG. In future, this task will be assumed by the "Coordination Office for Digital Services" at the Federal Network Agency, which will also serve as the central point of contact and reporting office for violations.

Closely related to this is protection against misuse of the reporting and complaint functions for posts and content. In such cases, too, accounts may be blocked for a reasonable period of time after a prior warning.
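The warn-then-block sequence required by Art. 23 DSA can be pictured as a simple moderation flow. The following Python sketch is purely illustrative: all names are our own, and the threshold for "frequent" violations and the length of the blocking period are placeholders, since the DSA itself only requires a prior warning and a "reasonable" period.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

@dataclass
class Account:
    user_id: str
    warned_at: Optional[datetime] = None
    blocked_until: Optional[datetime] = None
    manifest_violations: int = 0

WARNING_THRESHOLD = 3                  # hypothetical reading of "frequently"
BLOCKING_PERIOD = timedelta(days=14)   # hypothetical "reasonable period"

def handle_manifest_violation(account: Account, now: datetime) -> str:
    """Warn first; block for a reasonable period only after the warning."""
    account.manifest_violations += 1
    if account.manifest_violations < WARNING_THRESHOLD:
        return "noted"                 # not yet "frequent"
    if account.warned_at is None:
        account.warned_at = now        # prior warning (Art. 23(1) DSA)
        return "warned"
    account.blocked_until = now + BLOCKING_PERIOD
    return "blocked"
```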

The DSA thus provides a framework for possible restrictions, which providers can set out in their respective general terms and conditions or terms of use. Before this regulation, it was sometimes disputed whether the blocking of an account by a provider could constitute an unreasonable disadvantage for the user within the meaning of Section 307(2) No. 1 of the German Civil Code (BGB). The DSA now clarifies that a temporary blocking after a prior warning is intended as an appropriate reaction and is therefore not unreasonably disadvantageous.

Nevertheless, freedom of expression must be kept in mind before any intended blocking. The DSA seeks to guard against unjustified blocking through the criterion of "manifest illegality".

"Information should be considered to be manifestly illegal content and notices or complaints should be considered manifestly unfounded where it is evident to a layperson, without any substantive analysis, that the content is illegal or, respectively, that the notices or complaints are unfounded." (Recital 63 DSA)

Previously, a network/DNS block was also possible under Section 7(4) TMG if other remedies were unsuccessful. Section 7(4) TMG will in future be continued in Section 8 DDG-E, which now covers all digital services that transmit information provided by a user in a communications network or provide access to a communications network. The group of addressees thus includes all access providers and no longer only – as previously – WiFi providers. The provision thereby resolves the conflict with the previous TMG rule, under which Section 7(4) TMG had to be interpreted broadly in conformity with European law following the ECJ judgment in Case C-314/12 (UPC Telekabel Wien).

Disputes between users and providers of online platforms that cannot be resolved through internal complaint handling may be referred to out-of-court dispute settlement in accordance with Art. 21 DSA. Recourse to the courts remains unaffected.

III. Transparency obligations when using recommendation systems (Art. 27 DSA)

Posts and information that algorithms prioritise and tailor to individual users in the order in which they are displayed are a key success factor for many platforms and, at the same time, a central component of their business activity.

Offering users content they like or are interested in, tailored to their individual needs, is intended to keep them on the platform for as long as possible. However, there is a risk – even an involuntary one – of ending up in so-called echo chambers.

The term "echo chamber" describes the phenomenon in which media users predominantly absorb information that confirms their own views. Users tend to pay little attention to content that challenges their opinion. As a result, this can lead to the formation of more or less closed networks and filter bubbles in which these one-sided opinions can grow and become entrenched.

However, prioritisation based on recommendation systems also allows platforms to push their own offers and posts and draw attention to them, thus creating an advertising opportunity on their own behalf.

Art. 3 lit. s) DSA defines a recommendation system as:

"a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed".

It is irrelevant whether the information is displayed in response to a user request, in which the user actively searches for recommendations, or whether the platform makes recommendations of its own accord. Recommendation systems use algorithms to decide which information is displayed to a user and, in some cases, the order in which it is displayed.
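To make the definition more tangible: at its core, a recommendation system within the meaning of Art. 3 lit. s) DSA is a fully or partially automated ranking that determines the relative order or prominence of information. The following Python sketch is a hypothetical minimal example; the parameters and weights are invented for illustration and are not taken from the DSA.

```python
# Hypothetical minimal recommendation system: a scoring function that
# determines the relative order of posts. All parameters and weights
# are invented for illustration.
posts = [
    {"id": 1, "age_hours": 2,  "engagement": 120, "matches_interests": True},
    {"id": 2, "age_hours": 48, "engagement": 900, "matches_interests": False},
    {"id": 3, "age_hours": 5,  "engagement": 40,  "matches_interests": True},
]

def score(post: dict) -> float:
    """Main parameters (illustrative): recency, engagement, interest match."""
    recency = 1 / (1 + post["age_hours"])       # newer posts rank higher
    engagement = post["engagement"] / 1000      # normalised popularity
    interest = 1.0 if post["matches_interests"] else 0.0
    return 0.4 * recency + 0.3 * engagement + 0.3 * interest

feed = sorted(posts, key=score, reverse=True)   # order of display
print([post["id"] for post in feed])            # here: [1, 3, 2]
```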

Art. 27 DSA is intended to ensure that users are informed about how platforms use recommendation systems and can thereby understand the way in which information is displayed on the platform. The information on the recommendation systems must be presented clearly and in a way that is easy for users to understand; the benchmark is the average user, who must be able to understand how the information shown to them is prioritised.

Not all selection criteria need to be disclosed, but the most important ones must be, together with a brief explanation of why these criteria were taken into account.
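How such a disclosure might be organised internally can be sketched as a simple data structure. The following Python example is hypothetical; Art. 27 DSA does not prescribe a format or wording, only that the main parameters, and the reasons for their relative importance, be set out in plain and intelligible language.

```python
# Hypothetical structure for an Art. 27 DSA disclosure of the main
# parameters of a recommendation system; all names and texts are examples.
main_parameters = [
    {"parameter": "Recency",
     "reason": "Newer posts are shown first so that your feed stays current."},
    {"parameter": "Engagement",
     "reason": "Posts that many users interact with are ranked higher."},
    {"parameter": "Interest match",
     "reason": "Posts on topics you follow are given greater prominence."},
]

# Render the disclosure as plain, intelligible text:
for entry in main_parameters:
    print(f"{entry['parameter']}: {entry['reason']}")
```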

However, the DSA does not entitle users to influence the selection criteria themselves, or even to select or deselect them – i.e. to create an individual, user-specific recommendation system.

Nevertheless, there is considerable criticism from the market: the problem of echo chambers is well known, and merely providing information about the use and functioning of recommendation mechanisms will not remedy it. Such information can, however, raise users' awareness and possibly help them to question the information displayed – depending on the content.

In the event of non-implementation, a fine of up to six percent of annual turnover may be imposed under the DDG (Section 25(6) DDG-E). In addition, every user has the right to lodge a complaint in accordance with Art. 53 DSA. A claim for damages, which is possible in principle under Art. 54 DSA, requires proof of causal damage – a significant hurdle in practice.

IV. Special mechanisms for the protection of minors (Art. 28 DSA)

The European Union's political goal of placing greater emphasis on the protection of minors is also reflected in the DSA.

"Providers of online platforms accessible to minors shall put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their service." (Art. 28 (1) DSA)

However, providers are not obliged to process additional personal data in order to determine whether a user is a minor (Art. 28(3) DSA).

Whether a platform is subject to the additional duty to protect minors depends primarily on whether it is directed at underage users. This may be the case if the platform's general terms and conditions permit minors to use the service, or if the provider otherwise becomes aware that its users include minors, for example because it processes users' personal data in a way that reveals their age.

Even more important is whether the service also actively targets minors; here, the focus and purpose of the platform must be taken into account. A non-public platform for skilled tradespeople, for example, which provides explanatory and training videos and materials, may also be accessible to underage trainees, particularly for training purposes. In view of its clear professional focus, however, the special protection of this user group that is required of social networks to guarantee the highest level of privacy will not be necessary.

Art. 28(2) DSA also contains special rules on personalised advertising to users who are obviously minors. We have already reported on the special features of online marketing under the DSA.

V. Conclusion

The rules on transparency and protection obligations under the Digital Services Act may seem simple and largely self-evident. However, with the implementation deadline of February 17, 2024 fast approaching, it remains important to examine carefully whether and how these rules apply to your own platform. Failure to implement them, or insufficient implementation, may result in a fine under the DDG. In addition to or instead of a fine, a violation of the DSA may also be subject to a warning under Section 3a UWG insofar as market conduct rules are concerned.

 
