Article | December 19, 2022

Update Data Protection No. 127

The Digital Services Act: First obligations must be implemented as early as February 2023

The Digital Services Act ("DSA") entered into force as an EU regulation on November 16, 2022. All obligations under the DSA must be implemented by February 17, 2024 - some as early as February 17, 2023. Even smaller online companies must therefore quickly familiarize themselves with the new rules and respond if they want to be DSA-compliant in good time; large online platforms and search engines must take action even sooner.

Small and medium-sized companies have often not yet dealt intensively with the new rules. Since the first measures must be implemented in less than two months, however, it is important to act quickly - ideally with external assistance - so that no violations of the new regulation are overlooked in the short time available. Violations can lead to fines of up to 6% of global annual turnover.

A. Overview

The DSA is intended to completely re-regulate the use of digital intermediary services and to strengthen fundamental rights and consumer protection by creating new liability rules for illegal content posted on such services. In addition, there are strict new due diligence requirements, the disregard of which can be punished with fines. Other objectives of the DSA are to promote innovation and growth as well as the competitiveness of small platforms, SMEs and start-ups, so that they do not have to bow to the rules of so-called digital gatekeepers.

As was already the case under the German Telemedia Act (TMG) and the eCommerce Directive, which the DSA partially replaces, providers of digital intermediary services are not themselves liable if their users post illegal content or offer illegal goods and services. Providers only have to take action once they become aware of the existence and illegality of such content; they remain under no general monitoring obligation (Art. 8 DSA). However, a notice-and-takedown procedure must be made available through which users can report illegal content and which then enables effective remedial action. In principle, little will change for operators here; only official removal and information orders will now follow a standardized procedure. The liability privilege remains in place even if the intermediary service becomes aware of illegal content as a result of voluntary own investigations.

However, the condition for the liability privilege for illegal content is in any case that the service offered is neutral and consists of the purely technical and automatic processing of information provided by users. It therefore does not apply if contributions are reviewed, filtered or altered by the online intermediary service itself.

In the event of violations, providers face not only fines and claims for damages from the users affected by the violation. The authorities are also given comprehensive enforcement powers, which can extend to the closure of an online service in the event of repeated violations.

B. Who has to act?

The DSA covers all online intermediary services as well as very large search engines offered in the EU internal market, even if the provider's registered office is outside the EU (market location principle, known from the GDPR).

In principle, the DSA only covers commercial services that operate an infrastructure network and transmit or store information made available by their users. This includes, for example, Internet access providers, domain name registrars, cloud and web hosting services, online marketplaces, app stores, social media platforms and search engines.

The DSA differentiates between the three forms of "mere conduit" (pure transmission), "caching" and "hosting". Mere conduit occurs, for example, with Internet exchange points, wireless access points, virtual private networks (VPN), DNS services and DNS resolvers, top-level domain name registries or Internet voice telephony (VoIP). Caching is present in particular where networks are operated solely for content delivery, as well as where proxies and reverse proxies are provided for content adaptation. Hosting services can be cloud computing services, web hosting services, paid referencing services, or services for the online exchange of information and content.

Particularly important at present is the definition of online platforms. These must determine their monthly active users as early as February 17, 2023, publish these figures on their website and transmit them to the EU Commission (Art. 24 para. 2 DSA). Thereafter, this report must be repeated every six months.
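
As a purely illustrative aid, the following minimal Python sketch shows one way a platform could approximate its average monthly active users over the past six months from its own access logs. All names used here (events, user_id, the (year, month) reference tuple) are our own assumptions; the DSA does not prescribe a specific counting method, and the methodology may still be further specified by the Commission.

```python
from collections import defaultdict
from datetime import date

def average_monthly_active_recipients(events, reference_month):
    """Rough sketch: average number of distinct users per calendar month
    over the six months up to and including `reference_month`.

    `events` is assumed to be an iterable of (user_id, date) tuples taken
    from the service's own logs; this is an illustration, not a method
    prescribed by the regulation.
    """
    # Collect the six relevant (year, month) keys.
    months = []
    year, month = reference_month
    for _ in range(6):
        months.append((year, month))
        month -= 1
        if month == 0:
            year, month = year - 1, 12
    wanted = set(months)

    # Count distinct users per month.
    users_per_month = defaultdict(set)
    for user_id, day in events:
        key = (day.year, day.month)
        if key in wanted:
            users_per_month[key].add(user_id)

    return sum(len(users_per_month[m]) for m in months) / 6

# Illustrative use only:
log = [("u1", date(2023, 1, 5)), ("u2", date(2023, 1, 7)), ("u1", date(2022, 12, 3))]
print(average_monthly_active_recipients(log, (2023, 1)))
```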

Online platforms are all hosting services that store and publicly disseminate information on behalf of their users. In addition to social networks, app stores and collaborative economy platforms, these include, above all, online marketplaces where consumers and entrepreneurs are brought together to conclude distance contracts. Online forums may also qualify as online platforms, but comment sections on online news services do not. Explicitly not covered are cloud computing or web hosting services where the public dissemination of information is only a secondary function or which serve merely as infrastructure (e.g., software as a service).

Privileging of smaller platforms

Small and micro enterprises with fewer than 50 employees and an annual turnover of less than EUR 10 million are generally exempt from the obligations for online platforms and from the general transparency reporting obligations. The latter also include the above-mentioned obligation to report monthly active users. However, upon request by the Commission or the competent authority, small online platforms must also be able to provide this information. Other measures may be adopted by small companies on a voluntary basis.

C. Likelihood of confusion? Differences to the NetzDG

When reading the DSA, it is noticeable that many of its provisions resemble the German Network Enforcement Act (NetzDG), which only came into force in 2017. However, German companies in particular should not assume that little has changed. In fact, there are important differences between the two laws, such as the scope of application: the NetzDG only regulates social networks, while the DSA applies to all online intermediary services (see above). In addition, unlike under the NetzDG, the applicability of the DSA no longer depends on a minimum number of users. Each company must therefore reassess whether it faces new obligations under the DSA.

The definition of illegal content also differs from that of the NetzDG: content is illegal not only if it violates certain criminal provisions, but in the case of any violation of the law of a member state or of the EU. This includes fake news, hate postings, disinformation and goods subject to a sales ban. In individual cases, the question of whether a post constitutes illegal content will be difficult to answer without legal expertise, as national laws in the EU member states differ considerably in some cases.

Other innovations include the specific definition of requirements for the complaint procedure in the DSA and the departure from rigid deadlines for deleting illegal contributions or for processing complaints from users affected by blocking. Nevertheless, many of the provisions of the NetzDG can be found in similar form in the DSA, such as the obligation to prepare transparency reports or to report criminal offenses.

Since the DSA nevertheless goes beyond the NetzDG (also with regard to penalties and fines), providers of online platforms should not rest on their laurels even if they have already examined and implemented the provisions of the NetzDG.

D. The concrete implementation obligations

In addition to the obligation for online platforms to report user numbers, the DSA contains comprehensive due diligence obligations, the scope of which depends on the type and size of the digital intermediary service. A clear but very concise overview can be found at the EU Commission. We explain below in more detail what needs to be done. Nevertheless, implementation remains complicated because, for example, services that combine multiple functionalities must implement different requirements for each of those functionalities. The safest approach is therefore to seek legal advice and, ideally, to opt for a complete DSA compliance package right away.

I. All intermediary services (Art. 11 to 15 DSA)

First, the DSA establishes due diligence requirements to be implemented by all digital intermediary services that fall under the definition outlined above. Services limited to mere conduit (pure transmission), such as Internet exchange points, wireless access points, virtual private networks (VPN) and DNS services, will in the future be subject only to the obligations under Art. 11 to 15 DSA. The following points are to be implemented:

  • Transparency reports: at least once a year, providers must publicly report on the number of regulatory and court orders received, the measures taken, content moderation and any automated means used for moderation (a skeleton sketch follows this list).
  • Terms of use: In the future, general terms and conditions must take equal account of the (EU) fundamental rights of all parties involved, including, for example, users' freedom of art and expression. In addition, information must be provided on the moderation of content by the intermediary service, i.e., on the guidelines, procedures, measures and tools for restricting user content. Currently, there are no models for this, and the balancing of fundamental rights can be challenging even for lawyers.
  • Contact point for authorities and users: All intermediary services must designate a contact point and make the contact details easily accessible. If the provider does not have a branch in an EU state, it must also designate a legal representative in the EU to serve as a contact point for authorities and users. In particular, the intermediary service must provide information on the languages used for communication. Provided these are explicitly identified, chatbots are also suitable for communicating with users.
  • Cooperation with national authorities: Intermediary services must be able to respond to orders from authorities and courts regarding illegal content and block it if necessary.
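
To make the transparency reporting obligation more tangible, the following is a minimal sketch of the kind of figures such an annual report could aggregate. The field names are our own assumptions, not the wording of the DSA.

```python
from dataclasses import dataclass

@dataclass
class TransparencyReport:
    """Illustrative skeleton of figures an annual transparency report could
    aggregate; field names are assumptions, not DSA wording."""
    reporting_period: str                    # e.g. "2024"
    authority_orders_received: int = 0       # orders from authorities/courts to act against illegal content
    information_orders_received: int = 0     # orders to provide information about users
    notices_received: int = 0                # reports of illegal content from users
    items_actioned: int = 0                  # content moderation measures taken
    automated_tools_used: bool = False       # whether automated moderation tools were used

report = TransparencyReport(reporting_period="2024", notices_received=120, items_actioned=35)
print(report)
```
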
II. Hosting services (Art. 16 to 18 DSA)

By contrast, tighter rules apply to hosting services, such as cloud and web hosting service providers and online platforms, with additional rules applying to the latter (see below). All hosting services must implement the following:

  • Notification and remediation procedures for users and third parties (notice and takedown): Hosting services must have systems in place for easy, user-friendly and digital reporting of illegal content. At a minimum, the following data must be collected: the reason for the report, the location of the information (e.g., the URL), the name and e-mail address of the person reporting, and a statement that the person reporting is convinced the information provided is accurate (a minimal data sketch follows this list). It is conceivable, for example, that an online form could be provided. If hosting services remove content that they have identified as unlawful, they must notify the affected user and provide clear and specific reasons. Notifications are to be processed "in a timely manner", but there are no specific deadlines.
  • Reporting of criminal acts committed on the service.
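
The following minimal Python sketch illustrates the data a notice-and-takedown form could collect based on the elements listed above. The field and function names are our own illustrative assumptions; only the substantive elements (reason, location, contact details, good-faith statement) follow from the DSA.

```python
from dataclasses import dataclass

@dataclass
class IllegalContentNotice:
    """Data a notice-and-takedown form could collect; field names are
    illustrative, the substantive elements follow from Art. 16 DSA."""
    reason: str                 # why the content is considered illegal
    content_location: str       # exact location, e.g. the URL
    reporter_name: str
    reporter_email: str
    good_faith_statement: bool  # confirmation that the report is believed to be accurate

def validate_notice(notice: IllegalContentNotice) -> list[str]:
    """Return the missing items; an empty list means the notice is complete."""
    missing = []
    if not notice.reason:
        missing.append("reason")
    if not notice.content_location:
        missing.append("content_location")
    if not notice.reporter_email:
        missing.append("reporter_email")
    if not notice.good_faith_statement:
        missing.append("good_faith_statement")
    return missing

notice = IllegalContentNotice("counterfeit product", "https://example.com/item/123",
                              "Jane Doe", "jane@example.com", True)
print(validate_notice(notice))  # [] -> complete
```
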
III. Online platforms (Art. 19 to 32 DSA)

Art. 19 to 32 DSA cover all intermediary services that bring sellers and customers together, i.e., in particular online marketplaces (see above). For these online platforms, significant additional obligations apply under the DSA:

  • Complaint and redress mechanism: in addition to the reporting procedure by which users can flag illegal goods, services or content, platforms must also offer the possibility to challenge decisions of the platform on removed illegal content (for at least 6 months). If the challenge is successful, the platform must be able to restore the content that was wrongly removed as illegal. Complaints must not be decided automatically and reasons must be given.
  • Out-of-court dispute resolution must be possible in connection with decisions by the platform to block posts, and users must be informed of this (for example, in the terms of use and in connection with the redress mechanism). Dispute resolution shall then generally take place within 90 days.
  • Trusted flaggers: upon request, trusted flaggers can be designated by the member state coordination body, whose reports of illegal content or goods are given priority. These flaggers must have special expertise in identifying illegal content, be independent of the platforms and publish annual reports.
  • Transparency of recommender systems (suggestion algorithms): The parameters used by the recommender system and the ways to influence them must also be set out in the general terms and conditions.
  • Blocking, after prior warning, of users who repeatedly provide obviously illegal content.
  • Transparency of online advertising by labeling it and indicating the (legal) person on whose behalf the advertising is displayed. In addition, it must be stated who is paying for the advertising and why it is being displayed to the user. The use of dark patterns, in which users are steered towards certain decisions through targeted design or suggestions (e.g. by visually pushing a certain choice to the fore), is also not permitted. In particular, rejecting cookies must not be made more difficult than accepting them.
  • Prohibition of advertising aimed specifically at children: Operators of online platforms should prevent such advertising through policies and/or technical setups.
  • Transparency reporting requirements: In addition to the general transparency reporting obligations, the number and outcome of disputes before the dispute resolution body, suspensions of user accounts, and the number of monthly active users must be reported every six months.
  • Additional obligations for online platforms that allow the conclusion of distance contracts between consumers and businesses (marketplaces): verification of third-party providers (know-your-customer) and their traceability. B2C online marketplaces must collect and, where possible, verify contact and payment data as well as proof of identity and, if applicable, the registration number of third-party providers (a minimal data sketch follows this list). If this is not successful, the entrepreneur is to be blocked or removed. Compliance by design is intended to enable entrepreneurs to provide legally required consumer information prior to the conclusion of a contract via the marketplace (e.g., product safety information). Furthermore, consumers may have to be informed about illegal goods and services offered via the marketplace.
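
The following minimal sketch, again with assumed field names, illustrates the kind of trader record a B2C marketplace could keep for this know-your-customer check. It is not a prescribed data model; only the categories of information (contact and payment data, proof of identity, registration number) are taken from the obligations described above.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class TraderRecord:
    """Data points a marketplace could collect from a third-party provider
    before admission; field names are illustrative assumptions."""
    name: str
    address: str
    email: str
    phone: str
    payment_account: str
    identity_document_ref: str                   # reference to the submitted proof of identity
    trade_register_number: Optional[str] = None  # registration number, if applicable
    verified: bool = False                       # set once the data has been checked

def admit_trader(record: TraderRecord) -> bool:
    """Sketch of a gate: providers with incomplete or unverified data are not
    admitted (or are blocked/removed) until the information is provided."""
    required = [record.name, record.address, record.email,
                record.payment_account, record.identity_document_ref]
    return all(required) and record.verified

trader = TraderRecord(name="ACME GmbH", address="Example Str. 1, Berlin",
                      email="shop@acme.example", phone="+49 30 000000",
                      payment_account="DE00 0000 0000 0000 0000 00",
                      identity_document_ref="id-scan-4711", verified=True)
print(admit_trader(trader))  # True once all required data is on file and verified
```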

In addition, online trading venues must take care not to give the impression that they themselves are contracting parties to the brokered transaction, as the liability privilege for content posted by users would then no longer apply.

IV. Very Large Providers (Art. 33 to 43 DSA)

For very large providers of digital intermediary services and search engines with more than 45 million monthly active users in the EU ("very large online platforms", VLOPs), there are additional comprehensive obligations. The reason for this is the considerable additional risks associated with use by very large numbers of people. However, it is estimated that only around 30 companies are currently covered.

  • Obligation for risk management and crisis response: In order to prevent misuse and systemic risks, very large platforms must carry out regular risk analyses. Systemic risks include the dissemination of illegal content, adverse effects on the exercise of certain fundamental rights, on electoral processes or on public safety, or serious adverse consequences for the physical and mental well-being of individuals. Where required, appropriate measures must be taken to mitigate these risks. Risk management systems must be reviewed annually by an independent body. In addition, an internal, independently managed compliance function must be established. In the event of a crisis, very large providers may be required by the Commission to cooperate and take countermeasures.
  • Recommendation systems: Users must be given the opportunity to reject recommendations based on profiling by selecting alternative recommendation mechanisms.
  • Archiving of online advertising on very large online platforms and search engines for at least one year.
  • Data sharing with regulators and the research community to verify compliance with the DSA and to track the evolution of online risks.
  • Transparency reports must also include information on the people used to moderate content, their qualifications and language skills, and indicators of accuracy, as well as average monthly user numbers broken down by member state.
  • Codes of conduct: The Commission can develop codes of conduct (such as on online advertising and accessibility) to mitigate risks in the use of the platforms.

Very large services must implement these obligations just four months after being designated as such.

E. Conclusion & Outlook

By February 17, 2023, all(!) online platforms must publish the number of active users of their platform and report it to the EU Commission. Publication must then be repeated every six months. Operators of online marketplaces and social networks must therefore urgently check whether they are covered by the provisions of Art. 19 et seq. DSA and are obliged to report.

If, on this basis, the Commission concludes that a platform qualifies as a very large platform, that platform must meet particularly strict requirements as early as four months after being designated as such and carry out the first of the risk assessments that must then be performed annually.

For all others, the obligations outlined above must be implemented within just over a year, by February 17, 2024. Implementation should begin as soon as possible, as measures such as the introduction of reporting mechanisms or the adaptation of terms of use, which affect almost all companies offering online intermediary services, can hardly be implemented in a legally secure manner without legal expertise.

We are happy to support you with our DSA compliance package, individualized terms of use for intermediary services or forms for reporting illegal content.

 
