Article, 03-15-2024

Update Data Protection No. 173

EU digital law: Formal agreement in the EU Parliament on the AI Regulation, the Cyber Resilience Act and the European Media Freedom Act

The EU is making progress with the implementation of its new digital legislation, in particular on the regulation of artificial intelligence, the IT security of connected products and media freedom. This week, the EU Parliament reached agreement on three important pieces of legislation. On the one hand, it gave formal approval to the regulation on artificial intelligence (the "AI Act"; we reported in Data Protection Updates No. 168, 162, 146 and 121) and to the Cyber Resilience Act (the "CRA"; we reported in Data Protection Updates No. 160, 132 and 118), as well as to the complete replacement of the Product Liability Directive. On the other hand, the EU Parliament approved the European Media Freedom Act (the "EMFA"), which is intended to safeguard press freedom and the independence of media service providers in the EU.

A. AI Act

The content of the world's first law on the regulation of artificial intelligence, which goes back to a 2021 proposal by the EU Commission, was debated until the very end. The question of a ban on real-time biometric surveillance in particular was at the heart of numerous discussions during the legislative process. In principle, real-time remote biometric identification in public spaces without cause now remains prohibited, as do systems for social scoring or for the emotion recognition of employees. However, there are numerous exceptions, such as for the "foreseeable risk of a criminal offense", and retrospective biometric surveillance also remains possible. In addition, AI for facial recognition may continue to be used, at least for border controls, and the regulation's transparency obligations are not intended to apply to the sensitive areas of law enforcement, migration, border control and asylum, which are, however, less relevant for private companies.

However, the member states may issue stricter rules that go beyond the AI Act. Germany, for example, has already announced a ban on real-time biometric surveillance, while the FDP even wants to prevent retrospective biometric surveillance, for example for law enforcement purposes. The use of AI for emotion recognition is also viewed rather critically by the parties of the German federal government, the so-called "traffic light" coalition (Ampel-Regierung). It therefore remains to be seen whether the catalog of prohibited AI systems will be expanded for Germany.

The regulation of general-purpose AI (GPAI), such as ChatGPT, has also been the subject of debate. Despite the efforts of many lobby associations to the contrary, GPAI remains subject to special requirements, for example with regard to transparency and the quality of the data used.

Overall, the differences from the agreement reached in the trilogue in December, which the representatives of the EU member states also formally approved in February, are nevertheless smaller than expected, meaning that companies can build on implementation efforts already underway. All others should act now at the latest in order to avoid fines of up to EUR 35 million or 7% of the previous year's global turnover.

B. Cyber Resilience Act (CRA)

The CRA is aimed at manufacturers, importers and distributors of connected products such as IoT devices, laptops, microchips or smartwatches, and the software associated with them. Manufacturers of such so-called "products with digital elements" will be obliged to ensure the IT security of these products throughout their entire life cycle. Importers and distributors will also be subject to obligations that are familiar in principle from product safety law, but which are completely new with regard to IT security in particular and will therefore pose new practical challenges. There are hardly any changes compared to the political agreement on the CRA in December, meaning that the scope of the obligations to be imposed on the economic operators involved is already largely known. A central provision of the CRA is the privileged treatment of open-source software. However, the scope of this exemption is likely to require detailed examination in many cases, given its wording and the condition that "no commercial purposes are pursued".

Companies should also pay particular attention to whether the products with digital elements that they manufacture, import or distribute are so-called critical and important products, which, depending on their classification as Class I or Class II, are subject to different requirements for conformity assessment procedures involving so-called notified bodies. The lists of these important and critical products are regularly reviewed by the EU Commission and updated if necessary, meaning that the requirements for individual products can change at short notice without costly amendments to the text of the regulation.

The CRA also strengthens the role of the European Cybersecurity Agency, ENISA, which is to be more closely involved, particularly in the event of vulnerabilities and security incidents.

C. EMFA

Parliament also voted on the European Media Freedom Act, which aims to protect the independence of media service providers by obliging them to disclose their ownership structures and by other measures to protect editorial independence.

Not only audiovisual media services such as television and radio are affected, but also high-reach YouTube channels and other video channels on social media. The main focus of the regulation, however, is on extended powers for the authorities.

D. What happens next?

All three regulations will shortly be formally approved by the Council, published in the Official Journal of the European Union, and will enter into force 20 days after publication.

From that date, companies will have two years (previously, 36 months were under discussion) to implement all requirements of the AI Act. By way of derogation, a shorter 12-month implementation period applies to GPAI, while the rules for AI systems embedded in regulated products only apply after 36 months.

The implementation steps for the AI Act can be roughly divided into three categories, which you can find more details on in our previous data protection updates:

  1. Risk analysis: Are AI systems used, or even produced, in the company? Do they qualify as high-risk AI, AI systems for interacting with natural persons, deepfakes, AI-generated content, or systems for biometric categorization or emotion recognition?
  2. Governance: Define responsibilities and ensure human oversight of AI systems; manufacturers of high-risk AI must carry out EU conformity assessment procedures, set up quality management and risk management systems, and carry out regular tests (among other things).
  3. Information and transparency obligations: Affix the CE marking; keep the EU Declaration of Conformity, instructions for use and technical documentation available; label AI that interacts with natural persons, such as chatbots, as well as deepfakes, AI-generated content, and biometric categorization or emotion recognition systems (among others).

For the CRA, companies have 36 months from its entry into force, i.e. until mid-2027, to implement all requirements – a manageable timeframe given the sometimes long product development cycles.

Implementation could follow these steps:

  1. Identification: Which current and planned products contain digital elements? Are they critical or important products of Class I or Class II (check regularly against the Commission's lists)? (among other things)
  2. Compliance: How should the conformity assessment be carried out? Affix the CE marking, provide information and recommendations for action, and develop test procedures for product monitoring (among other things).
  3. Maintaining product security: Provide security updates (separately from functional updates) throughout the life cycle (typically five years); report exploited vulnerabilities and security incidents to the national authorities and fulfill other reporting obligations (among others) as appropriate.

The EMFA generally applies 15 months after its entry into force. However, the obligation of media service providers to take measures to ensure editorial independence applies after just nine months. The media companies concerned should therefore quickly consider what measures are necessary to ensure this and then document those measures.
