The New AI Act and its Implications for Copyright

On 13 March 2024, the European Parliament adopted the AI Act by a large majority (523 votes in favour, 46 against and 49 abstentions) following the conclusion of the trilogue negotiations. The Council still has to formally adopt the AI Act; it will then be published in the EU’s Official Journal and will enter into force 20 days after publication. In general, the AI Act will apply 24 months after its entry into force, although some provisions will apply earlier and others later. The consolidated version of the AI Act is available here.

The AI Act has already been the subject of controversial debate during the legislative process. An important issue is copyright protection. AI models are trained with large amounts of copyrighted content (text, images, music, videos, etc.). Many rights holders fear that copyright protection will be undermined if AI companies obtain content from the internet without seeking permission from rights holders or paying for its use. What the AI Act would say about copyright has therefore been eagerly awaited. Here is a first overview:

Obligations of providers of General-Purpose AI Models

Language models such as GPT-4 or Claude 3 are so-called General-Purpose AI Models. They are characterised by the fact that they display significant generality and can be integrated into a variety of downstream systems or applications (Art. 3 No. 63). Downstream applications in this sense include, for example, chatbots such as ChatGPT.

Providers of General-Purpose AI Models have the following obligations, among others:

  • They must draw up technical documentation for their model (Art. 53 para. 1 lit. a). This documentation must contain the details of the training and testing procedure listed in Annex XI of the AI Act and the results of its evaluation. It must be made available both to the AI Office and the national authorities upon request. The Commission may adopt delegated acts to amend the annexes in order to adapt the requirements to technical progress.
  • Art. 53 para. 1 lit. b) also provides for a transparency obligation that enables other providers to integrate the model into their AI application in a legally compliant manner. The minimum requirements for the information to be provided are set out in Annex VII. Providers of low-risk open-source models are exempted from the aforementioned documentation and transparency obligations in accordance with Article 53 para. 2.
  • Pursuant to Article 53 para. 1 lit. c), all providers are obliged to develop a strategy to ensure compliance with EU copyright law. In particular, the identification of and compliance with reservations of rights pursuant to Article 4 (3) of the DSM Directive (EU Directive 2019/790, implemented in Germany in Section 44b UrhG) must be ensured (an illustrative sketch of such a check follows below this list). By referring to the DSM Directive, the AI Act makes it clear that the legislator apparently assumes that the text and data mining exception applies to the training of generative AI models. The AI Office will monitor compliance without, however, assessing copyright infringements on a work-by-work basis (recital 108).
  • Art. 53 para. 1 lit. d) also requires the publication of a detailed summary of the content used for training. This is intended to make it easier for rights holders to exercise and enforce their rights. The AI Office will prepare a template. Pursuant to Art. 54 (2) (a), providers established in third countries must appoint an authorised representative established in the EU to fulfil the above obligations.
  • The Commission has delegated the enforcement of these obligations to the AI Office (Art. 88 (1)). In the event of an infringement, providers may be fined up to 3% of their total annual worldwide turnover in the preceding financial year or EUR 15,000,000, whichever is higher (Art. 101 (1)).
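
To make the copyright obligation under Art. 53 para. 1 lit. c) more tangible: Article 4 (3) of the DSM Directive allows rights holders to reserve text and data mining in respect of content made publicly available online by machine-readable means. Which signals qualify is still debated; a website's robots.txt file is one candidate that is frequently discussed in practice. The following minimal Python sketch merely illustrates the kind of check a provider's compliance strategy might include. It assumes, for illustration only, that robots.txt is treated as such a reservation signal and uses a hypothetical crawler user agent ("ExampleAIBot"); it is not a statement of what the AI Act or the DSM Directive actually require.

    # Illustrative sketch only: checks whether a hypothetical crawler
    # ("ExampleAIBot") may fetch a URL according to the site's robots.txt.
    # Treating robots.txt as a machine-readable rights reservation under
    # Article 4(3) of the DSM Directive is an assumption made for this example.
    from urllib.parse import urlparse
    from urllib.robotparser import RobotFileParser

    def tdm_use_reserved(url: str, user_agent: str = "ExampleAIBot") -> bool:
        """Return True if robots.txt disallows fetching `url` for `user_agent`."""
        parsed = urlparse(url)
        robots_url = f"{parsed.scheme}://{parsed.netloc}/robots.txt"

        parser = RobotFileParser()
        parser.set_url(robots_url)
        parser.read()  # downloads and parses the site's robots.txt

        # A disallow rule is treated here as a reservation of rights, so the
        # URL would be excluded from the training corpus.
        return not parser.can_fetch(user_agent, url)

    if __name__ == "__main__":
        example = "https://example.com/articles/some-text.html"
        if tdm_use_reserved(example):
            print("Reservation signalled - exclude from training data.")
        else:
            print("No reservation signalled via robots.txt for this user agent.")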

Supervision

Even before the adoption of the AI Act, the Commission decided on 24 January 2024 to establish the European Artificial Intelligence Office. The Office, which is part of the Directorate-General for Communications Networks, Content and Technology, will play a central role in the implementation of the AI Act as part of its governance architecture. Around 100 posts are currently being filled.

Member States are also required to designate or establish national competent authorities within 12 months of entry into force (Article 70). In Germany, the Federal Network Agency (BNetzA) and the Federal Office for Information Security (BSI) are under discussion, in addition to the federal and state data protection authorities.

The supervisory structure will be complemented by a European Artificial Intelligence Board. The Board shall be composed of one representative per Member State. With regard to the above-mentioned providers, it is to advise the Commission and the Member States on the enforcement of the AI Act (Art. 66 lit. c).

Open questions

As a result, the AI Act contains only a few provisions on copyright; it is primarily a set of rules on product safety law. Recital 108 clarifies that the enforcement of other copyright provisions of Union law remains unaffected. However, it is unclear how this relates to the principle of territoriality in copyright law, according to which the copyright law of the country where the copyright-relevant act takes place applies. If the training of an AI model takes place outside the EU, e.g. in the US, foreign law would then apply. Irrespective of these copyright-law considerations, however, the AI Act obliges providers to implement a compliance strategy that also addresses the handling of rights reservations, under threat of fines. It remains to be seen how this tension will be resolved. Recital 106 could imply that, contrary to the principle of territoriality, providers may be subject to the European legal regime of the DSM Directive and must respect reservations of rights even if they are not required under the relevant foreign law.

(28 March 2024)