EU AI Act explained: What does it mean for music producers and artists?

Thinking of using AI tools in your music? Here’s what you need to know.


Image: Omar Marques/SOPA Images/LightRocket via Getty Images

Jonathan Coote is a music and AI lawyer at Bray & Krais.

AI music tools might be controversial right now, but almost two-thirds of young creatives are embracing AI in their music making, according to a recent study. While AI music generation software has led to innovative creative opportunities, the legal landscape is still hotly contested with new disputes arising almost every week.

On May 21, 2024, the EU Council approved the EU AI Act, which could have a significant global impact on the music industry. Below, we set out some of the key legal questions raised by AI for the music industry and how different jurisdictions are dealing with them, before flagging the biggest risks from a legal perspective when using AI to make music, particularly for music producers and engineers.

There are three main legal questions raised by the use of AI in the music industry, which we take in turn below.

Image: wundervisuals via Getty Images

Train in Vain?

There is a huge public debate at the moment about the use of music (and other content) in the training and development of AI tools. Put simply, most AI-generated music tools train and learn from existing human-made music. While some have secured licences to do so, there are various lawsuits against AI companies, alleging that they have trained their tools without getting a licence first. To make the matter more complex, the laws on this topic are different around the world.

In the EU, copyright law restricts training on copyright works for commercial purposes without a licence where rightsholders have ‘opted out’. Under the new EU AI Act, providers of so-called ‘general purpose’ AI models on the EU market will not only have to abide by these rules but must also demonstrate their compliance by providing transparency on their training process, seemingly even if the training was conducted outside the EU. This could have profound implications for AI companies worldwide – many of whom have trained their tools in countries with more lenient rules, such as the US.

In the UK, the government decided not to bring in the EU’s ‘opt-out’ approach post-Brexit, so the law currently appears to prevent all training on copyright works for commercial purposes without a licence. However, there is a major case going through the English courts at the moment (Getty Images v Stability AI), which would be the first time the law is actually tested in practice.

Before that reaches trial, though, the law may have already changed.

The UK government has been unable to update the law, and its attempts to set up a voluntary system led to failed talks between the tech industry and rightsholders. Prior to the calling of the general election, Culture Secretary Lucy Frazer suggested that change would be coming, although it now remains to be seen whether either of the main parties will commit to regulating AI should they form the next government.

In the US, there is no specific law on this issue and so there is even more uncertainty. Perhaps unsurprisingly, there are currently a large number of lawsuits on this issue involving AI tool providers such as OpenAI, Stability AI and Anthropic.

Image: M Stock via Getty Images

Fake it till you … take it down

Deepfakes and voice clones have taken the internet by storm in the past 12 months. However, this is a complex area of law, and in most countries there is no specific right protecting artists and celebrities (unless the use amounts to a false endorsement). So far, online takedowns have been successful, but the process isn’t as easy or straightforward as the music industry would like.

The EU AI Act is set to introduce a transparency obligation requiring deepfakes, including voice clones, to be identified as such. This won’t stop their circulation entirely, but it should at least mean that consumers can clearly spot a deepfake when they happen across one. However, questions remain about the impact on more creative or parodic uses. There is therefore still a gap for a specific ‘digital representation’ right, as has been considered in the US and proposed by UK Music and the All-Party Parliamentary Group on Music in the UK.

Do androids dream of electric beats?

You may be surprised to learn that music created with AI tools might not actually be protected by copyright at all. This has been the position taken so far by the US Copyright Office, and is probably similar in the UK based on the current law — although there’s the added complication that, in the UK, sound recordings appear to be protected regardless of whether they are made by a human or an AI tool.

Though the output of an AI tool based on a minimal prompt may not be protected, if this is then edited or modified (e.g. AI-generated chords to which you add a melody and lyrics), it is likely that the final version will be protected.

Image: Drablenkov via Getty Images

What to think about when using AI tools in your music creation

  • Music created purely with AI from a basic prompt may not be protected by copyright. This is important, as you may be asked to warrant in agreements with other parties in the industry (e.g. record labels, publishers or distributors) that you own all the rights to the music. To try to secure some level of copyright protection, you should adapt and modify what the AI produced in a creative way.
  • Check to see what rights you have to use anything you create using an AI tool. Do you own it or do you license it? Are you limited to “non-commercial” use, unless you pay a fee? If so, this probably means that you can’t upload it to a streaming service, for example.
  • Avoid uploading or distributing unlicensed voice clones and deepfakes of real people — unless you would like a letter from their lawyers!
  • Many producers are increasingly using separation tools to extract stems to create new mixes or to find new samples. Remember that in almost every case, you still need a licence to use a sample (even if you’ve extracted the stem from an existing mix).
© 2024 MusicTech is part of NME Networks.