UK music industry needs urgent guidance on AI training rules

Partner Simon Goodbody and Associate Jonathan Coote have written for The Times, calling for clarity in the government’s policy on training AI models on copyrighted works.

Simon and Jonathan’s article was published in The Times on 28 March 2024 and can be found here.

On 13 March, the world’s first major AI regulation was passed (resoundingly) in the EU. The UK, however, is stuck between a pro-innovation approach and a pro-rightsholder one, leaving it seemingly unable to implement effective regulatory control. Left in limbo, the music industry, which generated £6.7bn for the UK economy in 2023, desperately needs clarity on the law relating to the training of AI tools on copyright works.

Although untested, the current statutory framework in the UK suggests that the training of AI tools on copyright works for commercial purposes is prohibited. However, in 2022, the government controversially announced plans to provide a complete exception for training. This about-turn would have created one of the world’s most lenient regimes. Despite aligning with the UK’s pro-tech National AI Strategy, the proposed exception was soon withdrawn after being described as “music laundering” by rightsholders.

The government then established a working group to develop a system for AI companies to obtain a “fair licence”. Perhaps unsurprisingly, given the diametrically opposed interests of tech and rightsholders, these talks failed early this year. So, it seems, we are back at square one.

In the absence of statutory clarification, tech companies and rightsholders are heading to court on both sides of the Atlantic. Most notably in England, stock-image site Getty Images has sued Stability AI for training its models on Getty’s massive image library. Whilst the UK’s prohibition on the training of AI tools on copyright works seems clear, establishing that infringing activity has taken place is complex. For example, proving that infringing acts occurred in the UK is challenging and, as Stability AI’s training data is not public, Getty had to reverse-engineer prompts to demonstrate that its works were used (sometimes with comical results). Stability AI has failed to strike out the case, but the strength of Getty’s arguments is uncertain.

We are now in a race for legislation to arrive before judgment is handed down. Without a change in the law, the case could set an unhelpful precedent based not on the balance of creative and economic interests, but on its particular facts and on law written before large language models and generative AI existed.

The music industry is especially concerned. Contrary to expectation, there is no specific right in the UK to prevent the use of voice clones of artists, and AI-generated music is already undercutting human-made music as it is mass-uploaded to streaming platforms. The most effective way to tackle these issues is for rightsholders to be able to control the training of AI tools on their works and to be paid for it.

There is progress, with an All-Party Parliamentary Group investigating and a private member’s bill in the Lords, but the government urgently needs to act to inject some clarity into the rights position.
