Paul McCartney, Elton John and 400 Other UK Creatives Demand Transparency on AI Model Training of Copyrighted Works

UK creative leaders, including Paul McCartney and Elton John, demand the government mandate AI transparency on copyrighted training data to protect the industry from 'mass theft'.

Over 400 leading figures from the United Kingdom’s creative industries, including music legends Sir Paul McCartney and Sir Elton John, have sent a letter to Prime Minister Keir Starmer, demanding that AI companies be legally compelled to disclose the copyrighted works used in their training data.

This unified call from artists, writers, actors, and industry leaders highlights significant concerns within the creative sector about widespread copyright infringement and its potential to erode future income, threatening the UK’s status as a global creative force. The signatories are advocating for a specific amendment to the Data (Use and Access) Bill, asserting that transparency is crucial for holding AI firms accountable for what they term “mass theft” of creative content.

The letter, organized by the Creative Rights in AI Coalition, which includes major industry bodies like the British Phonographic Industry (BPI) and the News Media Association, argues that while UK copyright law is fundamentally sound, it cannot be enforced if creators are unaware of how their work is being used.

“Copyright law is not broken, but you can’t enforce the law if you can’t see the crime taking place,” the letter states. Transparency requirements, they contend, would make the risk of infringement too high for AI companies to continue operating outside the law.

Critics of the government’s current approach, which involves studying the feasibility of an ‘opt-out’ copyright regime and transparency measures, argue these steps are insufficient and leave creators vulnerable to ongoing theft.

“Government amendments requiring an economic impact assessment and reports on the feasibility of an ‘opt-out’ copyright regime and transparency requirements do not meet the moment, but simply leave creators open to years of copyright theft,” the letter states. This delay, some reports suggest, could mean final rules might not be published until 2029. The creative leaders warn that allowing their work to be freely exploited by a few powerful overseas tech companies risks sacrificing a significant growth opportunity and the UK’s cultural influence.

“We will lose an immense growth opportunity if we give our work away at the behest of a handful of powerful overseas tech companies and with it our future income, the UK’s position as a creative powerhouse, and any hope that the technology of daily life will embody the values and laws of the United Kingdom.”

The Parliamentary Battle for Transparency

The push for mandatory transparency comes as the Data (Use and Access) Bill moves through Parliament. An amendment championed by Baroness Beeban Kidron, which would have created a marketplace for licensing copyrighted content and required transparency, was passed by the House of Lords earlier this year but subsequently removed in the House of Commons. Despite a Liberal Democrat MP reintroducing the amendment in the Commons, it was voted down on May 7, 2025.

Baroness Kidron emphasized that the direction of AI development, and who benefits from it, are critical questions of our time, stressing the vital economic contribution of the UK’s creative industries, which provide 2.4 million jobs.

“They must not be sacrificed to the interests of a handful of US tech companies,” she said. Transparency, she believes, is essential for the UK to become a global player in the AI supply chain and cultivate a thriving licensing market.

“The UK is in a unique position to take its place as a global player in the international AI supply chain, but to grasp that opportunity requires the transparency provided for in my amendments, which are essential to create a vibrant licensing market,” she added.

Cross-party support for the principle of transparency exists in the Lords. Lord Brennan of Canton, a Labour peer, stated, “We cannot let mass copyright theft inflict damage on our economy for years to come,” suggesting transparency would unlock tremendous economic growth by positioning the UK as a premier market for high-quality AI training data.

Lord Black of Brentwood, a Conservative peer, criticized the government’s timeline, arguing that “the Government amendments set us on a timeline that will not see any transparency provisions introduced until the very tail end of this Parliament at the earliest,” and that immediate transparency is necessary to protect property rights against Big Tech and stimulate the market.

Lord Clement-Jones, a Liberal Democrat spokesperson, added that transparency forms the foundation for a vibrant licensing system where creators are respected and compensated, fostering collaboration between creative and tech sectors.

Legal Challenges and Industry Responses

The UK debate unfolds against a backdrop of ongoing legal battles globally over AI training data. Major record labels, including Universal Music Group, Sony Music Entertainment, and Warner Records, have filed lawsuits against AI music generators like Suno and Udio, alleging widespread copyright infringement. The RIAA’s chief legal officer characterized these as clear instances of “unlicensed copying of sound recordings on a massive scale.”

Suno and Udio have defended their actions under the “fair use” doctrine, arguing their systems learn musical ideas to create new content rather than reproducing existing tracks. The RIAA, however, called the companies’ admission that they likely used copyrighted recordings a “major concession,” arguing such large-scale use does not qualify as fair use.

Beyond music itself, similar legal challenges are emerging. Anthropic settled a lawsuit with music publishers, agreeing to use “guardrails” to prevent its AI from generating copyrighted lyrics. However, the core issue of whether unlicensed training data use is fair use remains unsettled in that case.

Meta faces lawsuits over alleged use of pirated books, and The New York Times has sued OpenAI and Microsoft over the use of its articles. A recent hearing in Meta’s AI copyright case saw a judge signal that market harm, rather than data piracy itself, might be the key test for fair use. Court documents revealed concerns even among Meta engineers.

Alternative Approaches and Future Outlook

While legal disputes continue, some tech companies are pursuing licensing agreements. YouTube is negotiating with major record labels for rights to use music in AI training, and OpenAI has secured deals with various media outlets like TIME and Condé Nast for authorized access to written content.

The debate also involves ethical considerations and legislative efforts like the NO FAKES Act in the US, aimed at protecting voice and likeness from unauthorized AI replicas. Some AI developers, such as NVIDIA, have withheld models like Fugatto from public release due to ethical concerns about potential misuse.

A recent opinion piece in Forbes argued that the proposed amendments reveal a fundamental misunderstanding of AI development and would impose “crushing transparency obligations” that are technically unfeasible and would force companies to reveal intellectual property. The piece suggested that hasty changes should be avoided and the government’s consultation process should be allowed to conclude.

Many argue that existing “gold-standard copyright laws” are sufficient, but transparency and enforcement are lacking. The “Make it Fair” campaign was launched to fight against creative content being given away “for free” to AI firms, arguing the impact on creative businesses would be devastating if theft continues unchecked.

Award-winning producer Giles Martin stated that if Sir Paul McCartney were to write ‘Yesterday’ today, it should belong to him, and he should control the use of his voice, arguing the government is not doing enough to protect artists. He added, “If you make something, if something is yours, it shouldn’t be taken by a company and used without your permission. It’s as simple as that.”

The government maintains its objective is to deliver increased control and transparency for rights holders while ensuring access to high-quality material for training leading AI models in the UK. However, some argue the government seems more influenced by large technology companies. An ongoing public consultation on AI and copyright has received over 11,500 submissions, indicating the high level of interest and concern surrounding these issues.

Markus Kasanmascheff
Markus has been covering the tech industry for more than 15 years. He holds a Master's degree in International Economics and is the founder and managing editor of Winbuzzer.com.
