Perspective

The UK Struggles to Balance AI Innovation and Creative Protection

Audrey Hingle / Jun 26, 2025

Audrey Hingle is the Editor-in-Chief of The Internet Exchange.

Perhaps nowhere is the tension between content creators and generative AI technologies playing out more acutely than in the United Kingdom. Home to a world-renowned creative sector that contributes £126 billion per year to the economy — roughly 5.7% of GDP — and supports 2.4 million jobs, the UK has long punched above its weight in music, publishing, design, film, advertising, and gaming. These industries rely on robust intellectual property protections and human creativity, making them especially vulnerable to AI systems that are trained on vast amounts of copyrighted content, often without permission or compensation.

At the same time, the UK sees leadership in AI as essential to its economic future. The tech sector contributes over £150 billion to the economy and employs 1.7 million people. Policymakers are positioning AI as a cornerstone of digital innovation, foreign investment, and economic growth. But the UK faces a strategic bind: it is difficult to compete with the scale and dominance of US and Chinese tech giants. This puts added pressure on the government to make the UK an attractive destination for AI development. The recent passage of the Data Use and Access Bill without an artist-backed amendment requiring disclosure of copyrighted training data and the now-completed consultation on AI and copyright highlight the ongoing challenge of balancing tech-sector growth with protections for the creative industries.

Recent policy flashpoints

The Data Use and Access Bill (June 2025)

The Data Use and Access Bill was introduced to modernize the UK’s data infrastructure and encourage innovation across sectors, including AI. During its passage through Parliament, an amendment championed by artists and Baroness Beeban Kidron proposed requiring AI developers to disclose whether copyrighted materials were used in training datasets. Supporters saw this as a basic measure of transparency and accountability. Opponents, including industry lobbyists, argued it would deter AI investment by creating regulatory uncertainty and compliance burdens. Sir Nick Clegg, former president of global affairs at Meta, argued that asking permission from all copyright holders would "kill the AI industry in this country." The amendment was ultimately excluded from the final legislation, a decision that many in the creative sector interpreted as the government siding with tech firms over creators.

The Copyright and AI Consultation (Dec 2024–May 2025)

In parallel, the UK government conducted a consultation to clarify how copyright law applies to AI training. The consultation proposes allowing AI models to be trained on copyrighted material unless individual rights holders choose to opt out, with developers required to be transparent about their training data and creators given a way to reserve their rights or license their work:

The consultation also proposes new requirements for AI model developers to be more transparent about their model training datasets and how they are obtained. For example, AI developers could be required to provide more information about what content they have used to train their models. This would enable rights holders to understand when and how their content has been used in training AI.

Many creators argued that this placed the burden of enforcement on individuals, rather than establishing a meaningful consent framework. The consultation attracted widespread public interest, including a silent protest album released by more than 1,000 musicians as a symbolic rejection of the proposals. While the government framed the consultation as an effort to balance competing needs, critics saw it as further evidence that AI growth is being prioritized over creative rights.

What makes this so difficult: competing pressures and real trade-offs

The UK is caught between two strategic priorities that increasingly pull in opposite directions. On one side is the pressure to stay competitive in a global AI race, where firms choose jurisdictions based on regulatory clarity, access to data, and innovation-friendly policies. On the other is a creative sector that generates significant economic value and global cultural influence, but whose future depends on enforceable rights and control over how creative work is used.

Copyright law was not built for machine learning. Terms like “fair use,” “opt-out,” and “transparency” lack consensus and are hard to enforce across borders. With the EU and US pursuing diverging approaches, the UK lacks a clear model to follow. Decisions made now will set precedents not just for who benefits from AI, but for whose work is valued and protected.

Where this might go next

The government is expected to respond to the consultation later this year, a decision that could shape future regulation or guidance on AI training data. Some experts are calling for an opt-in system that centers consent rather than placing the burden of opting out on creators. Others (most vocally Nick Clegg) warn that anything more restrictive than the current approach could stifle AI innovation in the UK.

The UK has a decision to make: whether to prioritize tech growth at all costs, or to build a more balanced framework that values creativity as much as computation. This could be a chance to lead internationally by developing a workable licensing framework that protects rights holders and provides legal certainty for developers. It will take policy creativity to move beyond the binary of Big Tech vs Big Content, but getting it right could set a meaningful precedent for the future.

