Not the end of creativity — but how can we protect artists in the Age of AI

“Thievery on a high scale.”
That’s how Sir Elton John described the government’s apparent willingness to let tech giants mine artists’ work to feed AI programs — without permission, payment, or transparency.
He’s not wrong.
This month, the House of Lords voted overwhelmingly in favour of amending the Data Bill to protect creators’ rights in the age of artificial intelligence.
Days later, the House of Commons rejected it.
Let’s be clear: this isn’t just a legislative hiccup.
It’s a cultural fault line.
The government’s decision to resist transparency for AI firms isn’t a neutral act — it’s a message to artists, writers, and musicians that their work can be quietly harvested in the name of innovation.
But this isn’t inevitable.
Nor is it the end of creativity.
It’s a moment to pause — and choose a better path forward.
What’s really going on?
Generative AI models — like those used to write articles, create music, or generate visual art — are trained on unimaginably large datasets, often pulled from publicly available Internet content.
That sounds harmless until you realise that “publicly available” is not the same as “public domain.”
If a song is posted online, it’s not free for repurposing — any more than hanging a Picasso in a window makes it public property.
The current legal grey area allows tech companies to hoover up everything they can access and say, “we’re not copying — we’re training.”
But when the result is AI-generated work that mimics a singer’s voice or borrows a playwright’s tone, the line between learning and theft gets very blurry.
A real threat to real people
If you’re Elton John or Paul McCartney, you might have the means to fight back.
But if you’re a 23-year-old singer just releasing your debut EP? You don’t.
You may never even know your style has been used to train a system that can now generate “you-ish” music on demand — owned by a corporation, credited to no one.
This isn’t fearmongering.
It’s already happening.
And it risks hollowing out the economic model of creative work — the very thing that allows artists to dedicate time, care, and craft to their work in the first place.
We don’t have to choose between art and AI
What’s missing from the government’s current approach is nuance.
This isn’t a battle between Luddites and innovators.
It’s about regulation that ensures technology evolves in partnership with — not at the expense of — the humans it depends on.
We need to stop framing AI development as a zero-sum game.
The real question is: can AI and creativity thrive together without one cannibalising the other?
The answer is yes — but not without new rules.
Five steps towards a fair future
Let’s turn this moment of anger into a plan of action. Here’s how we begin:
- Transparency Requirements: Companies must disclose what data is used to train their models. Secret datasets undermine trust.
- Opt-In Rights for Creators: Artists should give explicit permission before their work is used in training — just as they would for licensing or sampling. (A sketch of how these first two steps might work in practice follows this list.)
- Royalties for Training Use: If your song, book, or script helps power an AI model, you deserve compensation — just as with streaming or broadcast rights.
- AI Disclosure Labelling: Content generated by AI should be clearly labelled as such — not hidden, not passed off as human-made.
- A Creative AI Standards Council: Bring together creators, technologists, ethicists, and lawmakers to set transparent rules — before the tech gets too far ahead.
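To make the first two proposals concrete, here is a minimal sketch, in Python, of what an opt-in check inside a training-data pipeline might look like. It is purely illustrative: the consent registry, the Work record, and its fields are hypothetical stand-ins for whatever registry or metadata standard legislation might eventually mandate.

```python
# Illustrative sketch only. The consent registry and the Work record are
# hypothetical stand-ins for whatever consent infrastructure legislation
# might eventually mandate.
from dataclasses import dataclass


@dataclass
class Work:
    creator: str   # whoever holds the rights
    title: str
    url: str


# Hypothetical opt-in registry: only creators listed here have granted
# explicit permission for their work to be used in training.
consent_registry: set[str] = {"artist_who_opted_in"}


def may_train_on(work: Work) -> bool:
    """Return True only if the creator has explicitly opted in.

    Note the default: absence from the registry means no. That is the
    opposite of today's de facto rule, under which anything reachable
    online is treated as fair game.
    """
    return work.creator in consent_registry


def build_training_set(candidates: list[Work]) -> list[Work]:
    """Keep only opted-in works, and log the decision for auditing."""
    included = [w for w in candidates if may_train_on(w)]
    excluded = len(candidates) - len(included)
    # Transparency requirement: record what was used and what was refused,
    # so the dataset can be audited rather than kept secret.
    print(f"included {len(included)} works, excluded {excluded}")
    return included
```

The design choice worth noticing is the default: a work absent from the registry is excluded, which reverses today's de facto rule that anything reachable online is fair game, and the running tally gives regulators something to audit.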
Artists aren’t anti-AI — they’re anti-exploitation
Plenty of artists are curious about collaborating with AI.
There are incredible opportunities here — but they must be built on consent, credit, and compensation.
To frame this as a fight between “the future” and “the past” is disingenuous.
Art is the future.
And unless we protect the conditions that make it possible, we’re not building a future at all — we’re just burning the past to feed the algorithm.
A call to the government — and to us all
Britain’s creative industries are world-class.
They’re also under threat.
The government must stop cosying up to Silicon Valley and start standing up for British talent — young and old.
That means real legislation, not vague consultations.
AI should not be an excuse to gut the cultural commons.
If we get this right, we can lead the world in ethical, human-centred innovation.
If we get it wrong, we’ll sell out the soul of our culture for a quick technological fix.
Now is the moment to decide.
Declaration of Interest: This article includes insights provided by OpenAI’s ChatGPT, a generative AI system. The model itself was trained on publicly available data, some of which may include creative works under copyright, raising precisely the issues explored in this piece.