There are decades when nothing happens (as Lenin is – wrongly – supposed to have said) and weeks when decades happen. We’ve just lived through a few weeks like that. We’ve known for decades that some American tech companies were problematic for democracy because they were fragmenting the public sphere and fostering polarisation. They were a worrying nuisance, to be sure, but not central to the polity.
And then, suddenly, those corporations were inextricably bound into government, and their narrow sectional interests became the national interest of the US. Which means that any foreign government with ideas about regulating, say, hate speech on X, may have to deal with the intemperate wrath of Donald Trump or the more coherent abuse of JD Vance.
The panic that this has induced in Europe is a sight to behold. Everywhere you look, political leaders are frantically trying to find ways of “aligning” with the new regime in Washington. Here in the UK, the Starmer team has been dutifully doing its obeisance bit. First off, it decided to rename Rishi Sunak’s AI Safety Institute as the AI Security Institute, thereby “shifting the UK’s focus on artificial intelligence towards security cooperation rather than a ‘woke’ emphasis on safety concerns”, as the Financial Times put it.
But, in a way, that’s just a rebranding exercise – sending a virtue signal to Washington. Coming down the line, though, is something much more consequential; namely, pressure to amend the UK’s copyright laws to make it easier for predominantly American tech companies to train their AI models on other people’s creative work without permission, acknowledgment or payment. This stems from recommendation 24 of the AI Opportunities Action Plan, a hymn sheet written for the prime minister by a fashionable tech bro with extensive interests (declared, naturally) in the tech industry. I am told by a senior civil servant that this screed now has the status of holy writ within Whitehall. To which my response was, I’m ashamed to say, unprintable in a family newspaper.
The recommendation in question calls for “reform of the UK text and data-mining regime”. It rests on a breathtaking assertion: “The current uncertainty around intellectual property (IP) is hindering innovation and undermining our broader ambitions for AI, as well as the growth of our creative industries.” As I pointed out a few weeks ago, representatives of these industries were mightily pissed off by this piece of gaslighting. No such uncertainty exists, they say. “UK copyright law does not allow text and data mining for commercial purposes without a licence,” says the Creative Rights in AI Coalition. “The only uncertainty is around who has been using the UK’s creative crown jewels as training material without permission and how they got hold of it.”
As an engineer who has sometimes thought of IP law as a rabbit hole masquerading as a profession, I am in no position to assess the rights and wrongs of this disagreement. But I have academic colleagues who are, and last week they published a landmark briefing paper, concluding: “The unregulated use of generative AI in the UK economy will not necessarily lead to economic growth, and risks damaging the UK’s thriving creative sector.”
And it is a thriving sector. In fact, it’s one of the really distinctive assets of this country. The report says that the creative industries contributed approximately £124.6bn to the UK’s economy in 2022, or about 5.7% of the total, and that for decades the sector has been growing faster than the wider economy (not that this would be difficult). “Through world-famous brands and production capabilities,” the report continues, “the impact of these industries on Britain’s cultural reach and soft power is immeasurable.” To take just one sub-sector: the UK video games industry is the largest in Europe.
There are three morals to this story. The first is that the stakes here are high: get it wrong and we kiss goodbye to one of “global” Britain’s most vibrant industries. The aim of public policy should be to build a copyright regime that respects creative workers and engenders confidence that AI can be deployed fairly, to the benefit of all rather than just of tech corporations. It’s not just about “growth”, in other words.
The second is that any changes to UK IP law in response to the arrival of AI need to be carefully researched and thought through, and not implemented on the whims of tech bros or of ministers anxious to “align” the UK with the oligarchs now running the show in Washington.
The third comes from watching Elon Musk’s goons mess with complex systems that they don’t think they need to understand: never entrust a delicate clock to a monkey. Even if he is as rich as Croesus.
What I’ve been reading
The man who would be king
Trump As Sovereign Decisionist is a perceptive guide by Nathan Gardels to how the world has suddenly changed.
Technical support
Tim O’Reilly’s The End of Programming As We Know It is a really knowledgeable summary of AI and software development.
Computer says yes
The most thoughtful essay I’ve come across on the potential upsides of AI by a real expert is Machines of Loving Grace by Dario Amodei.