UK looks at forcing greater transparency on AI training models
Tech companies in the UK face being forced to open up their artificial intelligence models to greater scrutiny in an attempt to help the creative industries stop work being ripped off or reproduced without compensation.
In a consultation announced on Tuesday, the UK government will offer an exemption to copyright laws, letting tech companies use material ranging from music and books to media and photos to train their AI models unless the rights holder objects under a so-called “rights reservation” system.
The plans to open up copyrighted material for training purposes are likely to anger many in the creative industries, with executives warning that the UK is at risk of undermining one of the country’s largest and most successful drivers of economic growth.
Having in effect to “opt out” of the use of their work in AI models could be costly, difficult to monitor and time-consuming for artists and creatives, they argue.
However, the consultation will also alarm parts of the tech sector, given the plans include AI firms having to be more transparent on the data they use to train models and on how the content they then generate is labelled.
The UK government said on Tuesday that tech companies could be “required to provide more information about what content they have used to train their models . . . to enable rights holders to understand when and how their content has been used in training AI”.
Copyright holders would then be able to use this information to strike licensing deals more easily under the plans.
In an interview with the Financial Times, culture minister Sir Chris Bryant said the government would force through transparency over both AI input and output — making clear what a model was trained on, and whether something was produced by AI.
He argued that the system would need to be easy to use by the creative industries.
“This can deliver for the creative industries if we get this right. All these parts are contingent on each other. We want to be able to deliver legal clarity and legal certainty because both sides say that doesn’t exist at the moment,” he said.
Bryant added: “AI companies have said to us very, very clearly that they want to do more business in the UK, but they can’t . . . they’re just so nervous about the legal uncertainty. But it is a quid pro quo. They get that certainty, but only if they can create a system of rights reservation that genuinely works.”
Officials say the consultation will seek opinions on areas such as enforcement, which could include legislation or a regulator to oversee the sector, as well as on what technical systems are needed to make a rights reservation regime work.
They argue that uncertainty about how copyright law functions can make it difficult for creators to control or seek payment for the use of their work, and creates legal risks for AI firms.
Executives in the creative industries have concerns about rights reservation, given the risk that overseas AI companies will not disclose what material they are using and will not compensate copyright holders if they are discovered to have exploited work.
A previous attempt to agree a voluntary AI copyright code of practice was unsuccessful this year, but Bryant is hopeful that the government can find a balance that benefits both sides.
The government on Tuesday said “further work with both sectors would be needed to ensure any standards and requirements for rights reservation and transparency are effective, accessible, and widely adopted”.
It added: “These measures would be fundamental to the effectiveness of any exception, and we would not introduce an exception without them.”