
The UK is working on rules for training AI models with creative work


British ministers are drawing up plans to increase transparency about how tech companies train their artificial intelligence models, after the creative industries raised concerns that their work is being copied and used without permission or payment.

Culture Secretary Lucy Frazer told the Financial Times that the government would make its first attempt to create rules on AI groups' use of material such as television programmes, books and music.

Frazer said ministers would initially focus on ensuring greater transparency about what content is being used by AI developers to train their models, which would allow the industry to detect whether the work it produces is being ripped off.

Rishi Sunak's government is caught between competing goals: strengthening the UK's position as a global centre for AI and protecting the country's world-leading creative industries. A general election expected this year, with Sunak's Conservatives trailing in opinion polls, is also likely to limit how much ministers and civil servants can get done.

Frazer said in an interview that she recognised that AI represents a “massive problem not just for journalism, but also for the creative industries”.

“The first step is simply to be transparent about what they [AI companies] are using. [Then] there are other issues that are of great concern to people,” she added. “There are questions about opt-in and opt-out [for content use] and compensation. I am working with the industry on all of these things.”

Frazer declined to say what mechanisms would be needed to create more transparency so that rights holders can establish whether content they produced has been used as an input for AI models.

Greater transparency around the rapidly evolving technology would make it easier for rights holders to pursue intellectual property infringements.

People close to the work said the government would aim to put forward proposals before an election expected in the autumn. Asked about the timing, Frazer said she was “working with the industry on all of these things”.

Executives and artists in the music, film and publishing industries fear their work is being unfairly used to train AI models developed by tech companies.

Last week, Sony Music wrote to more than 700 developers asking them to disclose all the sources used in their AI systems. In a strongly worded letter, the world's second-largest music company said it was opting out of any use of its music in relation to the training, development or commercialisation of AI systems.

The EU is already preparing to introduce similar rules under its AI Act, which will require developers of general-purpose AI models to publish a “sufficiently detailed” summary of the content used for training and to put in place a policy to comply with the bloc's copyright law.

In contrast, the United Kingdom has been slow to develop similar rules. Officials have acknowledged a tension between the department's ambition to attract fast-growing AI companies to the UK with a lenient regulatory environment and ensuring that companies in the creative industries are not exploited.

An attempt to create a voluntary code of practice between rights holders and AI developers failed last year, forcing the government to rethink its next steps.

Frazer said the government wanted to create a “framework or policy” around transparency, but noted that “very complex international issues are moving quickly”. She said the UK needed to make sure it had “a really dynamic regulatory environment”.
