Google is making changes to its search engine to fight deepfake pornography as the technology industry grapples with the far-reaching societal impacts of generative artificial intelligence.
Advances in generative artificial intelligence are making fake images more realistic and easier to create, leading experts to warn that depicting people in pornographic images without their knowledge or consent is becoming increasingly common.
Among the measures introduced by the tech giant on Wednesday are changes that will make it easier for victims of deepfake porn to have videos and pictures of themselves removed from the web.
Currently, individuals must submit a removal request for every website address, or URL. Under the latest changes, explicit results will also be excluded from related searches that include the individual's name. The search giant will also downgrade the rankings of websites that have received a high number of removal notices.
“When someone successfully removes an image from Search under our policies, our systems also scan for and remove any duplicates of that image that we find,” Google said in a blog post on Wednesday.
Companies such as Google, Meta and X have been working to combat deepfakes on their platforms – images, videos and audio files that can be created using artificial intelligence to resemble private and public figures. Last week, Meta's independent Oversight Board called on the company to tighten its rules for removing deepfake porn.
“We are in the midst of a technology shift,” said Emma Higham, a product manager at Google who has been involved in the company's fight against deepfakes. “As we monitor our own systems, we've seen a rise in the number of takedown requests for this type of content.”
The issue is now coming into regulators' focus: the UK's Online Safety Act, considered one of the strictest such laws, passed in October and prohibits the distribution of non-consensual pornographic deepfakes. In the US, laws in several states target those who create and share explicit deepfake content.
Clare McGlynn, a law professor at Durham University who studies pornography regulations, said the search engine had been slow to act to stop such content from spreading online.
“Google's delay in taking these obvious and vital steps to reduce deepfake sexual abuse is inexcusable,” she said. “Google remains responsible for the exponential rise in deepfake sexual abuse by highly ranking deepfake apps, websites and tutorials for many years.”
The company said it has reduced the amount of explicit deepfake content appearing in its search results by 70 per cent since the start of the year through initial policy changes and limited updates to its search engine.
However, the company noted that the policy updates have limitations. Higham said third-party media providers do not always share video data with Google, making it impossible to detect potential duplicates.
For adult content performers whose material is shared only with their consent, and who do not want non-consensual content surfacing on the search engine, there may be “trade-offs,” Google added.
Google's changes to its search ranking system will downgrade websites that link to non-consensual, AI-generated adult content, while promoting non-explicit, “high-quality” websites, including news articles.
“These are unresolved technical challenges for search engines,” Higham said. “But we're at a point where we feel we can get more traction.”
The company has comprehensive policies for removing child sexual abuse material and this year banned advertisements for deepfake pornography.
Its latest policies do not include removing popular deepfake sites from search results entirely, a move that advocacy groups such as #MyImageMyChoice are calling for.
Google said that delisting sites entirely could block access to important information, such as how to remove content from a host website.